WorldWideScience

Sample records for 100-year tool applied

  1. Solving the Supreme Problem: 100 years of selection and recruitment at the Journal of Applied Psychology.

    Science.gov (United States)

    Ployhart, Robert E; Schmitt, Neal; Tippins, Nancy T

    2017-03-01

    This article reviews 100 years of research on recruitment and selection published in the Journal of Applied Psychology. Recruitment and selection research has been present in the Journal from the very first issue, where Hall (1917) suggested that the challenge of recruitment and selection was the Supreme Problem facing the field of applied psychology. As this article shows, the various topics related to recruitment and selection have ebbed and flowed over the years in response to business, legal, and societal changes, but this Supreme Problem has captivated the attention of scientist-practitioners for a century. Our review starts by identifying the practical challenges and macro forces that shaped the sciences of recruitment and selection and helped to define the research questions the field has addressed. We then describe the evolution of recruitment and selection research and the ways the resulting scientific advancements have contributed to staffing practices. We conclude with speculations on how recruitment and selection research may proceed in the future. Supplemental material posted online provides additional depth by including a summary of practice challenges and scientific advancements that affected the direction of selection and recruitment research and an outline of seminal articles published in the Journal and corresponding time line. The 100-year anniversary of the Journal of Applied Psychology is very much the celebration of recruitment and selection research, although predictions about the future suggest there is still much exciting work to be done.

  2. 100 years of applied psychology research on individual careers: From career management to retirement.

    Science.gov (United States)

    Wang, Mo; Wanberg, Connie R

    2017-03-01

    This article surveys 100 years of research on career management and retirement, with a primary focus on work published in the Journal of Applied Psychology. Research on career management took off in the 1920s, with most attention devoted to the development and validation of career interest inventories. Over time, research expanded to attend to broader issues such as the predictors and outcomes of career interests and choice; the nature of career success and who achieves it; career transitions and adaptability to change; retirement decision making and adjustment; and bridge employment. In this article, we provide a timeline for the evolution of the career management and retirement literature, review major theoretical perspectives and findings on career management and retirement, and discuss important future research directions.

  3. 100 years of superconductivity

    CERN Document Server

    Rogalla, Horst

    2011-01-01

    Even a hundred years after its discovery, superconductivity continues to bring us new surprises, from superconducting magnets used in MRI to quantum detectors in electronics. 100 Years of Superconductivity presents a comprehensive collection of topics on nearly all the subdisciplines of superconductivity. Tracing the historical developments in superconductivity, the book includes contributions from many pioneers who are responsible for important steps forward in the field. The text first discusses interesting stories of the discovery and gradual progress of theory and experimentation...

  4. [Osteology--100 years].

    Science.gov (United States)

    Götte, S

    2001-10-01

    As is the case for many other subspecialties of medical science, osteology has developed in tandem with technological progress over the last 100 years. The discovery of X-rays made visualization of the skeletal system possible. Progress in surgery and hygiene permitted examination and treatment of bones in vivo. Optical techniques made it possible to gain insight into the microarchitecture of the bone. Chemistry and biochemistry opened the door for pathophysiology and microcellular assessment of the bone, so that modern osteology deals with interventions in cellular mechanisms, in particular for the treatment of bone diseases. The realization that bone represents a dynamic tissue, characterized by processes of generation and degeneration, was decisive. These insights have a profound influence on the treatment of osteoporosis. Questions pertaining to osteology have been the subject of heightened interdisciplinary debate in the past few years, which is reflected in interdisciplinary associations and co-operative groups, and ultimately in the umbrella Society of Osteology. Contemplation of the subject from an interdisciplinary viewpoint shows what a significant and natural role orthopedics plays in research on bone metabolism, but also in the treatment of bone diseases. Interdisciplinary cooperation aids quality control and is also reflected in the formulation of common guidelines for the clinical picture of osteoporosis, a disease of major epidemiological significance.

  5. 100 Years of General Relativity

    CERN Document Server

    Ellis, George F R

    2015-01-01

    This is Chapter 1 in the book General Relativity and Gravitation: A Centennial Perspective, Edited by Abhay Ashtekar (Editor in Chief), Beverly Berger, James Isenberg, Malcolm MacCallum. Publisher: Cambridge University Press (June, 2015). It gives a survey of themes that have been developed during the 100 years of progress in general relativity theory.

  6. 100 years of Philips Research

    Science.gov (United States)

    van Delft, Dirk

    2014-03-01

    On Thursday 23 October 1913, a Dutch newspaper published the following advertisement: Hiring: A capable young scientist with a doctorate in physics. Must be a good experimenter. Letters containing information on age, life history and references may be submitted to Philips in Eindhoven. Two days later, a candidate applied: Gilles Holst. At that time, Holst was working in Leiden as an assistant to Heike Kamerlingh Onnes, a recent Nobel Prize winner.

  7. 100-Year Floodplains, 100 Year Floodplains, Published in 2007, Not Applicable scale, Dunn County, WI.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from LIDAR information as of 2007. It is described as '100 Year...

  8. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
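    The abstract above centers on least squares estimation as a research tool. As a minimal illustration (not drawn from the book; the data below are synthetic), an ordinary least-squares line fit can be written in a few lines with NumPy:

```python
import numpy as np

# Synthetic data: a noisy linear relationship y = 2x + 1 + noise
# (slope, intercept and noise level are illustrative choices).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Build the design matrix [x, 1] and solve the least-squares problem
# min ||A b - y||^2 with np.linalg.lstsq.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

    With 50 well-spread points and modest noise, the fitted coefficients recover the generating slope and intercept closely, which is the "powerful research tool" behavior the abstract describes when least squares is used appropriately.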

  9. Energizer keep going: 100 years of superconductivity

    Institute of Scientific and Technical Information of China (English)

    Pengcheng Dai; Xing-jiang Zhou; Dao-xin Yao

    2011-01-01

    It has been 100 years since Heike Kamerlingh Onnes discovered superconductivity on April 8, 1911. Amazingly, this field is still very active and keeps booming. Many new phenomena and materials have been found, and superconductors have been used in many different fields to improve our lives. Onnes won the Nobel Prize for this incredible discovery in 1913 and used the word superconductivity for the first time. Onnes believed that quantum mechanics would explain the effect, but he could not produce a theory at that time. Now we know that superconductivity is a macroscopic quantum phenomenon.

  10. Analysis of 100 Years of Curriculum Designs

    Directory of Open Access Journals (Sweden)

    Lynn Kelting-Gibson

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational planning and teaching because it is a way to gather accurate evidence of student learning and information to inform instructional decisions. The purpose of this review was to analyze 100 years of curriculum designs to uncover elements of assessment that will support teachers in their desire to improve student learning. Throughout this research the author seeks to answer the question: Do historical and contemporary curriculum designs include elements of assessment that help teachers improve student learning? The results of the review reveal that assessment elements were addressed in all of the curricular designs analyzed, but not all elements of assessment were identified using similar terminology. Based on the analysis of this review, it is suggested that teachers not be particular about the terminology used to describe assessment elements, as all curriculum models discussed use one or more elements similar to the context of pre, formative, and summative assessments.

  11. [Radical prostatectomy--100 years of evolution].

    Science.gov (United States)

    Gofrit, Ofer N; Shalhav, Arieh L

    2008-07-01

    Prostate cancer is the most common malignant disease in men. The incidence of prostate cancer has been rising since the early 1990s. Not all men afflicted by prostate cancer will develop clinical disease; therefore, identifying the cases that require treatment is a great clinical challenge. Radical prostatectomy has undergone an evolution over the last 100 years. Better understanding of the pelvic anatomy has led to a decrease in blood loss during surgery and in the rates of urinary incontinence and erectile dysfunction following surgery. The introduction of laparoscopy to this surgery in the late 1990s provided the surgeon with a magnified multi-angle field of view and facilitated accurate dissection and suturing. Decreased damage to neighboring tissue made recovery faster. Nevertheless, laparoscopic radical prostatectomy is a technically challenging surgery and did not become popular. The last step in the evolution of radical prostatectomy is the introduction of robotic systems for assistance in laparoscopic radical prostatectomy. A master-slave robotic system is composed of a console and mechanical arms. The surgeon is provided with a magnified three-dimensional view of the operative field and with two mechanical arms that accurately replicate the surgeon's finger movements. The initial results of robotic-assisted laparoscopic prostatectomy seem promising; however, long-term follow-up and comparisons to open surgery are lacking. Robotic systems were rapidly adopted in the American market, and in 2006, 40% of all radical prostatectomies were robotic assisted. Future systems may reveal deep structures beneath the visualized surface by superimposing MRI images on the surgical field.

  12. Healthcare, molecular tools and applied genome research.

    Science.gov (United States)

    Groves, M

    2000-11-01

    Biotechnology 2000 offered a rare opportunity for scientists from academia and industry to present and discuss data in fields as diverse as environmental biotechnology and applied genome research. The healthcare section of the meeting encompassed a number of gene therapy delivery systems that are successfully treating genetic disorders. Beta-thalassemia is being corrected in mice by continuous erythropoietin delivery from engineered muscle cells and from naked DNA electrotransfer into muscles, as described by Dr JM Heard (Institut Pasteur, Paris, France). Dr Reszka (Max-Delbrueck-Centrum fuer Molekulare Medizin, Berlin, Germany), meanwhile, described a treatment for liver metastasis in the form of a drug carrier embolization system (DCES), composed of surface-modified liposomes and a substance for chemo-occlusion, which drastically reduces the blood supply to the tumor and promotes apoptosis, necrosis and antiangiogenesis. In the molecular tools section, Willem Stemmer (Maxygen Inc, Redwood City, CA, USA) gave an insight into the importance that techniques such as molecular breeding (DNA shuffling) have in the evolution of molecules with improved function, over a range of fields including pharmaceuticals, vaccines, agriculture and chemicals. Technologies such as ribosome display, which can incorporate the evolution and the specific enrichment of proteins/peptides in cycles of selection, could play an enormous role in the production of novel therapeutics and diagnostics in future years, as explained by Andreas Plückthun (Institute of Biochemistry, University of Zurich, Switzerland). In applied genome research, technologies such as 'in vitro expression cloning', described by Dr Zwick (Promega Corp, Madison, WI, USA), are providing functional analysis for the overwhelming flow of data emerging from high-throughput sequencing of genomes and from high-density gene expression microarrays (DNA chips).

  13. Beam Line: 100 years of elementary particles

    Science.gov (United States)

    Pais, A.; Weinberg, S.; Quigg, C.; Riordan, M.; Panofsky, W. K. H.

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  14. Opening the 100-Year Window for Time Domain Astronomy

    CERN Document Server

    Grindlay, Jonathan; Los, Edward; Servillat, Mathieu

    2012-01-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full-sky historical TDA database and digitized images for a century (1890--1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the "production scanning" phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.

  15. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Institute of Scientific and Technical Information of China (English)

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

    In recent years, in silico epitope prediction tools have significantly facilitated the progress of vaccine development, and many have been applied successfully to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T cell and B cell epitope prediction tools, is presented. The principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  16. The Royal Aircraft Establishment - 100 Years of Research.

    Science.gov (United States)

    1981-10-02

    [Garbled OCR fragment from the scanned report cover; recoverable details: "The Royal Aircraft Establishment - 100 years of research", A. J. Smith, RAE-TN-FS-432, DRIC-BR-80894, unclassified.]

  17. A 100-year review: Carbohydrates - characterization, digestion, and utilization

    Science.gov (United States)

    Our knowledge of the role of carbohydrates in dairy cattle nutrition has advanced substantially during the 100 years in which the Journal of Dairy Science has been published. In this review, we traced the history of scientific investigation and discovery from crude fiber, nitrogen-free extract, and ...

  18. Spring wheat gliadins: Have they changed in 100 years?

    Science.gov (United States)

    There have been many hard red spring (HRS) wheat cultivars released in North Dakota during the last 100 years. These cultivars have been improved for various characteristics such as adaptation to weather conditions, high yield, and good milling and baking quality. The objectives of this study were...

  19. Bacteriophages, revitalized after 100 years in the shadow of antibiotics

    Institute of Scientific and Technical Information of China (English)

    Hongping Wei

    2015-01-01

    The year 2015 marks 100 years since Dr. Frederick Twort discovered the "filterable lytic factor", which was later independently discovered and named "bacteriophage" by Dr. Felix d'Herelle. On this memorable centennial, it is exciting to see a special issue published by Virologica Sinica on Phages and Therapy. In this issue, readers will not only find that bacteriophage research is a ...

  20. Improving durability of hot forging tools by applying hybrid layers

    Directory of Open Access Journals (Sweden)

    Z. Gronostajski

    2015-10-01

    This paper deals with problems relating to the durability of the dies used for the hot forging of spur gears. The results of industrial tests carried out on dies with a hybrid layer (a nitrided layer (PN) + a physical vapor deposition (PVD) coating) applied to improve their durability are presented. Two types of hybrid layers, differing in their PVD coating, were evaluated with regard to their durability improvement effectiveness. The tests have shown that by applying hybrid layers of the nitrided layer/PVD coating type one can effectively increase the durability of hot forging tools.

  1. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    CERN Document Server

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review of the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one- or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art review of numerical simulation tools applied to building physics, (b) the importance of boundary conditions, (c) the material properties, namely, experimental methods for the measurement...

  2. Coating-substrate-simulations applied to HFQ® forming tools

    Directory of Open Access Journals (Sweden)

    Leopold Jürgen

    2015-01-01

    In this paper a comparative analysis of coating-substrate simulations applied to HFQ® forming tools is presented. When using the solution heat treatment, cold die forming and quenching process known as HFQ® for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, will finally be presented.

  3. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.
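    The abstract above argues for models as learning tools for exploring management options rather than as prediction machines. A minimal sketch of that idea (illustrative only; this is a generic logistic surplus-production model, and the parameter values are hypothetical, not taken from the Green Bay yellow perch study) compares candidate harvest policies:

```python
# Minimal logistic surplus-production model used as a management
# learning tool. All parameter values are hypothetical.

def project_stock(b0, r, k, harvest_rate, years):
    """Project stock biomass under a constant exploitation rate."""
    biomass = [b0]
    for _ in range(years):
        b = biomass[-1]
        growth = r * b * (1 - b / k)   # logistic surplus production
        catch = harvest_rate * b       # constant-rate harvest policy
        biomass.append(max(b + growth - catch, 0.0))
    return biomass

# Explore two candidate policies over a 20-year horizon: a light
# harvest lets the stock rebuild; a heavy one drives it down.
for u in (0.1, 0.4):
    traj = project_stock(b0=500, r=0.5, k=1000, harvest_rate=u, years=20)
    print(f"harvest rate {u}: final biomass {traj[-1]:.0f}")
```

    The value of such a toy model is exactly what the abstract describes: it organizes data and concepts and shows the qualitative consequences of decisions, without claiming quantitative predictive accuracy.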

  4. Extending dry storage of spent LWR fuel for 100 years.

    Energy Technology Data Exchange (ETDEWEB)

    Einziger, R. E.

    1998-12-16

    Because of delays in closing the back end of the fuel cycle in the U.S., there is a need to extend dry inert storage of spent fuel beyond its originally anticipated 20-year duration. Many of the methodologies developed to support initial licensing for 20-year storage should be able to support the longer storage periods envisioned. This paper evaluates the applicability of existing information and methodologies to support dry storage up to 100 years. The thrust of the analysis is the potential behavior of the spent fuel. In the USA, the criteria for dry storage of LWR spent fuel are delineated in 10 CFR 72 [1]. The criteria fall into four general categories: maintain subcriticality, prevent the release of radioactive material above acceptable limits, ensure that radiation rates and doses do not exceed acceptable levels, and maintain retrievability of the stored radioactive material. These criteria need to be considered for normal, off-normal, and postulated accident conditions. The initial safety analysis report submitted for licensing evaluated the fuel's ability to meet the requirements for 20 years. It is not the intent to repeat these calculations, but to look at expected behavior over the additional 80 years, during which the temperatures and radiation fields are lower. During the first 20 years, the properties of the components may change because of elevated temperatures, presence of moisture, effects of radiation, etc. During normal storage in an inert atmosphere, there is potential for the cladding mechanical properties to change due to annealing or interaction with cask materials. The emissivity of the cladding could also change due to storage conditions. If there is air leakage into the cask, additional degradation could occur through oxidation in breached rods, which could lead to additional fission gas release and enlargement of cladding breaches. Air in-leakage could also affect cover gas conductivity, cladding oxidation, emissivity changes, and ...

  5. 100 years after Smoluchowski: stochastic processes in cell biology

    Science.gov (United States)

    Holcman, D.; Schuss, Z.

    2017-03-01

    100 years after Smoluchowski introduced his approach to stochastic processes, they are now at the basis of mathematical and physical modeling in cellular biology: they are used, for example, to analyse and extract features from large numbers (tens of thousands) of single molecular trajectories or to study the diffusive motion of molecules, proteins or receptors. Stochastic modeling is a new step in large data analysis that serves to extract cell biology concepts. We review here Smoluchowski's approach to stochastic processes and provide several applications for coarse-graining diffusion, studying polymer models for understanding nuclear organization and, finally, we discuss the stochastic jump dynamics of telomeres across cell division and stochastic gene regulation.
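    The single-molecule trajectory analysis mentioned above can be illustrated with a minimal sketch: simulating free diffusion (the Smoluchowski, or overdamped Langevin, limit with no potential) by an Euler-Maruyama scheme and recovering the diffusion coefficient from the mean squared displacement. The diffusion coefficient, time step and trajectory counts below are illustrative choices, not values from the review:

```python
import numpy as np

# Euler-Maruyama simulation of free 2-D diffusion: dX = sqrt(2 D dt) dW.
# D, dt, n_steps and n_traj are illustrative parameters.
rng = np.random.default_rng(1)
D, dt, n_steps, n_traj = 0.1, 0.01, 1000, 500

# Each step is an independent Gaussian increment in x and y.
steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n_traj, n_steps, 2))
trajectories = np.cumsum(steps, axis=1)  # positions, starting from the origin

# Estimate D from the mean squared displacement at the final time:
# in 2-D, MSD(t) = 4 D t.
msd = np.mean(np.sum(trajectories[:, -1, :] ** 2, axis=1))
D_hat = msd / (4 * n_steps * dt)
print(f"estimated D = {D_hat:.3f}")
```

    Averaging over hundreds of trajectories, the MSD estimator recovers the input diffusion coefficient, which is the basic operation behind extracting diffusion coefficients from experimental single-molecule trajectories.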

  6. Total Hip Arthroplasty – over 100 years of operative history

    Directory of Open Access Journals (Sweden)

    Stephen Richard Knight

    2011-11-01

    Total hip arthroplasty (THA) has completely revolutionised the manner in which the arthritic hip is treated, and is considered to be one of the most successful orthopaedic interventions of its generation (1). With over 100 years of operative history, this review examines the progression of the operation from its origins, together with highlighting the materials and techniques that have contributed to its development. Knowledge of its history contributes to a greater understanding of THA, such as the reasons behind the selection of prosthetic materials in certain patient groups, while demonstrating the importance of critically analyzing research to continually determine best operative practice. Finally, we describe current areas of research being undertaken to further advance techniques and improve outcomes.

  7. Relativity and Gravitation : 100 Years After Einstein in Prague

    CERN Document Server

    Ledvinka, Tomáš; General Relativity, Cosmology and Astrophysics : Perspectives 100 Years After Einstein's Stay in Prague

    2014-01-01

    In early April 1911 Albert Einstein arrived in Prague to become full professor of theoretical physics at the German part of Charles University. It was there, for the first time, that he concentrated primarily on the problem of gravitation. Before he left Prague in July 1912 he had submitted the paper “Relativität und Gravitation: Erwiderung auf eine Bemerkung von M. Abraham” in which he remarkably anticipated what a future theory of gravity should look like. At the occasion of the Einstein-in-Prague centenary an international meeting was organized under a title inspired by Einstein's last paper from the Prague period: "Relativity and Gravitation, 100 Years after Einstein in Prague". The main topics of the conference included: classical relativity, numerical relativity, relativistic astrophysics and cosmology, quantum gravity, experimental aspects of gravitation, and conceptual and historical issues. The conference attracted over 200 scientists from 31 countries, among them a number of leading experts in ...

  8. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  9. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and the use of antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. The AHA also recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls.

  10. Overuse syndrome in musicians--100 years ago. An historical review.

    Science.gov (United States)

    Fry, H J

    Overuse syndrome in musicians was extensively reported 100 years ago. The clinical features and results of treatment, which were recorded in considerable detail, match well the condition that is described today. The medical literature reviewed here extends from 1830 to 1911 and includes 21 books and 54 articles, all but two from the English-language literature; the writers of the day, however, themselves reviewed the French, German and Italian literature on the subject. The disorder was said to result from the overuse of the affected parts. Two theories of aetiology, not necessarily mutually exclusive, were argued: the central theory regarded the lesion as being in the central nervous system, while the peripheral theory implied a primary muscle disorder. No serious case was put forward for a psychogenic origin, though emotional factors were believed to aggravate the condition. Advances in musical instrument manufacture, particularly the development of the concert piano and the clarinet, may have played a part in the prevalence of overuse syndrome in musicians. Total rest from the mechanical use of the hand was the only effective treatment recorded.

  11. [Sheehan's syndrome--a forgotten disease with 100 years' history].

    Science.gov (United States)

    Krysiak, Robert; Okopień, Bogusław

    2015-01-01

    Although named after Harold Sheehan, postpartum ischemic pituitary necrosis was reported for the first time 100 years ago in Przeglad Lekarski by Leon Konrad Gliński. In the majority of cases, the syndrome is a consequence of a severe postpartum bleeding episode resulting in severe hypotension or hemorrhagic shock. The frequency of Sheehan's syndrome has decreased in developed countries as a result of improved obstetrical care, but this clinical entity remains a common cause of hypopituitarism in developing countries. The syndrome is characterized by varying degrees of anterior pituitary dysfunction resulting from the deficiency of multiple pituitary hormones. The order of frequency of hormone loss has generally been found to be growth hormone and prolactin, gonadotropins, ACTH and thyrotropin. Women with Sheehan's syndrome exhibit a variety of signs and symptoms, including failure to lactate or resume menses, loss of genital and axillary hair, and, often long after delivery, clinical manifestations of central hypothyroidism and secondary adrenal insufficiency. Diagnosis is based on laboratory studies, including hormone levels and hormone stimulation tests. Treatment of Sheehan's syndrome involves hormone replacement therapy. The aim of this study is to review current knowledge on clinically relevant aspects of this clinical entity and to provide the reader with recommendations concerning its diagnosis and treatment.

  12. Framing 100-year overflowing and overtopping marine submersion hazard resulting from the propagation of 100-year joint hydrodynamic conditions

    Science.gov (United States)

    Nicolae Lerma, A.; Bulteau, T.; Elineau, S.; Paris, F.; Pedreros, R.

    2016-12-01

    Marine submersion is an increasing concern for coastal cities, as urban development reinforces their vulnerability while climate change is likely to increase the frequency and magnitude of submersion events. Characterising the coastal flooding hazard is therefore of paramount importance for the safety of people living in such places and for coastal planning. A hazard is commonly defined as an adverse phenomenon, often represented by the magnitude of a variable of interest (e.g. flooded area), hereafter called the response variable, associated with a probability of exceedance or, alternatively, a return period. Characterising the coastal flooding hazard consists in finding the correspondence between the magnitude and the return period. The difficulty lies in the fact that the assessment is usually performed using physical numerical models that take as inputs scenarios composed of multiple forcing conditions, which are most of the time interdependent. Indeed, a time series of the response variable is usually not available, so we have to work instead with time series of forcing variables (e.g. water level, waves). Thus, the problem is twofold: on the one hand, the definition of scenarios is a multivariate matter; on the other hand, it is difficult and only approximate to associate the resulting response, being the output of the physical numerical model, with the return period defined for the scenarios. In this study, we illustrate the problem on the district of Leucate, located on the French Mediterranean coast. A multivariate extreme value analysis of waves and water levels is performed offshore using a conditional extreme model; then two different methods are used to define and select 100-year scenarios of forcing variables: one based on joint exceedance probability contours, a method classically used in coastal risk studies, the other based on environmental contours, which are commonly used in the field of structural design engineering. We show that these two methods enable one to…
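
The return-period framing used above can be illustrated with a minimal univariate sketch (the multivariate conditional extreme model in the study is far more involved): fit a Gumbel distribution to annual-maximum water levels by the method of moments and read off the 100-year return level. All numbers below are invented for illustration; they are not the Leucate data.

```python
import math

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - 0.5772156649 * scale  # Euler-Mascheroni constant
    return loc, scale

def return_level(loc, scale, T):
    """Level exceeded on average once every T years: F(x) = 1 - 1/T."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Illustrative annual-maximum surge levels in metres (synthetic)
maxima = [1.02, 0.85, 1.30, 0.97, 1.15, 0.78, 1.44, 1.08, 0.91, 1.21,
          1.05, 0.88, 1.33, 1.12, 0.95, 1.27, 1.01, 1.18, 0.83, 1.38]
loc, scale = gumbel_fit(maxima)
print(round(return_level(loc, scale, 100), 2))
```

The 100-year level lies above every observed maximum, which is exactly why scenario selection cannot be read directly off the record and motivates the contour-based methods compared in the paper.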

  13. Creating Long Term Income Streams for the 100 Year Starship Study Initiative

    Science.gov (United States)

    Sylvester, A. J.

    Development and execution of long-term research projects are very dependent on a consistent application of funding to maximize the potential for success. The business structure for the 100 Year Starship Study project should allow for multiple income streams to cover the expenses of the research objectives. The following examples illustrate the range of potential avenues: 1) affiliation with a charitable foundation for creating a donation program to fund a long-term endowment for research, 2) application for grants to fund initial research projects and establish the core expertise of the research entity, 3) development of intellectual property which can then be licensed for additional revenue, 4) creation of spinout companies with equity positions retained by the lab for funding the endowment, and 5) funded research which is dual use for the technology goals of the interstellar flight research objectives. With the establishment of a diversified stream of funding options, the endowment can be funded at a level that permits dedicated research on the interstellar flight topics. This paper will focus on the strategy of creating spinout companies to create income streams which would fund the endowment of the 100 Year Starship Study effort. This technique is widely used by universities seeking to commercially develop and market technologies developed by university researchers. An approach will be outlined for applying this technique to potentially marketable technologies generated as part of the 100 Year Starship Study effort.

  14. Progress of Cometary Science in the Past 100 Years

    Science.gov (United States)

    Sekanina, Zdenek

    1999-01-01

    Enormous strides made by cometary science during the 20th century defy any meaningful comparison of its state 100 years ago and now. The great majority of the subfields enjoying much attention nowadays did not exist in the year 1900. Dramatic developments, especially in the past 30-50 years, have equally affected observational and theoretical studies of comets. The profound diversification of observing techniques has been documented by the ever widening limits of the electromagnetic spectrum covered. While the time around 1900 marked an early period of slow and painful experimentation with photographic methods in cometary studies, observations of comets from the x-ray region to the radio waves have by now become routine. Many of the new techniques, and all those involving wavelengths shorter than about 300 nm, were made possible by another major breakthrough of this century - observing from space. Experiments on dedicated Earth-orbiting satellites as well as several deep-space probes have provided fascinating new information on the nature and makeup of comets. In broader terms, much of the progress has been achieved thanks to fundamental discoveries and major advances in electronics, whose applications resulted in qualitatively new instruments (e.g. radiotelescopes) and sensors or detectors (e.g. CCD arrays). The most universal effect on the entire field of cometary science, from observing to data handling to quantitative interpretations, has been, as in any other branch of science, due to the introduction of electronic computers, with their processing capabilities not only unheard of, but literally unimaginable, in the age of classical desk calculators. As if all this were not enough, today's generations of comet scientists have, in addition, been blessed with nature's highly appreciated cooperation. Indeed, in the span of a dozen years, between 1985 and 1997, we were privileged to witness four remarkable cometary events: (i) a return of Halley…

  15. 100 years of seismic research on the Moho

    DEFF Research Database (Denmark)

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina

    2013-01-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho is primarily based on the comprehensive overview of the worldwide history of seismological studies of the Earth's crust using controlled sources from 1850 to 2005, by Prodehl and Mooney (2012). Though the art of applying explosions, so-called “artificial events”, as energy sources for studies…

  16. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
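
As a rough illustration of the hybrid coupling pattern described in this record (an agent/population layer querying a metabolic submodel at each step), the toy sketch below stands in for both layers. It does not use MatNet, MATLAB, or NetLogo, and the growth rules and constants are invented; a real constraint-based model would solve a flux-balance optimization instead of the stub function here.

```python
# Toy hybrid: a spatial layer (cells along a 1-D oxygen gradient) coupled to a
# stand-in "metabolic model" returning growth rate from local nutrients.
# Mimics the ABM <-> constraint-based coupling pattern only; all numbers invented.

def metabolic_growth(oxygen, nitrate):
    """Stub for a constraint-based model: aerobic growth when oxygen is
    available, otherwise slower anaerobic respiration on nitrate."""
    if oxygen > 0.1:
        return 1.0 * oxygen
    return 0.4 * nitrate  # anaerobic pathway

def simulate(depths, nitrate, steps=10):
    biomass = [1.0] * len(depths)
    for _ in range(steps):
        for i, depth in enumerate(depths):
            oxygen = max(0.0, 1.0 - 0.5 * depth)  # oxygen falls with depth
            biomass[i] *= 1.0 + 0.1 * metabolic_growth(oxygen, nitrate)
    return biomass

shallow, deep = simulate([0.0, 2.0], nitrate=0.0)
print(shallow > deep)            # oxygen-limited deep layer grows slower
_, deep_no3 = simulate([0.0, 2.0], nitrate=1.0)
print(deep_no3 > deep)           # added nitrate rescues anaerobic growth
```

This reproduces, qualitatively, the two behaviors the abstract highlights: oxygen-limited activity deep in the biofilm and increased growth when nitrate is added to the medium.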

  17. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  18. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections within it.

  19. Process for selecting engineering tools : applied to selecting a SysML tool.

    Energy Technology Data Exchange (ETDEWEB)

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.

  1. INNOVATIVE SOLUTIONS APPLIED IN TOOLS FOR DETERMINING COAL MECHANICAL PROPERTIES

    Directory of Open Access Journals (Sweden)

    Witold BIAŁY

    2015-10-01

    Full Text Available Due to the very specific working conditions of machines and equipment used in the coal mining industry, the manner of their selection, taking into account the changing working conditions, is very important. Appropriate selection increases the durability and reliability of machines and equipment, which translates into the economic effects achieved. As the issue of measurement and evaluation of coal mechanical properties (including coal workability measurement) is of great importance, previously applied methods for coal workability evaluation are shortly reviewed. The importance of the problem is confirmed by the number of methods developed in various research centres all over the world. The article presents new instruments for determining and evaluating the mechanical properties of coal material (workability). The instruments have been developed in Poland and the author of this article is their co-inventor. The construction, principle of operation and innovative character of the solutions applied in the instruments are presented.

  2. 100-Year Floodplains, 100 year flood plain data, Published in 2006, 1:1200 (1in=100ft) scale, Washoe County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Field Survey/GPS information as of 2006. It is described...

  3. Applying macro design tools to the design of MEMS accelerometers

    Energy Technology Data Exchange (ETDEWEB)

    Davies, B.R.; Rodgers, M.S.; Montague, S.

    1998-02-01

    This paper describes the design of two different surface micromachined (MEMS) accelerometers and the use of design and analysis tools intended for macro sized devices. This work leverages a process for integrating both the micromechanical structures and microelectronics circuitry of a MEMS accelerometer on the same chip. In this process, the mechanical components of the sensor are first fabricated at the bottom of a trench etched into the wafer substrate. The trench is then filled with oxide and sealed to protect the mechanical components during subsequent microelectronics processing. The wafer surface is then planarized in preparation for CMOS processing. Next, the CMOS electronics are fabricated and the mechanical structures are released. The mechanical structure of each sensor consists of two polysilicon plate masses suspended by multiple springs (cantilevered beam structures) over corresponding polysilicon plates fixed to the substrate to form two parallel plate capacitors. One polysilicon plate mass is suspended using compliant springs forming a variable capacitor. The other polysilicon plate mass is suspended using very stiff springs acting as a fixed capacitor. Acceleration is measured by comparing the variable capacitance with the fixed capacitance during acceleration.
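
The sensing principle described above (a compliant, variable parallel-plate capacitor compared against a stiff, fixed reference, with acceleration recovered from the deflection of the proof mass) can be sketched numerically. All device parameters below are illustrative round numbers, not those of the actual Sandia process:

```python
# Parallel-plate readout of a surface-micromachined accelerometer (toy numbers).
EPS0 = 8.854e-12   # F/m, permittivity of free space
AREA = 1e-6        # m^2, plate area
GAP = 2e-6         # m, nominal plate gap
K = 10.0           # N/m, effective stiffness of the compliant springs
M = 1e-9           # kg, proof-mass

def capacitance(gap):
    """Parallel-plate capacitance C = eps0 * A / g."""
    return EPS0 * AREA / gap

def acceleration_from_caps(c_var, c_fix):
    """Invert the plate relation for each capacitor, take the deflection of the
    compliant mass relative to the stiff reference, then apply a = k*x/m."""
    gap_var = EPS0 * AREA / c_var
    gap_fix = EPS0 * AREA / c_fix  # stiff springs barely move: defines nominal gap
    x = gap_fix - gap_var          # deflection of the compliant proof mass
    return K * x / M

# Forward model: 1 g of acceleration deflects the compliant mass by m*a/k.
a_true = 9.81
x = M * a_true / K
c_var = capacitance(GAP - x)
c_fix = capacitance(GAP)
print(acceleration_from_caps(c_var, c_fix))
```

The round trip recovers the applied acceleration, and the deflection (under a nanometre here) shows why the fixed-capacitor reference is needed: the signal is a tiny fractional change in capacitance.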

  4. Geo-environmental mapping tool applied to pipeline design

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble the environmental, geotechnical and geological concepts in a methodological tool capable of predicting constraints and reducing the pipeline's impact on the environment. The Geo-Environmental Mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) is presented to emphasize the pipeline operational constraints and their influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present natural construction properties, such as excavability, stability of the excavation and soil re-use capability. The environmental constraints were also added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental inquiries to be carried out during executive design, the discussion on the types of equipment to be employed during construction, and the analysis of the geological risks and environmental impacts to be faced during the construction of the pipeline. (author)

  5. Big Data Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2017-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and to...

  6. Quality control tools applied to a PV microgrid in Ecuador

    Energy Technology Data Exchange (ETDEWEB)

    Camino-Villacorta, M.; Egido-Aguilera, M.A. [Ciudad Univ., Madrid (Spain). Inst. de Energia Solar - UPM; Gamez, J.; Arranz-Piera, P. [Trama Tecnoambiental (TTA), Barcelona (Spain)

    2010-07-01

    The Instituto de Energia Solar has been dealing with quality control issues for rural electrification for many years. In the framework of project DOSBE (Development of Electricity Service Operators for Poverty Alleviation in Ecuador and Peru), a technical toolkit has been developed to implement adapted integral quality control procedures for photovoltaic systems (covering all components and equipment, installation and servicing), applicable at a local and regional scale, with the overall aim of increasing the confidence in photovoltaic systems. This toolkit was applied in the evaluation of an existing microgrid in Ecuador, which is described in this paper. The toolkit and the detailed results of its application are presented in a published document which is being widely distributed among the stakeholders of rural electrification in Ecuador and Peru. It can be downloaded from the web page of the DOSBE project: www.dosbe.org (orig.)

  7. Design Science Methodology Applied to a Chemical Surveillance Tool

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.; Henry, Michael J.

    2017-05-11

    Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  8. Monitoring operational data production applying Big Data tooling

    Science.gov (United States)

    Som de Cerff, Wim; de Jong, Hotze; van den Berg, Roy; Bos, Jeroen; Oosterhoff, Rijk; Klein Ikkink, Henk Jan; Haga, Femke; Elsten, Tom; Verhoef, Hans; Koutek, Michal; van de Vegte, John

    2015-04-01

    Within the KNMI Deltaplan programme for improving the KNMI operational infrastructure, a new fully automated system for monitoring the KNMI operational data production systems is being developed: PRISMA (PRocessflow Infrastructure Surveillance and Monitoring Application). Currently the KNMI operational (24/7) production systems consist of over 60 applications, running on different hardware systems and platforms. They are interlinked for the production of numerous data products, which are delivered to internal and external customers. All applications are individually monitored by different applications, complicating root cause and impact analysis. Also, the underlying hardware and network is monitored separately using Zabbix. The goal of the new system is to enable production chain monitoring, which supports root cause analysis (what is the root cause of the disruption) and impact analysis (what other products will be affected). The PRISMA system will make it possible to dispose of all the existing monitoring applications, providing one interface for monitoring the data production. For modeling the production chain, the Neo4j graph database is used to store and query the model. The model can be edited through the PRISMA web interface, but is mainly provided automatically by the applications and systems that are to be monitored. The graph enables us to do root cause and impact analysis. The graph can be visualized in the PRISMA web interface on different levels. Each 'monitored object' in the model has a status (OK, error, warning, unknown). This status is derived by combining all available log information. For collecting and querying the log information, Splunk is used. The system is developed using Scrum, by a multi-disciplinary team consisting of analysts, developers, a tester and an interaction designer. In the presentation we will focus on the lessons learned working with the 'Big Data' tooling Splunk and Neo4j.
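
The root-cause and impact analyses described here amount to traversals of a dependency graph. The sketch below uses a plain in-memory dict rather than Neo4j, and the application names are invented, not KNMI's real systems:

```python
# Toy production chain: each product maps to the upstream sources it needs.
DEPENDS_ON = {
    "radar_composite": ["radar_ingest"],
    "nowcast": ["radar_composite", "nwp_model"],
    "public_warning": ["nowcast"],
}

def impact(failed, graph):
    """All downstream products affected when `failed` breaks (impact analysis)."""
    affected, stack = set(), [failed]
    while stack:
        node = stack.pop()
        for product, sources in graph.items():
            if node in sources and product not in affected:
                affected.add(product)
                stack.append(product)
    return affected

def root_causes(product, graph, broken):
    """Broken upstream sources that can explain a disrupted product (root cause)."""
    causes, stack = set(), list(graph.get(product, []))
    while stack:
        node = stack.pop()
        if node in broken:
            causes.add(node)
        stack.extend(graph.get(node, []))
    return causes

print(sorted(impact("radar_ingest", DEPENDS_ON)))
print(sorted(root_causes("public_warning", DEPENDS_ON, broken={"radar_ingest"})))
```

In a graph database the same two questions become path queries over the stored model; the traversal logic is what lets one monitoring interface replace many per-application monitors.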

  9. 100 Years Jubilee for the discovery of the enzymes in yeast

    DEFF Research Database (Denmark)

    Berg, Rolf W.

    1997-01-01

    The work by Prof. E. Buchner 100 years ago which led to the discovery of the enzymes in yeast for brewing beer is reviewed.

  11. Under Connecticut Skies: Exploring 100 Years of Astronomy at Van Vleck Observatory in Middletown, Connecticut

    Science.gov (United States)

    Kilgard, Roy E.; Williams, Amrys; Erickson, Paul; Herbst, William; Redfield, Seth

    2017-01-01

    Under Connecticut Skies examines the history of astronomy at Van Vleck Observatory, located on the campus of Wesleyan University in Middletown, Connecticut. Since its dedication in June of 1916, Van Vleck has been an important site of astronomical research, teaching, and public outreach. Over a thousand visitors pass through the observatory each year, and regular public observing nights happen year-round in cooperation with the Astronomical Society of Greater Hartford. Our project explores the place-based nature of astronomical research; the scientific instruments, labor, and individuals that have connected places around the world in networks of observation; and the broader history of how observational astronomy has linked local people, amateur observers, professional astronomers, and the tools and objects that have facilitated their work under Connecticut’s skies over the past 100 years. Our research team has produced a historical exhibition to help commemorate the observatory’s centennial that opened to the public in May of 2016. Our work included collecting, documenting, and interpreting this history through objects, archival documents, oral histories, photographs, and more. The result is both a museum and a working history "laboratory" for use by student and professional researchers. In addition to the exhibit itself, we have engaged in new interpretive programs to help bring the history of astronomy to life. Future work will include digitization of documents and teaching slides, further collection of oral histories, and expanding the collection to the web for use by off-site researchers.

  12. Individual differences and their measurement: A review of 100 years of research.

    Science.gov (United States)

    Sackett, Paul R; Lievens, Filip; Van Iddekinge, Chad H; Kuncel, Nathan R

    2017-03-01

    This article reviews 100 years of research on individual differences and their measurement, with a focus on research published in the Journal of Applied Psychology. We focus on 3 major individual differences domains: (a) knowledge, skill, and ability, including both the cognitive and physical domains; (b) personality, including integrity, emotional intelligence, stable motivational attributes (e.g., achievement motivation, core self-evaluations), and creativity; and (c) vocational interests. For each domain, we describe the evolution of the domain across the years and highlight major theoretical, empirical, and methodological developments, including relationships between individual differences and variables such as job performance, job satisfaction, and career development. We conclude by discussing future directions for individual differences research. Trends in the literature include a growing focus on substantive issues rather than on the measurement of individual differences, a differentiation between constructs and measurement methods, and the use of innovative ways of assessing individual differences, such as simulations, other-reports, and implicit measures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. 100-Year Floodplains, Digitized FEMA flood maps, Published in unknown, Eureka County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Hardcopy Maps information as of unknown. It is described as 'Digitized FEMA flood maps'. Data by...

  14. 100-Year Floodplains, Floodplain, Published in 2000, Smaller than 1:100000 scale, Taylor County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Smaller than 1:100000 scale, was produced all or in part from Hardcopy Maps information as of 2000. It is described...

  15. 100-Year Floodplains, FEMA FIRM Mapping, Published in 2014, Not Applicable scale, GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from Other information as of 2014. It is described as 'FEMA FIRM...

  16. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can … for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory…

  17. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The application of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  18. An assessment tool applied to manure management systems using innovative technologies

    DEFF Research Database (Denmark)

    Sørensen, Claus G.; Jacobsen, Brian H.; Sommer, Sven G.

    2003-01-01

    …of operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related to the handling of manure. By applying the tool to a pig farm and a dairy farm scenario, the competitiveness of new technologies was compared with traditional manure handling. The concept of a continuous flow of transport and application of slurry using umbilical transportation systems rather than traditional…

  19. LOW FREQUENCY VARIABILITY OF INTERANNUAL CHANGE PATTERNS FOR GLOBAL MEAN TEMPERATURE DURING THE RECENT 100 YEARS

    Institute of Scientific and Technical Information of China (English)

    刘晶淼; 丁裕国; et al.

    2002-01-01

    The temporally extended EOF (TEEOF) method is used to conduct a diagnostic study of the 1-, 3-, 6- and 10-year variation patterns of mean air temperature over the globe and the Southern and Northern Hemispheres over the course of 100 years. The results show that the first mode of the TEEOF accounts for more than 50% of the total variance, with the first mode of the interannual oscillations generally standing for annually varying patterns that are related to climate and reflect the long-term tendency of change in air temperature. This is particularly true for the first mode on the 10-year scale, which shows an obvious ascending trend in winter temperature, and its primary time component follows the sequence of actual temperature very closely. Apart from the first mode of all time sections of the TEEOF for the globe and the two hemispheres, and the second mode of the 1-year TEEOF, the interannual variations described by the other characteristic vectors show various patterns, with the corresponding primary components related to the long-term variability of specific interannual quasi-periodic oscillation structures. A T2 test applied to the annual variation pattern shows that the abrupt changes for the Southern Hemisphere and the globe come closer to the result of a single-element t test for mean temperature than those for the Northern Hemisphere do. This indicates that the T2 test, carried out with patterns of multiple variables, seems more reasonable than the t test with single elements.
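
The abstract leaves the TEEOF machinery implicit. As a hedged illustration (our own construction, not the authors' code), a temporally extended EOF can be sketched by embedding a window of consecutive years into each state vector and taking an SVD; the synthetic series, function name and window length below are all assumptions:

```python
import numpy as np

def teeof(series, window):
    """Explained-variance fractions and modes of a temporally extended EOF."""
    n = len(series) - window + 1
    # Each row embeds `window` consecutive values (the temporal extension).
    X = np.array([series[i:i + window] for i in range(n)], dtype=float)
    X -= X.mean(axis=0)                       # center each lag coordinate
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)            # variance explained per mode
    return var_frac, Vt                       # rows of Vt are the modes

# Synthetic "100-year" record: a warming trend plus interannual noise.
rng = np.random.default_rng(0)
years = np.arange(100)
temp = 0.01 * years + 0.1 * rng.standard_normal(100)

var_frac, modes = teeof(temp, window=10)
print(f"leading 10-year mode explains {var_frac[0]:.0%} of the variance")
```

With a trend-dominated series like this one, the leading mode captures most of the variance, mirroring the paper's finding that the first mode exceeds 50%.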

  20. Induced metamorphosis in crustacean y-larvae: towards a solution to a 100-year-old riddle

    DEFF Research Database (Denmark)

    Glenner, Henrik; Høeg, Jens T; Grygier, Mark J

    2008-01-01

    BACKGROUND: The y-larva, a crustacean larval type first identified more than 100 years ago, has been found in marine plankton samples collected in the arctic, temperate and tropical regions of all oceans. The great species diversity found among y-larvae (we have identified more than 40 species…) …-larvae into a novel, highly reduced juvenile stage by applying the crustacean molting hormone 20-HE. The new stage is slug-like, unsegmented and lacks both limbs and almost all other traits that normally characterize arthropods, but it is capable of vigorous peristaltic motions. CONCLUSION: From our observations on live…

  1. Base (100-year) flood elevations for selected sites in Marion County, Missouri

    Science.gov (United States)

    Southard, Rodney E.; Wilson, Gary L.

    1998-01-01

    The primary requirement for community participation in the National Flood Insurance Program is the adoption and enforcement of floodplain management requirements that minimize the potential for flood damages to new construction and avoid aggravating existing flooding conditions. This report provides base flood elevations (BFE) for a 100-year recurrence flood for use in the management and regulation of 14 flood-hazard areas designated by the Federal Emergency Management Agency as approximate Zone A areas in Marion County, Missouri. The one-dimensional surface-water flow model, HEC-RAS, was used to compute the base (100-year) flood elevations for the 14 Zone A sites. The 14 sites were located at U.S., State, or County road crossings and the base flood elevation was determined at the upstream side of each crossing. The base (100-year) flood elevations for BFE 1, 2, and 3 on the South Fork North River near Monroe City, Missouri, are 627.7, 579.2, and 545.9 feet above sea level. The base (100-year) flood elevations for BFE 4, 5, 6, and 7 on the main stem of the North River near or at Philadelphia and Palmyra, Missouri, are 560.5, 539.7, 504.2, and 494.4 feet above sea level. BFE 8 is located on Big Branch near Philadelphia, a tributary to the North River, and the base (100-year) flood elevation at this site is 530.5 feet above sea level. One site (BFE 9) is located on the South River near Monroe City, Missouri. The base (100-year) flood elevation at this site is 619.1 feet above sea level. Site BFE 10 is located on Bear Creek near Hannibal, Missouri, and the base (100-year) elevation is 565.5 feet above sea level. The four remaining sites (BFE 11, 12, 13, and 14) are located on the South Fabius River near Philadelphia and Palmyra, Missouri. The base (100-year) flood elevations for BFE 11, 12, 13, and 14 are 591.2, 578.4, 538.7, and 506.9 feet above sea level.
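
For readers unfamiliar with the term, a "100-year" (base) flood is the flood with a 1% chance of being exceeded in any given year. The back-of-envelope sketch below (not part of the report) shows why such floods are routinely encountered within ordinary planning horizons:

```python
def prob_at_least_one(return_period_years, horizon_years):
    """P(at least one T-year flood occurs within an n-year horizon)."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# Over a 30-year horizon, a "100-year" flood is far from unlikely:
p30 = prob_at_least_one(100, 30)
print(f"P(at least one 100-year flood in 30 years) = {p30:.1%}")  # → 26.0%
```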

  2. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation-based Controllers Design) that enables the easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant, with only a few modifications to improve the control performance. With the DSC tool, a control system's performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
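
As a loose illustration of the "tune by simulation before touching the plant" workflow (a toy model of our own devising, not the DSC tool or its plant model), consider a PI controller regulating dissolved oxygen against a first-order aeration model:

```python
def simulate_do_control(kp, ki, setpoint=2.0, dt=0.01, steps=2000):
    """Simulate dissolved oxygen (mg/L) under PI-controlled aeration."""
    do, integ, traj = 0.0, 0.0, []
    for _ in range(steps):
        err = setpoint - do
        integ += err * dt
        u = max(0.0, kp * err + ki * integ)      # air-flow demand, >= 0
        # Toy plant: oxygen transfer proportional to u and to the
        # saturation deficit (8 mg/L ~ saturation), minus constant uptake.
        do = max(0.0, do + dt * (0.5 * u * (8.0 - do) - 1.0))
        traj.append(do)
    return traj

traj = simulate_do_control(kp=2.0, ki=0.5)
print(f"steady-state DO = {traj[-1]:.2f} mg/L (setpoint 2.0)")
```

Gains can be swept in simulation until the trajectory settles acceptably, and only then carried to the real controller, which is the workflow the abstract describes.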

  3. Liverpool's Discovery: A University Library Applies a New Search Tool to Improve the User Experience

    Science.gov (United States)

    Kenney, Brian

    2011-01-01

    This article features the University of Liverpool's arts and humanities library, which applies a new search tool to improve the user experience. In nearly every way imaginable, the Sydney Jones Library and the Harold Cohen Library--the university's two libraries that serve science, engineering, and medical students--support the lives of their…

  5. Assessment Gaze, Refraction, and Blur: The Course of Achievement Testing in the Past 100 Years

    Science.gov (United States)

    Baker, Eva L.; Chung, Gregory K. W. K.; Cai, Li

    2016-01-01

    This chapter addresses assessment (testing) with an emphasis on the 100-year period since the American Education Research Association was formed. The authors start with definitions and explanations of contemporary tests. They then look backward into the 19th century to significant work by Horace Mann and Herbert Spencer, who engendered two…

  6. Blister rust in North America: What we have not learned in the past 100 years

    Science.gov (United States)

    Eugene P. Van Arsdel; Brian W. Geils

    2011-01-01

    Introduction of Cronartium ribicola (white pine blister rust) greatly motivated development of tree disease control and research in America. Although foresters and pathologists have learned much in the past 100 years, more remains to learn. The most important lesson is that fear of blister rust has reduced pine regeneration more than the disease itself. Based on six...

  7. 100 years of selection of sugar beet at the Ivanivska research-selection station.

    Directory of Open Access Journals (Sweden)

    А. С. Лейбович

    2009-10-01

    Full Text Available This article reviews the historical development of sugar beet breeding at the Ivanivska research-selection station. Over 100 years of breeding work at the station, its scientific staff have created and introduced into production more than 20 varieties of sugar beet.

  8. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    NARCIS (Netherlands)

    Cohen, K. M.; Stouthamer, E.; Hoek, W. Z.; Middelkoop, H.

    2012-01-01

    The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain is of very heterogeneous build up, with clayey and peaty flood basins, dissected by sandy fluvial distributary channel belts with fine textured…

  9. The Observation Of Defects Of School Buildings Over 100 Years Old In Perak

    Directory of Open Access Journals (Sweden)

    Alauddin Kartina

    2016-01-01

    Full Text Available Malaysia is blessed with a rich legacy of heritage buildings with unique architectural and historical value. These heritage buildings have become a symbol of our country's national identity. Therefore, heritage buildings, as important monuments, should be well conserved to extend the buildings' life span and to ensure the continued functioning of the buildings for future generations. The aim of this study is to analyze the types of defects found in school buildings over 100 years old located in Perak. The data were collected in four different schools aged over 100 years in Perak. The findings highlight the types of defects, categorized by building element: external wall, roof, door, ceiling, staircase, column, internal wall, floor and windows. The findings show that the types of defects occurring in school buildings over 100 years old in Perak are the same as in other heritage buildings. These findings can be used by all parties to take serious action to prevent defects from occurring in buildings over 100 years old, ensuring that the buildings' functional life span can be extended for future use.

  10. Molecules at surfaces: 100 years of physical chemistry in Berlin-Dahlem.

    Science.gov (United States)

    Ertl, Gerhard

    2013-01-02

    Scratching the surface: for over 100 years the interactions of molecules at surfaces have been studied at the Fritz Haber Institute of the Max Planck Society, Berlin. Nobel Laureate Gerhard Ertl looks back at some of the key developments in this time, and the people who made them. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. REVIEW OF MODERN NON‐SURGICAL TOOLS APPLIED IN CARDIAC SURGERY

    Directory of Open Access Journals (Sweden)

    Marcin MARUSZEWSKI

    2013-04-01

    Full Text Available Surgical intervention is commonly associated with the use of hardware that facilitates invasive medical treatment. Nowadays surgeons apply a new set of tools that help them anticipate the outcome of an intervention and identify potential risk factors. Increasing patient migration has inspired healthcare professionals to introduce universal standards of care, supported by medical guidelines and checklists. Today, prior to skin incision, every modern cardiac surgeon has at hand a whole range of tools designed to increase patient safety and provide thorough information to the whole medical team.

  12. Murine features of neurogenesis in the human hippocampus across the lifespan from 0 to 100 years.

    Directory of Open Access Journals (Sweden)

    Rolf Knoth

    Full Text Available BACKGROUND: Essentially all knowledge about adult hippocampal neurogenesis in humans still comes from one seminal study by Eriksson et al. in 1998, although several others have provided suggestive findings. But little information has been available on how far the situation in animal models reflects the conditions in the adult and aging human brain. We therefore mapped numerous features associated with adult neurogenesis in rodents in samples from the human hippocampus across the entire lifespan. Such data do not offer proof of adult neurogenesis in humans, because they rest on the assumption that humans and rodents share marker expression patterns in adult neurogenesis. Nevertheless, together the data provide valuable information at least about the presence of markers, for which a link to adult neurogenesis might more reasonably be assumed than for others, in the adult human brain and their change with increasing age. METHODS AND FINDINGS: In rodents, doublecortin (DCX) is transiently expressed during adult neurogenesis and, within the neurogenic niche of the dentate gyrus, can serve as a valuable marker. We validated DCX as a marker of granule cell development in fetal human tissue and used DCX expression as a seed to examine the dentate gyrus for additional neurogenesis-associated features across the lifespan. We studied 54 individuals and detected DCX expression between birth and 100 years of age. The caveats for post-mortem analyses of human tissues apply, but all samples were free of signs of ischemia and activated caspase-3. Fourteen markers related to adult hippocampal neurogenesis in rodents were assessed in DCX-positive cells. Total numbers of DCX-expressing cells declined exponentially with increasing age, and co-expression of DCX with the other markers decreased. This argued against a non-specific re-appearance of immature markers in specimens from old brains.
    Early postnatally, all 14 markers were co-expressed in DCX-positive cells…

  13. A MODEL TO EVALUATE 100-YEAR ENERGY-MIX SCENARIOS TO FACILITATE DEEP DECARBONIZATION IN THE SOUTHEASTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Adkisson, Mary A [ORNL; Qualls, A L [ORNL

    2016-08-01

    The Southeast United States consumes approximately one billion megawatt-hours of electricity annually; roughly two-thirds from carbon dioxide (CO2) emitting sources. The balance is produced by non-CO2 emitting sources: nuclear power, hydroelectric power, and other renewables. Approximately 40% of the total CO2 emissions come from the electric grid. The CO2 emitting sources, coal, natural gas, and petroleum, produce approximately 372 million metric tons of CO2 annually. The rest is divided between the transportation sector (36%), the industrial sector (20%), the residential sector (3%), and the commercial sector (2%). An Energy Mix Modeling Analysis (EMMA) tool was developed to evaluate 100-year energy mix strategies to reduce CO2 emissions in the southeast. Current energy sector data was gathered and used to establish a 2016 reference baseline. The spreadsheet-based calculation runs 100-year scenarios based on current nuclear plant expiration dates, assumed electrical demand changes from the grid, assumed renewable power increases and efficiency gains, and assumed rates of reducing coal generation and deployment of new nuclear reactors. Within the model, natural gas electrical generation is calculated to meet any demand not met by other sources. Thus, natural gas is viewed as a transitional energy source that produces less CO2 than coal until non-CO2 emitting sources can be brought online. The annual production of CO2 and spent nuclear fuel and the natural gas consumed are calculated and summed. A progression of eight preliminary scenarios show that nuclear power can substantially reduce or eliminate demand for natural gas within 100 years if it is added at a rate of only 1000 MWe per year. Any increases in renewable energy or efficiency gains can offset the need for nuclear power. However, using nuclear power to reduce CO2 will result in significantly more spent fuel. 
    More efficient advanced reactors can only marginally reduce the amount of spent fuel generated in…
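
The residual-gas bookkeeping described above can be sketched in a few lines. All parameter values below are illustrative placeholders of our own, not the EMMA baseline data:

```python
def run_scenario(years=100, demand_twh=1000.0, nuclear_twh=200.0,
                 renewables_twh=100.0, nuclear_add_twh_per_year=8.0,
                 gas_co2_mt_per_twh=0.4):
    """Sum gas generation and its CO2 over a simple multi-decade scenario."""
    total_gas_twh = total_co2_mt = 0.0
    for _ in range(years):
        nuclear_twh += nuclear_add_twh_per_year      # new builds come online
        clean = min(demand_twh, nuclear_twh + renewables_twh)
        gas = demand_twh - clean                     # gas fills the residual
        total_gas_twh += gas
        total_co2_mt += gas * gas_co2_mt_per_twh
    return total_gas_twh, total_co2_mt

gas_twh, co2_mt = run_scenario()
print(f"cumulative gas: {gas_twh:.0f} TWh, cumulative CO2: {co2_mt:.0f} Mt")
```

Treating natural gas as the residual source, as in the model, means any added nuclear or renewable capacity directly shrinks the gas and CO2 totals, which is the trade-off the eight preliminary scenarios explore.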

  14. Struggles for Perspective: A Commentary on ""One Story of Many to Be Told": Following Empirical Studies of College and Adult Writing through 100 Years of NCTE Journals"

    Science.gov (United States)

    Brandt, Deborah

    2011-01-01

    In this article, the author comments on Kevin Roozen and Karen Lunsford's insightful examination of empirical studies of college and adult writing published in NCTE journals over the last 100 years. One sees in their account the struggles for perspective that marked writing studies in this period, as researchers applied ever wider lenses to the…

  15. Flood-hazard study: 100-year flood stage for Lucerne Lake, San Bernardino County, California

    Science.gov (United States)

    Busby, Mark William

    1977-01-01

    A study of the flood hydrology of Lucerne Valley, Calif., was made to develop the 100-year stage for Lucerne Lake. Synthetic-hydrologic techniques were used, and the 100-year flood stage was estimated to be at an elevation of 2,849.3 feet above mean sea level. Channel dimensions were measured at 59 sites in Lucerne Valley. Drainage area-discharge relations developed from channel-geometry data for nearby sites were used to estimate the discharge at 12 additional sites where channel geometry could not be measured. In order to compute the total volume discharged into the playa, the peak discharges were converted to volumes. The equation from the Apple Valley report (Busby, 1975), formulated from the relation between peak discharge and flood volume for the deserts of California, was used to compute the flood volumes for routing into Lucerne Lake. (Woodard-USGS)

  16. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    Science.gov (United States)

    Cohen, K. M.; Stouthamer, E.; Hoek, W. Z.; Middelkoop, H.

    2012-04-01

    The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain is of very heterogeneous build up, with clayey and peaty flood basins, dissected by sandy fluvial distributary channel belts with fine textured levees grading into tidal-influenced rivers and estuaries. Several generations of precursor rivers occur as alluvial ridges and buried ribbon sands. They form an intricate network originating from repeated avulsions, back to 8000 years ago. Present rivers have been embanked since ca. 1250 AD and the delta plain (~ 3000 km2) has been reclaimed for agriculture. Soils are young and subject to oxidation and compaction. The first detailed field map of channel belts and floodbasins was made in 1926 by Vink, a geography teacher from Amsterdam. Soil mapping and Holocene geology gained interest after WW-II, with Wageningen soil scientists Edelman, Hoeksema and Pons taking lead. Utrecht University started teaching and research on the subject in 1959, launching an undergraduate mapping field course based on hand augering and field observation. An archive of borehole logs and local maps started to build up. Initially focused on soil mapping, from 1973 the course shifted to a geomorphological-geological focus. Berendsen took over supervision, introduced standard description protocols and legends and increased coring depth. This resulted in 1982 in his influential PhD thesis on the Rhine delta's genesis. New coring and sampling methods came and extensive 14C dating campaigns began. With steadily increasing numbers of students, accumulation of data speeded up, and increasingly larger parts of the delta were mapped. The academic mapping ran in parallel with soil survey and geological survey mapping campaigns. The computer was introduced in the field course and digital data archiving began in 1989. 
A series of PhD studies on thematic aspects of delta evolution and an increasing number…

  17. 100 years of Epilepsia: landmark papers and their influence in neuropsychology and neuropsychiatry.

    Science.gov (United States)

    Hermann, Bruce

    2010-07-01

    As part of the 2009 International League Against Epilepsy (ILAE) Centenary Celebration, a special symposium was dedicated to Epilepsia (100 Years of Epilepsia: Landmark Papers and Their Influence). The Associate Editors were asked to identify a particularly salient and meaningful paper in their areas of expertise. From the content areas of neuropsychology and neuropsychiatry two very interesting papers were identified using quite different ascertainment techniques. One paper addressed the problem of psychosis in temporal lobe epilepsy, whereas the other represents the first paper to appear in Epilepsia presenting quantitative assessment of cognitive status in epilepsy. These two papers are reviewed in detail and placed in historical context.

  18. Accuracy assessment of the UT1 prediction method based on 100-year series analysis

    CERN Document Server

    Malkin, Z; Tolstikov, A

    2013-01-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and Pole coordinates. The method is based on construction of a general polyharmonic model of the variations of the Earth rotation parameters using all the data available for the last 80-100 years, and modified autoregression technique. In this presentation, a detailed comparison was made of real-time UT1 predictions computed making use of this method in 2006-2010 with simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS). Obtained results have shown that proposed method provides better accuracy at different prediction lengths.
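
The two-stage scheme described above (a polyharmonic model plus autoregression on the residuals) can be sketched as follows; this is our own minimal reconstruction of the idea, not the SNIIM implementation, and the series, periods and orders are invented:

```python
import numpy as np

def fit_predict(t, y, periods, ar_order, horizon):
    """Polyharmonic least-squares fit plus AR(ar_order) on the residuals."""
    def design(tt):
        cols = [np.ones_like(tt), tt]                # mean + linear trend
        for period in periods:
            cols.append(np.sin(2 * np.pi * tt / period))
            cols.append(np.cos(2 * np.pi * tt / period))
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(design(t), y, rcond=None)
    resid = y - design(t) @ coef

    # Fit AR coefficients by least squares on lagged residuals.
    Z = np.column_stack([resid[ar_order - k - 1: len(resid) - k - 1]
                         for k in range(ar_order)])
    phi, *_ = np.linalg.lstsq(Z, resid[ar_order:], rcond=None)

    # Extrapolate: harmonic part directly, AR part recursively.
    tf = t[-1] + 1 + np.arange(horizon)
    r = list(resid)
    for _ in range(horizon):
        r.append(sum(phi[k] * r[-1 - k] for k in range(ar_order)))
    return design(tf) @ coef + np.array(r[len(resid):])

# Demo on a synthetic series with an 11-year cycle plus noise.
rng = np.random.default_rng(1)
t = np.arange(200.0)
y = 2.0 * np.sin(2 * np.pi * t / 11.0) + 0.3 + 0.05 * rng.standard_normal(200)
pred = fit_predict(t, y, periods=[11.0], ar_order=3, horizon=10)
print("10-step prediction:", np.round(pred, 2))
```

The harmonic part carries the known periodicities over the whole record, while the AR part soaks up short-range memory in whatever the harmonics miss, mirroring the "general polyharmonic model plus modified autoregression" split.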

  19. Fascia Research Congress evidence from the 100 year perspective of Andrew Taylor Still.

    Science.gov (United States)

    Findley, Thomas W; Shalwala, Mona

    2013-07-01

    More than 100 years ago A.T. Still MD founded osteopathic medicine, and specifically described fascia as a covering, with common origins of layers of the fascial system despite diverse names for individual parts. Fascia assists gliding and fluid flow and is highly innervated. Fascia is intimately involved with respiration and with nourishment of all cells of the body, including those of disease and cancer. This paper reviews information presented at the first three International Fascia Research Congresses in 2007, 2009 and 2012 from the perspective of Dr Still, that fascia is vital for organism's growth and support, and it is where disease is sown. Published by Elsevier Ltd.

  20. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

    Full Text Available After the financial crisis that began in 2008, the international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information in order to identify the potential risks hidden in the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify risk of propagation and market behavior, without the need for expensive and demanding technical architectures.
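
As a hedged sketch of the kind of network analysis the abstract describes (our own toy construction, with made-up banks, capital buffers and exposures), a default cascade over a reported-exposure network can be computed directly:

```python
def cascade(exposures, initial_defaults, capital):
    """Propagate defaults through a directed exposure network.

    exposures[(a, b)] = amount b loses if a defaults;
    a bank defaults once its losses reach its capital buffer.
    """
    defaulted = set(initial_defaults)
    while True:
        losses = {bank: 0.0 for bank in capital}
        for (a, b), amount in exposures.items():
            if a in defaulted:
                losses[b] += amount
        newly = {bank for bank, loss in losses.items()
                 if loss >= capital[bank] and bank not in defaulted}
        if not newly:                 # cascade has stopped
            return defaulted, losses
        defaulted |= newly

capital = {"A": 10.0, "B": 5.0, "C": 8.0, "D": 50.0}
exposures = {("A", "B"): 6.0, ("A", "C"): 3.0,
             ("B", "C"): 6.0, ("C", "D"): 20.0}
defaulted, losses = cascade(exposures, {"A"}, capital)
print("defaulted:", sorted(defaulted))   # → defaulted: ['A', 'B', 'C']
```

Here C survives A's default alone but not the combined loss once B fails, the kind of second-round propagation that trade-by-trade reporting makes visible to supervisors.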

  1. 100-year history of the development of bread winter wheat breeding programs

    Directory of Open Access Journals (Sweden)

    М. А. Литвиненко

    2016-05-01

    Full Text Available Purpose. To review the main achievements of the Wheat Breeding and Seed Production Department of the Plant Breeding and Genetics Institute – National Centre of Seed and Cultivar Investigation in developing the theoretical principles of breeding and creating winter wheat varieties of different types during the 100-year (1916–2016) period of its breeding programs. Results. The main theoretical and methodological developments and breeding achievements of the Wheat Breeding and Seed Production Department during its 100-year (1916–2016) history are considered. In the course of the Department's activity, the research and methodological foundations of bread winter wheat breeding and seed production were laid, and 9 stages of breeding program development were accomplished. As a result, more than 130 varieties of different types have been created; 87 of them have been released at various times or registered in the State registers of plant varieties of Ukraine and other countries, and grown on a total sowing area of about 220 million hectares.

  2. Oceanic environmental changes of subarctic Bering Sea in recent 100 years: Evidence from molecular fossils

    Institute of Scientific and Technical Information of China (English)

    LU Bing; CHEN Ronghua; ZHOU Huaiyang; WANG Zipan; CHEN…

    2005-01-01

    The core sample B2-9 from the seafloor of the subarctic Bering Sea was dated with 210Pb to obtain a consecutive sequence of oceanic sedimentary environments at decadal intervals during 1890-1999. A variety of molecular fossils were detected, including n-alkanes, isoprenoids, fatty acids, sterols, etc. From the characteristics of these molecules (C27, C28 and C29 sterols) and their molecular indices (Pr/Ph, ∑C22+/∑C21-, CPI and C18:2/C18:0), and in consideration of the variation of organic carbon content, the 100-year evolution history of the subarctic sea paleoenvironment was reestablished. It is indicated that during the past 100 years in the Arctic there were two strong climate warming events (1920-1950 and 1980-1999), which resulted in an oxidized sediment environment owing to decreasing terrigenous organic matter and increasing marine-derived organic matter, and two transitory climate cooling events (1910 and 1970-1980), which resulted in a slightly reducing sediment environment owing to increasing terrigenous organic matter and decreasing marine-derived organic matter. It is revealed that these alternating warming/cooling processes are directly related to Arctic and global climate variations.
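
The 210Pb dating mentioned above rests on simple decay arithmetic. Assuming the basic constant-initial-activity model (the study's actual age model may differ), a layer's age follows from how much excess 210Pb remains:

```python
import math

HALF_LIFE_PB210 = 22.3                        # years
DECAY_CONST = math.log(2) / HALF_LIFE_PB210   # decay constant, 1/years

def layer_age_years(surface_activity, layer_activity):
    """Age of a layer whose excess 210Pb decayed from the surface value."""
    return math.log(surface_activity / layer_activity) / DECAY_CONST

# A layer retaining a quarter of the surface activity is two half-lives old:
print(round(layer_age_years(100.0, 25.0), 1))  # → 44.6
```

With a half-life of about 22.3 years, this is why 210Pb resolves roughly the last century of sedimentation, matching the 1890-1999 window of the core.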

  3. Did Open Solar Magnetic Field Increase during the Last 100 Years: A Reanalysis of Geomagnetic Activity

    CERN Document Server

    Mursula, K; Karinen, A

    2004-01-01

    Long-term geomagnetic activity presented by the aa index has been used to show that the heliospheric magnetic field has more than doubled during the last 100 years. However, serious concern has been raised on the long-term consistency of the aa index and on the centennial rise of the solar magnetic field. Here we reanalyze geomagnetic activity during the last 100 years by calculating the recently suggested IHV (Inter-Hour Variability) index as a measure of local geomagnetic activity for seven stations. We find that local geomagnetic activity at all stations follows the same qualitative long-term pattern: an increase from early 1900s to 1960, a dramatic dropout in 1960s and a (mostly weaker) increase thereafter. Moreover, at all stations, the activity at the end of the 20th century has a higher average level than at the beginning of the century. This agrees with the result based on the aa index that global geomagnetic activity, and thereby, the open solar magnetic field has indeed increased during the last 100...

  4. Sedimentary records of eutrophication and hypoxia in the Changjiang Estuary over the last 100 years

    Science.gov (United States)

    Xuwen, F.; Hongliang, L.; Zhao, M.; Xuefa, S.

    2012-12-01

    We selected two cores in the Changjiang Estuary: one located in the Changjiang Estuary mud area (CEMA) within the region of seasonal hypoxia, the other located in the Cheju Island mud area (SCIMA), outside the hypoxia region. The grain size, total organic carbon (TOC), stable carbon isotopic ratios (δ13Corg), biomarkers (the sum of brassicasterol, dinosterol and alkenone) and some redox-sensitive elements (RSEs) were determined on the 210Pb-dated sediment cores to study eutrophication and hypoxia over the past hundred years. The sediment record in the CEMA showed increases in TOC (21%), biomarkers (141%) and δ13Corg (1.6‰ PDB) since the 1950s, with a marked increase since the 1970s. These distributions indicated enhanced productivity and established the history of eutrophication in the Changjiang Estuary during the past 100 years. Some RSEs have been enriched significantly since the late 1960s to 1970s; the ratios Mo/Al, Cd/Al and As/Al increased by about 83%, 73% and 50%, respectively. These data may indicate the onset of hypoxia in the Changjiang Estuary during the last 100 years. The increasing marine organic matter and RSE accumulation corresponded with the fertilizer consumption and high nutrient inputs from the Changjiang River. The riverine runoff of fertilizers and nutrients stimulated blooms of algae (indicated by, e.g., brassicasterol and dinosterol). Enhanced primary production resulted in an enrichment of organic matter, and hypoxia favored the preservation of organic matter in the sediment. For the core sediment in the SCIMA, the geochemical indicators (TOC, biomarkers and δ13Corg) increased to different degrees before the 1950s-1970s and then remained almost constant. Productivity in the SCIMA has been mainly influenced by climate and ocean circulation changes over the last 100 years. The RSEs were controlled by "grain size effects", which indicated that no hypoxia occurred. This study concluded that δ13Corg, RSEs and biomarkers in sediment could be used to trace or…

  5. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic components are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A. "Non-Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)
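
To make the non-overlapping DDM idea concrete, here is a minimal toy example (ours, not the DVS-Software): a 1D Poisson problem split into two subdomains whose interiors are eliminated in favor of a single interface unknown (a Schur complement):

```python
import numpy as np

n = 9                         # interior grid points on (0, 1), h = 1/(n+1)
h = 1.0 / (n + 1)
mid = n // 2                  # the single interface unknown

# Standard second-order finite-difference Laplacian for -u'' = 1, u(0)=u(1)=0.
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
f = np.ones(n)
u_ref = np.linalg.solve(A, f)         # one global solve, for comparison

# Non-overlapping split: two subdomain interiors plus the interface.
interior = list(range(mid)) + list(range(mid + 1, n))
Aii = A[np.ix_(interior, interior)]   # block-diagonal: subdomains decouple
Aig = A[np.ix_(interior, [mid])]
Agi = A[np.ix_([mid], interior)]
Agg = A[np.ix_([mid], [mid])]

# Schur complement system for the interface value alone.
S = Agg - Agi @ np.linalg.solve(Aii, Aig)
g = f[[mid]] - Agi @ np.linalg.solve(Aii, f[interior])
u_mid = np.linalg.solve(S, g)

# Back-substitute each subdomain interior (independent, parallelizable).
u_int = np.linalg.solve(Aii, f[interior] - (Aig @ u_mid))
u = np.empty(n)
u[interior] = u_int
u[mid] = u_mid[0]
print("max |u - u_ref| =", float(np.max(np.abs(u - u_ref))))
```

The point of the decomposition is that the `Aii` solves split into independent per-subdomain solves, which is what parallel DDM codes distribute across processors; only the small interface system is global.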

  6. Operation of the aircraft as a discipline of knowledge after 100 years of experience

    Directory of Open Access Journals (Sweden)

    Stanisław Danilecki

    2015-12-01

    Full Text Available The paper presents the most important stages in the development of the discipline related to aircraft maintenance, as a synthesis of 100 years of experience. The links between the technical maintenance of aircraft and other disciplines of knowledge are presented, and the scope of technical maintenance is defined. The maintenance methods are classified and analysed. Definitions used in maintenance theory are given and discussed, in conjunction with the construction of the aircraft and the tasks arising for the designer-manufacturer. Subsequent versions of the MSG document, which establishes the logistical procedures for determining the programmed maintenance of civil airplanes, are also discussed.[b]Keywords[/b]: aviation, aircraft, aircraft maintenance

  7. Sustainable Foods and Medicines Support Vitality, Sex and Longevity for a 100-Year Starship Expedition

    Science.gov (United States)

    Edwards, M. R.

    Extended space flight requires foods and medicines that sustain crew health and vitality. The health and therapeutic needs of the entire crew and their children must be sustainable over a 100-year space flight. The starship cannot depend on resupply or carry a large cargo of pharmaceuticals. Everything in the starship must be completely recyclable and reconstructable, including food, feed, textiles, building materials, pharmaceuticals, vaccines, and medicines. Smart microfarms will produce functional foods with superior nutrition and sensory attributes. These foods provide high-quality protein and nutralence (nutrient density), helping to avoid obesity, diabetes, and other Western diseases. The combination of functional foods, lifestyle actions, and medicines will support crew immunity, energy, vitality, sustained strong health, and longevity. Smart microfarms enable the production of fresh medicines in hours or days, eliminating the need for a large dispensary and with it any concern over drug shelf life. Smart microfarms are adaptable to the extreme growing-area, resource, and environmental constraints associated with an extended starship expedition.

  8. [The 100-year anniversary of Eugene Jamot's (1879-1937) admittance to the Pharo School].

    Science.gov (United States)

    Milleliri, J M; Louis, F J

    2010-04-01

    For the 100-year anniversary of Dr. Eugene Jamot's (1879-1937) admittance to the Pharo School (then known as the Training School of the Colonial Army Health Corps), the authors describe the life of a French military physician working in Africa. Eugene Jamot devoted 22 years of his life to fighting sleeping sickness. Using a standardized approach that has become a textbook example, he was highly successful in controlling this dreaded tropical disease. Despite being criticized by some officials of the colonial administration and becoming the target of an obvious smear campaign because of his strong personality and growing fame, Jamot handed down a set of values that are recognized by most physicians working to improve the living conditions of the unfortunately still suffering African population.

  9. Prediction of Climatic Change for the Next 100 Years in the Apulia Region, Southern Italy

    Directory of Open Access Journals (Sweden)

    Mladen Todorovic

    2007-12-01

    Full Text Available The impact of climate change on water resources and their use for agricultural production has become a critical question for sustainability. Our objective was to investigate the impact of the expected climate changes over the next 100 years on water balance variations, climatic classifications, and crop water requirements in the Apulia region (Southern Italy). The results indicated that an increase in temperature of between 1.3 and 2.5 °C is expected in the next 100 years. Reference evapotranspiration (ETo) variations would follow a similar trend; averaged over the whole region, the ETo increase would be about 15.4%. Precipitation will not change significantly on a yearly basis, although a slight decrease in the summer months and a slight increase during the winter season are foreseen. The climatic water deficit (CWD) is largely driven by the ETo increase, and it would increase over the whole Apulia region by more than 200 mm on average. According to the Thornthwaite and Mather climate classification, the moisture index will decrease in the future, with humid areas shrinking and aridity zones expanding. The net irrigation requirements (NIR), calculated for ten major crops in the Apulia region, would increase significantly in the future. By the end of the 21st century, the foreseen increase of NIR, with respect to the current situation, is greatest for olive tree (65%), wheat (61%), grapevine (49%), and citrus (48%), and slightly lower for maize (35%), sorghum (34%), sunflower (33%), tomato (31%), and winter and spring sugar beet (both 27%).
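
    The Thornthwaite-Mather moisture index mentioned above can be computed, in its simplified annual form Im = 100 (P - PET) / PET, as in this sketch. The class limits and station values are illustrative assumptions, not the study's data.

```python
# Thornthwaite-Mather moisture index in its simplified annual form.
# Class limits below are coarse, illustrative values after Thornthwaite;
# the station rainfall/PET figures are invented, not the study's data.
def moisture_index(precip_mm, pet_mm):
    return 100.0 * (precip_mm - pet_mm) / pet_mm

def climate_class(im):
    if im >= 100: return "perhumid"
    if im >= 20:  return "humid"
    if im >= 0:   return "moist subhumid"
    if im >= -33: return "dry subhumid"
    if im >= -67: return "semiarid"
    return "arid"

# Hypothetical Apulia-like station: rainfall unchanged, PET up 15.4%
p, pet_now = 560.0, 820.0
pet_future = pet_now * 1.154
print(climate_class(moisture_index(p, pet_now)),
      "->", climate_class(moisture_index(p, pet_future)))
# prints: dry subhumid -> semiarid
```

    With rainfall flat and PET rising, the index alone can push a station into a drier class, which is the mechanism behind the expanding aridity zones the abstract reports.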

  10. 77 FR 66823 - Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship...

    Science.gov (United States)

    2012-11-07

    ... of the Secretary Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship Symposium AGENCY: Defense Advanced Research Projects Agency (DARPA), DoD. ACTION: Notice... panels at the 2011 100 Year Starship Symposium must provide DARPA a written response explaining...

  11. Control of solid tobacco emissions in industrial factories applying CFD tools

    Directory of Open Access Journals (Sweden)

    G Polanco

    2016-09-01

    Full Text Available The emission of light solid aromatic particles from a tobacco factory affects the surrounding inhabitants, commonly causing allergies, eye irritation and, of course, unpleasant odours; these emissions to the air must therefore be regulated. An increase in production must be considered when sizing the mechanisms used to achieve precipitation and final filtration before discharge to the atmosphere. A numerical tool was applied to study the internal behaviour of the low-velocity precipitation tunnel and discharge chimney of the refuse treatment system. Characterizing the two-phase flow streamlines allowed the velocity gradient profiles across the whole tunnel to be determined; these are intimately related to the particle concentration and the location of deposition zones. The application of CFD techniques provides the basis for finding new design parameters to improve the precipitation tunnel's capability to manage the increased particle mass flow resulting from changes in cigarette production.

  12. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    the State (S) of the ecosystem, causing the Impacts (I) on these, and contributing to the management strategies and Responses (R). The latter are designed to modify the drivers, minimise the pressures and restore the state of the receiving ecosystem. In our opinion the DPSIR provides a good conceptual...... understanding that is well suited for sustainability teaching and communication purposes. Life Cycle Impact Assessment (LCIA) indicators aim at modelling the P-S-I parts and provide a good background for understanding D and R. As an example, the DPSIR approach was applied to the LCIA indicator marine...... assessment and response design ultimately benefit from spatial differentiation in the results. DPSIR based on LCIA seems a useful tool to improve communication and learning, as it bridges science and management while promoting the basic elements of sustainable development in a practical educational...

  13. Applying a Participatory Design Approach to Define Objectives and Properties of a "Data Profiling" Tool for Electronic Health Data.

    Science.gov (United States)

    Estiri, Hossein; Lovins, Terri; Afzalan, Nader; Stephens, Kari A

    2016-01-01

    We applied a participatory design approach to define the objectives, characteristics, and features of a "data profiling" tool for primary care Electronic Health Data (EHD). Through three participatory design workshops, we collected input from potential tool users who had experience working with EHD. We present 15 recommended features and characteristics for the data profiling tool. From these recommendations we derived three overarching objectives and five properties for the tool. A data profiling tool, in Biomedical Informatics, is a visual, clear, usable, interactive, and smart tool that is designed to inform clinical and biomedical researchers of data utility and let them explore the data, while conveniently orienting the users to the tool's functionalities. We suggest that developing scalable data profiling tools will provide new capacities to disseminate knowledge about clinical data that will foster translational research and accelerate new discoveries.

  14. Applying a visual language for image processing as a graphical teaching tool in medical imaging

    Science.gov (United States)

    Birchman, James J.; Tanimoto, Steven L.; Rowberg, Alan H.; Choi, Hyung-Sik; Kim, Yongmin

    1992-05-01

    Typical user interaction in image processing is with command line entries, pull-down menus, or text menu selections from a list, and as such is not generally graphical in nature. Although applying these interactive methods to construct more sophisticated algorithms from a series of simple image processing steps may be clear to engineers and programmers, it may not be clear to clinicians. A solution to this problem is to implement a visual programming language using visual representations to express image processing algorithms. Visual representations promote a more natural and rapid understanding of image processing algorithms by providing more visual insight into what the algorithms do than the interactive methods mentioned above can provide. Individuals accustomed to dealing with images will be more likely to understand an algorithm that is represented visually. This is especially true of referring physicians, such as surgeons in an intensive care unit. With the increasing acceptance of picture archiving and communications system (PACS) workstations and the trend toward increasing clinical use of image processing, referring physicians will need to learn more sophisticated concepts than simply image access and display. If the procedures that they perform commonly, such as window width and window level adjustment and image enhancement using unsharp masking, are depicted visually in an interactive environment, it will be easier for them to learn and apply these concepts. The software described in this paper is a visual programming language for image processing which has been implemented on the NeXT computer using NeXTstep user interface development tools and other tools in an object-oriented environment. The concept is based upon the description of a visual language titled `Visualization of Vision Algorithms' (VIVA). Iconic representations of simple image processing steps are placed into a workbench screen and connected together into a dataflow path by the user. As
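
    One of the routine operations the paper expects clinicians to compose visually, unsharp masking, reduces to a few array operations: sharpened = image + amount x (image - blurred). The 3x3 box blur and amount below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Unsharp masking on a toy image. A box blur stands in for whatever
# smoothing node a visual dataflow editor would provide.
def box_blur(img):
    """Mean over each pixel's 3x3 neighbourhood, with edge padding."""
    out = np.zeros(img.shape, dtype=float)
    pad = np.pad(img.astype(float), 1, mode="edge")
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

def unsharp_mask(img, amount=1.0):
    return img + amount * (img - box_blur(img))

# A vertical step edge: sharpening exaggerates contrast at the transition
img = np.tile([0, 0, 0, 100, 100, 100], (6, 1)).astype(float)
sharp = unsharp_mask(img)
print(sharp[2])  # overshoots below 0 and above 100 near the edge
```

    In a VIVA-style editor this whole expression would be three icons (blur, subtract, add) wired into one dataflow path.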

  15. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Directory of Open Access Journals (Sweden)

    Freddy Abel Vargas

    2010-04-01

    Full Text Available The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as a priority, paying special attention to technology, which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in developed countries' organisations; however, few suitable studies have been carried out in Colombia and few instruments produced for applying it to existing sectors of the economy. The present article attempts to define a methodology for technological awareness (based on transforming the information contained in databases by means of constructing technological maps) that contributes useful knowledge to production processes. This methodology has been applied to the flexible plastic packaging sector. The main trends in this industry's technological development were identified, allowing strategies to be proposed for incorporating these advances and tendencies in national companies and research groups involved in flexible plastic packaging technological development and innovation. Technological mapping's possibilities as an important instrument for producing technological development in a given sector are then analysed, as are its possibilities for being used in other production processes.

  16. [Nutrient dynamics over the past 100 years and its restoration baseline in Dianshan Lake].

    Science.gov (United States)

    Li, Xiao-Ping; Chen, Xiao-Hua; Dong, Xu-Hui; Dong, Zhi; Sun, Dun-Ping

    2012-10-01

    The restoration of eutrophic lakes requires a good knowledge of the history and baseline of nutrients in the lakes. This work conducted an analysis of 210Pb/137Cs, water content, loss-on-ignition, sedimentary total phosphorus (TP), total nitrogen (TN), total organic carbon (TOC) and diatoms in four sediment cores from Dianshan Lake (near Shanghai City). Good coherence in palaeoproxies between the cores indicates a relatively stable sedimentary environment. With increasing human impact, diatom communities shifted from oligotrophic species Cyclotella bodanica, C. ocelata, Achnanthes minutissima, Cocconeis placentula var lineate, Cymbella sp., Fragilaria pintata, F. brevistrata, F. construens var venter to recent eutrophic species including Cyclostephanos dubias, C. atomus, Stephanodiscus minitulus, S. hantzschi, Aulacoseria alpigena. The epilimnetic TP over the past 100 years, reconstructed using an established diatom-TP transfer function, matches well with the monitored TP where it exists. Based on the sedimentary nutrient characteristics and diatom-reconstructed nutrient dynamics, we proposed that the nutrient baseline for Dianshan Lake is 50-60 microg x L(-1), 500 mg x kg(-1) and 550 mg x kg(-1) for water TP concentration, sedimentary TP and TN, respectively.
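
    Diatom-TP transfer functions of the kind used above are commonly built by weighted averaging (WA). The sketch below shows the two WA steps (taxon optima from a training set, then downcore inference) with invented data and no deshrinking; it is not the calibration actually used in the study.

```python
import numpy as np

# Weighted-averaging (WA) transfer function, the classic approach behind
# diatom-based TP reconstructions. All numbers here are invented.
def wa_optima(abundances, tp):
    """Taxon optima = abundance-weighted mean of lake TP in a training set.

    abundances : (n_lakes, n_taxa) relative abundances
    tp         : (n_lakes,) observed TP in each training lake
    """
    return (abundances * tp[:, None]).sum(axis=0) / abundances.sum(axis=0)

def wa_reconstruct(sample, optima):
    """Inferred TP = abundance-weighted mean of the taxon optima."""
    return (sample * optima).sum() / sample.sum()

# Toy training set: taxon 0 favours oligotrophic, taxon 1 eutrophic lakes
train = np.array([[0.9, 0.1],
                  [0.5, 0.5],
                  [0.1, 0.9]])
tp = np.array([20.0, 60.0, 150.0])
opt = wa_optima(train, tp)

# A downcore sample dominated by the eutrophic taxon infers high TP
print(round(wa_reconstruct(np.array([0.2, 0.8]), opt), 1))  # 97.5
```

    Real calibrations add a deshrinking regression and cross-validated error estimates on top of these two averaging steps.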

  17. The Current Status of Ticks in Turkey: A 100-Year Period Review from 1916 to 2016.

    Science.gov (United States)

    İnci, Abdullah; Yıldırım, Alparslan; Düzlü, Önder

    2016-09-01

    Environmental and bio-ecological changes, some administrative and political mistakes, and global warming seriously affect the behaviors of ticks in Turkey and globally. The global public sensitivity toward tick infestations has increased along with increases in tick-borne diseases (TBDs). Recently, the World Health Organization (WHO) developed a new political concept, "One Health," for specific struggle strategies against tick infestations and TBDs. To highlight the importance of the issue, the WHO had declared the year 2015 for vector-borne diseases and adopted the slogan "small bites big threat". In global struggle strategies, the epidemiological aspects and dynamics of increasing tick populations and their effects on the incidence of the TBDs mainly with zoonotic characteristics have been specifically targeted. In Turkey, during the last century, approximately 47 tick species, including eight soft and 39 hard tick species in three and six genera belonging to Argasidae and Ixodidae, respectively, had already been reported. In this article, the recorded tick species, regional infestations, and medical and veterinary importance in Turkey were chronologically reviewed based on a 100-year period between 1916 and 2016.

  18. Lessons to be learned from an analysis of ammonium nitrate disasters in the last 100 years

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, William; Han, Zhe; Harding, Brian; Rosas, Camilo; Jiang, Jiaojun; Pineda, Alba; Mannan, M. Sam, E-mail: mannan@tamu.edu

    2014-09-15

    Highlights: • Root causes and contributing factors from ammonium nitrate incidents are categorized into 10 lessons. • The lessons learned from the past 100 years of ammonium nitrate incidents can be used to improve design, operation, and maintenance procedures. • Improving organizational memory to help improve safety performance. • Combating and changing organizational cultures. - Abstract: Process safety, as well as the safe storage and transportation of hazardous or reactive chemicals, has been a topic of increasing interest in the last few decades. The increased interest in improving the safety of operations has been driven largely by a series of recent catastrophes that have occurred in the United States and the rest of the world. A continuous review of past incidents and disasters to look for common causes and lessons is an essential component to any process safety and loss prevention program. While analyzing the causes of an accident cannot prevent that accident from occurring, learning from it can help to prevent future incidents. The objective of this article is to review a selection of major incidents involving ammonium nitrate in the last century to identify common causes and lessons that can be gleaned from these incidents in the hopes of preventing future disasters. Ammonium nitrate has been involved in dozens of major incidents in the last century, so a subset of major incidents were chosen for discussion for the sake of brevity. Twelve incidents are reviewed and ten lessons from these incidents are discussed.

  19. Physiological and morphological acclimation to height in cupressoid leaves of 100-year-old Chamaecyparis obtusa.

    Science.gov (United States)

    Shiraki, Ayumi; Azuma, Wakana; Kuroda, Keiko; Ishii, H Roaki

    2016-10-15

    Cupressoid (scale-like) leaves are morphologically and functionally intermediate between stems and leaves. While past studies on height acclimation of cupressoid leaves have focused on acclimation to the vertical light gradient, the relationship between morphology and hydraulic function remains unexplored. Here, we compared physiological and morphological characteristics between treetop and lower-crown leaves of 100-year-old Chamaecyparis obtusa Endl. trees (~27 m tall) to investigate whether height-acclimation compensates for hydraulic constraints. We found that physiological acclimation of leaves was determined by light, which drove the vertical gradient of evaporative demand, while leaf morphology and anatomy were determined by height. Compared with lower-crown leaves, treetop leaves were physiologically acclimated to water stress. Leaf hydraulic conductance was not affected by height, and this contributed to higher photosynthetic rates of treetop leaves. Treetop leaves had higher leaf area density and greater leaf mass per area, which increase light interception but could also decrease hydraulic efficiency. We inferred that transfusion tissue flanking the leaf vein, which was more developed in the treetop leaves, contributes to water-stress acclimation and maintenance of leaf hydraulic conductance by facilitating osmotic adjustment of leaf water potential and efficient water transport from xylem to mesophyll. Our findings may represent anatomical adaptation that compensates for hydraulic constraints on physiological function with increasing height.

  20. Revisiting extreme storms of the past 100 years for future safety of large water management infrastructures

    Science.gov (United States)

    Chen, Xiaodong; Hossain, Faisal

    2016-07-01

    Historical extreme storm events are widely used to make Probable Maximum Precipitation (PMP) estimates, which form the cornerstone of large water management infrastructure safety. Past studies suggest that extreme precipitation processes can be sensitive to land surface feedback and the planetary warming trend, which makes the future safety of large infrastructures questionable given the projected changes in land cover and temperature in the coming decades. In this study, a numerical modeling framework was employed to reconstruct 10 extreme storms over CONUS that occurred during the past 100 years, which are used by the engineering profession for PMP estimation for large infrastructures such as dams. Results show that the correlation in daily rainfall for such reconstruction can range between 0.4 and 0.7, while the correlation for maximum 3-day accumulation (a standard period used in infrastructure design) is always above 0.5 for post-1948 storms. This suggests that current numerical modeling and reanalysis data allow us to reconstruct big storms after 1948 with acceptable accuracy. For storms prior to 1948, however, reconstruction of storms shows inconsistency with observations. Our study indicates that numerical modeling and data may not have advanced to a sufficient level to understand how such old storms (pre-1948) may behave in future warming and land cover conditions. However, the infrastructure community can certainly rely on the use of model reconstructed extreme storms of the 1948-present period to reassess safety of our large water infrastructures under assumed changes in temperature and land cover.
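
    The "maximum 3-day accumulation" used above as the evaluation statistic is a rolling 3-day sum maximized over the storm; a minimal sketch with invented daily totals:

```python
import numpy as np

# Maximum n-day precipitation accumulation from a daily series -- the
# design statistic the study correlates between reconstructed and
# observed storms. The daily totals below are invented.
def max_ndays(daily, n=3):
    daily = np.asarray(daily, dtype=float)
    # Rolling n-day sums via a cumulative-sum difference
    c = np.concatenate(([0.0], np.cumsum(daily)))
    return (c[n:] - c[:-n]).max()

storm = [5, 40, 120, 95, 10, 2]      # mm/day, hypothetical event
print(max_ndays(storm))              # 255.0 (days 2-4: 40 + 120 + 95)
```

    Correlating this statistic between a model reconstruction and gauge observations (e.g. with `numpy.corrcoef`) gives the skill numbers the abstract quotes.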

  1. The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    CERN Document Server

    Holst, Michael; Tiglio, Manuel; Vallisneri, Michele

    2016-01-01

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this youn...

  2. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Science.gov (United States)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on differing characteristics between classes but great homogeneity within each of them. This cover information is obtained through field work or by means of processing satellite images. Field work involves high costs, so digital image processing techniques have become an important alternative for performing this task. However, in some developing countries, and particularly in the Casacoima municipality in Venezuela, geographic information systems are lacking due to outdated information and the high cost of software licenses. This research proposes a low-cost methodology to develop thematic mapping of local land use and cover types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised classification was applied per pixel and per region using different classification algorithms and comparing them with one another. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61 for reliability and kappa index, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools are an economically viable alternative not only for forestry organizations but for the general public, allowing projects to be developed in economically depressed and/or environmentally threatened areas.
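
    Of the algorithms compared above, the Euclidean minimum-distance rule is the simplest to sketch: each pixel is assigned to the class whose mean spectral vector is nearest. The band values and class means below are invented.

```python
import numpy as np

# Per-pixel minimum-distance (Euclidean) classification, one of the
# supervised algorithms compared in the study. Data are invented.
def min_distance_classify(pixels, class_means):
    """pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands)."""
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)  # index of the nearest class mean per pixel

means = np.array([[30.0, 90.0],    # class 0 mean, e.g. water (2 bands)
                  [120.0, 40.0]])  # class 1 mean, e.g. bare soil
pix = np.array([[35.0, 80.0],
                [110.0, 50.0]])
labels = min_distance_classify(pix, means)
print(labels)  # [0 1]
```

    Maximum likelihood (Maxver) generalizes this rule by replacing the Euclidean distance with a Mahalanobis-style distance that accounts for each class's covariance.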

  3. Induced metamorphosis in crustacean y-larvae: Towards a solution to a 100-year-old riddle

    Directory of Open Access Journals (Sweden)

    Grygier Mark J

    2008-05-01

    Full Text Available Abstract Background The y-larva, a crustacean larval type first identified more than 100 years ago, has been found in marine plankton samples collected in the arctic, temperate and tropical regions of all oceans. The great species diversity found among y-larvae (we have identified more than 40 species at our study site alone) indicates that the adult organism may play a significant ecological role. However, despite intense efforts, the adult y-organism has never been identified, and nothing is therefore known about its biology. Results We have successfully and repeatedly induced metamorphosis of y-larvae into a novel, highly reduced juvenile stage by applying the crustacean molting hormone 20-HE. The new stage is slug-like, unsegmented and lacks both limbs and almost all other traits normally characterizing arthropods, but it is capable of vigorous peristaltic motions. Conclusion From our observations on live and preserved material we conclude that adult Facetotecta are endoparasitic in still to be identified marine hosts, with a juvenile stage that represents a remarkable convergence to that seen in parasitic barnacles (Crustacea: Cirripedia: Rhizocephala). From the distribution and abundance of facetotectan y-larvae in the world's oceans we furthermore suggest that these parasites are widespread and could play an important role in the marine environment.

  4. To Humbly Go: Guarding Against Perpetuating Models of Colonization in the 100-Year Starship Study

    Science.gov (United States)

    Kramer, W. R.

    Past patterns of exploration, colonization and exploitation on Earth continue to provide the predominant paradigms that guide many space programs. Any project of crewed space exploration, especially of the magnitude envisioned by the 100-Year Starship Study, must guard against the hubris that may emerge among planners, crew, and others associated with the project, including those industries and bureaucracies that will emerge from the effort. Maintaining a non-exploitative approach may be difficult in consideration of the century of preparatory research and development and the likely multigenerational nature of the voyage itself. Starting now with mission dreamers and planners, the purpose of the voyage must be cast as one of respectful learning and humble discovery, not of conquest (either actual or metaphorical) or other inappropriate models, including military. At a minimum, the Study must actively build non-violence into the voyaging culture it is beginning to create today. References to exploitive colonization, conquest, destiny and other terms from especially American frontier mythology, while tempting in their propagandizing power, should be avoided as they limit creative thinking about alternative possible futures. Future voyagers must strive to adapt to new environments wherever possible and be assimilated by new worlds both biologically and behaviorally rather than to rely on attempts to recreate the Earth they have left. Adaptation should be strongly considered over terraforming. This paper provides an overview of previous work linking the language of colonization to space programs and challenges the extension of the myth of the American frontier to the Starship Study. It argues that such metaphors would be counter-productive at best and have the potential to doom long-term success and survival by planting seeds of social decay and self-destruction. Cautions and recommendations are suggested.

  5. Rainfall and Drought In Equatorial East Africa During The Past 1,100 Years

    Science.gov (United States)

    Verschuren, D.; Laird, K. R.; Cumming, B. F.

    Knowledge of natural long-term rainfall variability is essential for water-resource and land-use management in all dry-land regions of the world. In tropical Africa, data relevant to determining this variability are scarce because of the lack of long instrumental climate records and the limited potential of some high-resolution proxy climate archives such as tree rings and ice cores. An 1,100-year reconstruction of decadal-scale variability in African rainfall and drought based on lake-level and salinity fluctuations of Lake Naivasha (Eastern Rift Valley, Kenya) now indicates that eastern equatorial Africa within the last millennium has alternated between strongly contrasting climatic conditions, including significantly reduced effective moisture during the 'Medieval Warm Period' (~AD 900-1270), and a generally higher effective moisture than today during the 'Little Ice Age' (~AD 1270-1850), interrupted by three episodes of prolonged aridity (~AD 1380-1420, 1560-1620, and 1760-1840) more severe than any historically recorded drought. The pattern and timing of the reconstructed Little Ice Age climate fluctuations correlate with the residual record of atmospheric radiocarbon production, suggesting that long-term variations in solar radiation may have contributed to African rainfall variability at these time scales. In agreement with other recently documented instances of a solar influence on long-term variations in hydrological balance, solar minima correlated with increases in effective moisture and solar maxima with increased aridity. It remains unclear, however, whether variation in solar radiation generated fluctuations in effective moisture mainly by means of a direct or indirect influence on rainfall, or rather through the influence of temperature variations on evaporation rates.

  6. The Archives of the Department of Terrestrial Magnetism: Documenting 100 Years of Carnegie Science

    Science.gov (United States)

    Hardy, S. J.

    2005-12-01

    The archives of the Department of Terrestrial Magnetism (DTM) of the Carnegie Institution of Washington document more than a century of geophysical and astronomical investigations. Primary source materials available for historical research include field and laboratory notebooks, equipment designs, plans for observatories and research vessels, scientists' correspondence, and thousands of expedition and instrument photographs. Yet despite its history, DTM long lacked a systematic approach to managing its documentary heritage. A preliminary records survey conducted in 2001 identified more than 1,000 linear feet of historically-valuable records languishing in dusty, poorly-accessible storerooms. Intellectual control at that time was minimal. With support from the National Historical Publications and Records Commission, the "Carnegie Legacy Project" was initiated in 2003 to preserve, organize, and facilitate access to DTM's archival records, as well as those of the Carnegie Institution's administrative headquarters and Geophysical Laboratory. Professional archivists were hired to process the 100-year backlog of records. Policies and procedures were established to ensure that all work conformed to national archival standards. Records were appraised, organized, and rehoused in acid-free containers, and finding aids were created for the project web site. Standardized descriptions of each collection were contributed to the WorldCat bibliographic database and the AIP International Catalog of Sources for History of Physics. Historic photographs and documents were digitized for online exhibitions to raise awareness of the archives among researchers and the general public. The success of the Legacy Project depended on collaboration between archivists, librarians, historians, data specialists, and scientists. This presentation will discuss key aspects (funding, staffing, preservation, access, outreach) of the Legacy Project and is aimed at personnel in observatories, research

  7. Are Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    Directory of Open Access Journals (Sweden)

    Javad Moghri

    2013-09-01

    Full Text Available Background: Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well-known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven and introduces a risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods: Three hundred nurses were randomly assigned to study and control groups. The short form of HSOPS was distributed to the control group, and a fully reversed form of it was given to the study group. Percent-positive scores and t-tests were applied for data analysis. Statistical analyses were conducted using SPSS Version 16. Results: A total of 272 nurses completed the questionnaire. In both groups, all dimensions with positively worded items had higher scores than their negatively worded counterparts. The first dimension, "organizational learning and continued improvement", which showed the only statistically significant difference, scored 16.2% lower in the study group than in the control group. In addition, six out of 18 differences at the question level were statistically significant. Conclusion: The popular and widely used HSOPS is subject to acquiescence bias. This bias may exaggerate the status of some patient safety culture composites. Balancing the number of positively and negatively worded items in each composite could mitigate the bias and provide a more valid estimate of the different elements of patient safety culture.
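The group comparison described in this abstract (percent-positive scores compared with a t-test) can be sketched as follows. The abstract does not say which t-test variant was used, so the choice of Welch's unequal-variance form and the sample data below are illustrative assumptions, not the study's actual analysis.

```python
import math

def welch_t(a, b):
    """Welch's unequal-variance t statistic and degrees of freedom,
    one plausible form of the group comparison described above."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Unbiased sample variances
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb  # squared standard error of the mean difference
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical percent-positive scores for one composite in each group
control = [78.0, 82.0, 75.0, 80.0]   # original-wording form
study = [62.0, 66.0, 60.0, 64.0]     # reversed-wording form
t_stat, dof = welch_t(control, study)
```

A positive `t_stat` here would indicate the control group scored higher on the composite, the direction of effect the study reports for most dimensions.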

  8. Underworld-GT Applied to Guangdong, a Tool to Explore the Geothermal Potential of the Crust

    Institute of Scientific and Technical Information of China (English)

    Steve Quenette; Yufei Xi; John Mansour; Louis Moresi; David Abramson

    2015-01-01

    Geothermal energy potential is usually discussed in the context of conventional or engineered systems and at the scale of an individual reservoir. Whereas exploration for conventional reservoirs has been relatively easy, with expressions of resource found close to or even at the surface, exploration for non-conventional systems relies on temperature inherently increasing with depth and on searching for favourable geological environments that maximise this increase. To utilise the information we do have, we often assimilate available exploration data with models that capture the physics of the dominant underlying processes. Here, we discuss computational modelling approaches to exploration at a regional or crust scale, with application to geothermal reservoirs within basins or systems of basins. Target reservoirs have (at least) appropriate temperature and permeability, and are at accessible depths. We discuss the software development approach that leads to effective use of the tool Underworld. We explore its role in the process of modelling, understanding computational error, and importing and exporting geological knowledge as applied to the geological system underpinning the Guangdong Province, China.

  9. 100-Year Floodplains, flood plain, Published in 2009, 1:24000 (1in=2000ft) scale, Washington County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'flood...

  10. The Hunterian Neurosurgical Laboratory: the first 100 years of neurosurgical research.

    Science.gov (United States)

    Sampath, P; Long, D M; Brem, H

    2000-01-01

    Modern neurosurgery has long had a strong laboratory foundation, and much of this tradition can be traced to the Hunterian Neurosurgical Laboratory of the Johns Hopkins Hospital. Founded with the basic goals of investigating the causes and symptoms of disease and establishing the crucial role that surgeons may play in the treatment of disease, the Hunterian laboratory has adhered to these tenets, despite the dramatic changes in neurosurgery that have occurred in the last 100 years. Named for the famous English surgeon John Hunter (1728-1793), the Hunterian laboratory was conceived by William Welch and William Halsted as a special laboratory for experimental work in surgery and pathology. In 1904, Harvey Cushing was appointed by Halsted to direct the laboratory. With the three primary goals of student education, veterinary surgery that stressed surgical techniques, and meticulous surgical and laboratory record-keeping, the laboratory was quite productive, introducing the use of physiological saline solutions, describing the anatomic features and function of the pituitary gland, and establishing the field of endocrinology. In addition, the original development of hanging drop tissue culture, fundamental investigations into cerebrospinal fluid, and countless contributions to otolaryngology by Samuel Crowe all occurred during this "crucible" period. In 1912, Cushing was succeeded by Walter Dandy, whose work on experimental hydrocephalus and cerebrospinal fluid circulation led to the development of pneumoencephalography. The early days of neurosurgery evolved with close ties to general surgery, and so did the Hunterian laboratory. After Dandy began devoting his time to clinical work, general surgeons (first Jay McLean and then, in 1922, Ferdinand Lee) became the directors of the laboratory. Between 1928 and 1942, more than 150 original articles were issued from the Hunterian laboratory; these articles described significant advances in surgery, including pioneering

  11. Evolution of iron minerals in a 100-year-old Technosol. Consequences on Zn mobility

    Energy Technology Data Exchange (ETDEWEB)

    Coussy, Samuel; Grangeon, Sylvain; Bataillard, Philippe; Khodja, Hicham; Maubec, Nicolas; Faure, Pierre; Schwartz, Christophe; Dagois, Robin (BRGM- France); (CNRS-UMR)

    2017-03-01

    The prediction of long-term trace element mobility in anthropogenic soils would be a way to anticipate land management and should help in reusing slightly contaminated materials. In the present study, iron (Fe) and zinc (Zn) status evolution was investigated in a 100-year-old Technosol. The site of investigation is an old brownfield located in the Nord-Pas-de-Calais region (France) which has not been reshaped since the beginning of the last century. The whole soil profile was sampled as a function of depth, and trace element mobility at each depth was determined by batch leaching tests. A specific focus on Fe and Zn status was carried out by bulk analyses, such as selective dissolution, X-ray diffraction (XRD) and X-ray absorption spectroscopy (XAS). Fe and Zn status in the profile samples was also studied using laterally resolved techniques such as μ-particle induced X-ray emission (μ-PIXE) and μ-Rutherford backscattering spectroscopy (μ-RBS). The results indicate that (i) Fe is mainly in Fe(III) form, except for a minor contribution of Fe(II) in the deeper samples, (ii) some Fe species inherited from the past have been weathered and secondary minerals are constituted of metal-bearing sulphates and Fe (hydr)oxides, (iii) ferrihydrite is formed during pedogenesis, (iv) 20 to 30% more Fe (hydr)oxides are present at the surface than at depth, and (v) Zn has tetrahedral coordination and is sorbed to phases of increasing crystallinity as depth increases. Zn-bearing phases identified in the present study are: complex Fe, Mn, Zn sulphides, sulphates, organic matter, and ferrihydrite. Soil formation on such material does not induce a dramatic increase of Zn solubility since efficient scavengers are concomitantly formed in the system. However, Technosols are highly heterogeneous and widely differ from one place to another. The behavior examined in this study is not generic and will depend on the type of Technosol and on the secondary minerals formed as well as on

  12. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Directory of Open Access Journals (Sweden)

    Giorgio Olmi

    2015-04-01

    Full Text Available Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers with long storage times. Fresh roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, the beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and the statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease of the defect rate.

  13. Tool for Experimenting with Concepts of Mobile Robotics as Applied to Children's Education

    Science.gov (United States)

    Jimenez Jojoa, E. M.; Bravo, E. C.; Bacca Cortes, E. B.

    2010-01-01

    This paper describes the design and implementation of a tool for experimenting with mobile robotics concepts, primarily for use by children and teenagers, or by the general public, without previous experience in robotics. This tool helps children learn about science in an approachable and interactive way, using scientific research principles in…

  14. Applying New Diabetes Teaching Tools in Health-Related Extension Programming

    Science.gov (United States)

    Grenci, Alexandra

    2010-01-01

    In response to the emerging global diabetes epidemic, health educators are searching for new and better education tools to help people make positive behavior changes to successfully prevent or manage diabetes. Conversation Maps[R] are new learner-driven education tools that have been developed to empower individuals to improve their health…

  15. pySCIs: a user-friendly Python tool for quickly applying Small Circle methods

    Science.gov (United States)

    Calvín, Pablo; José Villalaín, Juan; Casas, Antonio; Torres, Sara

    2017-04-01

    Small Circle (SC) methods are common tools in paleomagnetism for working with synfolding paleomagnetic components. These methods have a twofold applicability: on one hand, the Small Circle Intersection (SCI) method allows the local remagnetization direction to be obtained; on the other, the SCs can be used to restore the attitude of the sedimentary beds at the moment of remagnetization acquisition. The bases of the SCI method are as follows. (i) The paleomagnetic direction for each site follows a path that draws a SC under progressive untilting of the beds; this SC links the paleomagnetic directions before and after the tectonic correction. (ii) Assuming that the beds have been deformed only by tilting around the bedding strike, the remagnetization direction lies upon the small circle of each site. (iii) The acquisition of the remagnetization was simultaneous for the analyzed rocks. Therefore, the remagnetization direction must lie upon the small circle of every site, and hence all small circles must intersect at one direction, which corresponds to the remagnetization direction. In practice, the method searches for the direction in space closest to the set of SCs by means of the A/n parameter (the sum of the angular distances between one direction and each SC, normalized by the number of sites). Once the remagnetization direction is known, the paleomagnetic direction upon each SC closest to the calculated remagnetization direction, called the Best Fit Direction (BFD), can be calculated. After that, the paleodip of the bed (i.e., the dip of the bed at the moment of the remagnetization event) can be calculated for each site (the paleodip is the angle measured over the SC between the BFD and the paleomagnetic direction after the complete bedding correction), and a palinspastic reconstruction of a region can be performed. We present pySCIs, a new Python tool which allows this methodology to be applied easily. 
The program has two different modules, py
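The SCI idea in this abstract (finding the direction that minimizes the A/n parameter over a set of small circles) can be sketched with a brute-force grid search. This is an illustrative sketch, not pySCIs's actual implementation: small circles are assumed to be given as (axis, aperture) pairs, and all function names are hypothetical.

```python
import math

def to_cart(dec, inc):
    """Unit vector from declination/inclination in degrees."""
    d, i = math.radians(dec), math.radians(inc)
    return (math.cos(i) * math.cos(d), math.cos(i) * math.sin(d), math.sin(i))

def ang(u, v):
    """Angle in degrees between two unit vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def sci_direction(circles, step=1.0):
    """Grid-search the upper-hemisphere direction minimizing A/n: the mean
    angular distance to a set of small circles, each given as
    (axis unit vector, aperture in degrees). The distance from a direction
    d to a circle (a, alpha) is |angle(d, a) - alpha|."""
    best_dir, best_an = None, float('inf')
    dec = 0.0
    while dec < 360.0:
        inc = 0.0
        while inc <= 90.0:
            d = to_cart(dec, inc)
            a_n = sum(abs(ang(d, ax) - ap) for ax, ap in circles) / len(circles)
            if a_n < best_an:
                best_dir, best_an = (dec, inc), a_n
            inc += step
        dec += step
    return best_dir, best_an
```

With synthetic circles constructed to pass through a common direction, the search recovers that direction to within the grid spacing; a real implementation would use a numerical optimizer rather than a grid.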

  16. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  17. Predictive Maintenance--An Effective Money Saving Tool Being Applied in Industry Today.

    Science.gov (United States)

    Smyth, Tom

    2000-01-01

    Looks at preventive/predictive maintenance as it is used in industry. Discusses core preventive maintenance tools that must be understood to prepare students. Includes a list of websites related to the topic. (JOW)

  18. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  19. Cast Iron And Mineral Cast Applied For Machine Tool Bed - Dynamic Behavior Analysis

    OpenAIRE

    2015-01-01

    Cast iron and mineral cast are the materials most often used in the machine structural elements design (bodies, housings, machine tools beds etc.). The materials significantly differ in physical and mechanical properties. The ability to suppress vibration is one of the most important factors determining the dynamic properties of the machine and has a significant impact on the machining capabilities of a machine tool. Recent research and development trends show that there is a clear tendency t...

  20. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    Science.gov (United States)

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
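The abstract does not give the exact functional form of the proportional covariate model, so the relationship between condition-monitoring information and failure rate can be sketched in a generic proportional-hazards style. The Weibull baseline, the exponential link, and all names and parameter values below are illustrative assumptions, not the paper's fitted model.

```python
import math

def baseline_hazard(t, shape=2.0, scale=100.0):
    """Hypothetical Weibull baseline failure-rate function h0(t);
    shape > 1 gives a failure rate that rises with tool usage time."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def operation_failure_rate(t, covariate, beta=0.05, shape=2.0, scale=100.0):
    """Failure rate of a cutting tool conditioned on a condition-monitoring
    covariate z(t) (e.g., a feature extracted from vibration signals),
    in the proportional form h(t | z) = h0(t) * exp(beta * z)."""
    return baseline_hazard(t, shape, scale) * math.exp(beta * covariate)
```

The point of the proportional form is that a rising vibration-derived covariate scales the failure rate up multiplicatively, so an individual tool's assessed reliability tracks its monitored condition rather than only a population average.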

  1. 100 years of elementary particles [Beam Line, vol. 27, issue 1, Spring 1997

    Energy Technology Data Exchange (ETDEWEB)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K.H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  2. 100 years of Elementary Particles [Beam Line, vol. 27, issue 1, Spring 1997

    Science.gov (United States)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K. H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  3. 100 years of training and development research: What we know and where we should go.

    Science.gov (United States)

    Bell, Bradford S; Tannenbaum, Scott I; Ford, J Kevin; Noe, Raymond A; Kraiger, Kurt

    2017-03-01

    Training and development research has a long tradition within applied psychology dating back to the early 1900s. Over the years, not only has interest in the topic grown but there have been dramatic changes in both the science and practice of training and development. In the current article, we examine the evolution of training and development research using articles published in the Journal of Applied Psychology (JAP) as a primary lens to analyze what we have learned and to identify where future research is needed. We begin by reviewing the timeline of training and development research in JAP from 1918 to the present in order to elucidate the critical trends and advances that define each decade. These trends include the emergence of more theory-driven training research, greater consideration of the role of the trainee and training context, examination of learning that occurs outside the classroom, and understanding training's impact across different levels of analysis. We then examine in greater detail the evolution of 4 key research themes: training criteria, trainee characteristics, training design and delivery, and the training context. In each area, we describe how the focus of research has shifted over time and highlight important developments. We conclude by offering several ideas for future training and development research. (PsycINFO Database Record

  4. A Development of the Calibration Tool Applied on Analog I/O Modules for Safety-related Controller

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong-Kyun; Yun, Dong-Hwa; Lee, Myeong-Kyun; Yoo, Kwan-Woo [SOOSAN ENS Co., Seoul (Korea, Republic of)

    2016-10-15

    The purpose of this paper is to develop a calibration tool for analog input/output (I/O) modules. These modules are components of POSAFE-Q, a programmable logic controller (PLC) that has been developed for safety-related evaluation. In this paper, a performance improvement of the analog I/O modules is achieved by developing and applying the calibration tool to each channel of the analog I/O modules. With this tool, the input signal to an analog input module and the output signal from an analog output module can be made to satisfy the reference value for each sensor type and the accuracy requirement of all modules. Using RS-232 communication, the manual calibration tool was developed for analog I/O modules of the existing and up-to-date versions of the POSAFE-Q PLC. As a result of applying this tool, the converted values meet the requirements for each input sensor type and the accuracy specification of the analog I/O modules.

  5. History of right heart catheterization: 100 years of experimentation and methodology development.

    Science.gov (United States)

    Nossaman, Bobby D; Scruggs, Brittni A; Nossaman, Vaughn E; Murthy, Subramanyam N; Kadowitz, Philip J

    2010-01-01

    The development of right heart catheterization has provided the clinician the ability to diagnose patients with congenital and acquired right heart disease, and to monitor patients in the intensive care unit with significant cardiovascular illnesses. The development of bedside pulmonary artery catheterization has become a standard of care for the critically ill patient since its introduction into the intensive care unit almost 40 years ago. However, adoption of this procedure into the mainstream of clinical practice occurred without prior evaluation or demonstration of its clinical or cost-effectiveness. Moreover, current randomized, controlled trials provide little evidence in support of the clinical utility of pulmonary artery catheterization in the management of critically ill patients. Nevertheless, the right heart catheter is an important diagnostic tool to assist the clinician in the diagnosis of congenital heart disease and acquired right heart disease, and moreover, when catheter placement is proximal to the right auricle (atria), this catheter provides an important and safe route for administration of fluids, medications, and parenteral nutrition. The purpose of this manuscript is to review the development of right heart catheterization that led to the ability to conduct physiologic studies in cardiovascular dynamics in normal individuals and in patients with cardiovascular diseases, and to review current controversies of the extension of the right heart catheter, the pulmonary artery catheter.

  6. Applying Behavior-Based Robotics Concepts to Telerobotic Use of Power Tooling

    Energy Technology Data Exchange (ETDEWEB)

    Noakes, Mark W [ORNL; Hamel, Dr. William R. [University of Tennessee, Knoxville (UTK)

    2011-01-01

    While it has long been recognized that telerobotics has potential advantages to reduce operator fatigue, to permit lower skilled operators to function as if they had higher skill levels, and to protect tools and manipulators from excessive forces during operation, relatively little laboratory research in telerobotics has actually been implemented in fielded systems. Much of this has to do with the complexity of the implementation and its lack of ability to operate in complex unstructured remote systems environments. One possible solution is to approach the tooling task using an adaptation of behavior-based techniques to facilitate task decomposition to a simpler perspective and to provide sensor registration to the task target object in the field. An approach derived from behavior-based concepts has been implemented to provide automated tool operation for a teleoperated manipulator system. The generic approach is adaptable to a wide range of typical remote tools used in hot-cell and decontamination and dismantlement-type operations. Two tasks are used in this work to test the validity of the concept. First, a reciprocating saw is used to cut a pipe. The second task is bolt removal from mockup process equipment. This paper explains the technique, its implementation, and covers experimental data, analysis of results, and suggestions for implementation on fielded systems.

  7. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  8. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  9. Integrated tools and techniques applied to the TES ground data system

    Science.gov (United States)

    Morrison, B. A.

    2000-01-01

    The author of this paper will discuss the selection of CASE tools, a decision-making process, requirements tracking, and a review mechanism that leads to a highly integrated approach to software development that must deal with the constant pressure to change software requirements and design that is associated with research and development.

  10. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  11. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  12. Hamiltonian Systems and Optimal Control in Computational Anatomy: 100 Years Since D'Arcy Thompson.

    Science.gov (United States)

    Miller, Michael I; Trouvé, Alain; Younes, Laurent

    2015-01-01

    The Computational Anatomy project is the morphome-scale study of shape and form, which we model as an orbit under diffeomorphic group action. Metric comparison calculates the geodesic length of the diffeomorphic flow connecting one form to another. Geodesic connection provides a positioning system for coordinatizing the forms and positioning their associated functional information. This article reviews progress since the Euler-Lagrange characterization of the geodesics a decade ago. Geodesic positioning is posed as a series of problems in Hamiltonian control, which emphasize the key reduction from the Eulerian momentum with dimension of the flow of the group, to the parametric coordinates appropriate to the dimension of the submanifolds being positioned. The Hamiltonian viewpoint provides important extensions of the core setting to new, object-informed positioning systems. Several submanifold mapping problems are discussed as they apply to metamorphosis, multiple shape spaces, and longitudinal time series studies of growth and atrophy via shape splines.

  13. Accumulation of pharmaceuticals, Enterococcus, and resistance genes in soils irrigated with wastewater for zero to 100 years in central Mexico.

    Directory of Open Access Journals (Sweden)

    Philipp Dalkmann

    Full Text Available Irrigation with wastewater releases pharmaceuticals, pathogenic bacteria, and resistance genes, but little is known about the accumulation of these contaminants in the environment when wastewater is applied for decades. We sampled a chronosequence of soils that were variously irrigated with wastewater from zero up to 100 years in the Mezquital Valley, Mexico, and investigated the accumulation of ciprofloxacin, enrofloxacin, sulfamethoxazole, trimethoprim, clarithromycin, carbamazepine, bezafibrate, naproxen, and diclofenac, as well as the occurrence of Enterococcus spp. and sul and qnr resistance genes. Total concentrations of ciprofloxacin, sulfamethoxazole, and carbamazepine increased with irrigation duration, reaching 95% of their upper limit of 1.4 µg/kg (ciprofloxacin), 4.3 µg/kg (sulfamethoxazole), and 5.4 µg/kg (carbamazepine) in soils irrigated for 19-28 years. Accumulation was soil-type-specific, with the largest accumulation rates in Leptosols and no time-trend in Vertisols. Acidic pharmaceuticals (diclofenac, naproxen, bezafibrate) were not retained and thus did not accumulate in soils. We did not detect qnrA genes, but qnrS and qnrB genes were found in two of the irrigated soils. Relative concentrations of sul1 genes in irrigated soils were two orders of magnitude larger (3.15 × 10⁻³ ± 0.22 × 10⁻³ copies/16S rDNA) than in non-irrigated soils (4.35 × 10⁻⁵ ± 1.00 × 10⁻⁵ copies/16S rDNA), while those of sul2 still exceeded the ones in non-irrigated soils by a factor of 22 (6.61 × 10⁻⁴ ± 0.59 × 10⁻⁴ versus 2.99 × 10⁻⁵ ± 0.26 × 10⁻⁵ copies/16S rDNA). Absolute numbers of sul genes continued to increase with prolonged irrigation, together with Enterococcus spp. 23S rDNA and total 16S rDNA contents. Increasing total concentrations of antibiotics in soil are not accompanied by increasing relative abundances of resistance genes. Nevertheless, wastewater irrigation enlarges the absolute concentration of resistance genes in soils due to a

  14. Systems thinking tools as applied to community-based participatory research: a case study.

    Science.gov (United States)

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R; Weinreb-Welch, Laurie; Lengerich, Eugene J

    2012-12-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points." Systems science refers to a field of study that posits a holistic framework that is focused on component parts of a system in the context of relationships with each other and with other systems. Systems thinking tools can assist in intervention planning by allowing all CBPR stakeholders to visualize how community factors are interrelated and by potentially identifying the most salient intervention points. To demonstrate the potential utility of systems science tools in CBPR, the authors show the use of causal loop diagrams by a community coalition engaged in CBPR activities regarding youth drinking reduction and prevention.

  15. Applying quality management tools to medical photography services: a pilot project.

    Science.gov (United States)

    Murray, Peter

    2003-03-01

    The Medical Photography Department at Peterborough Hospitals NHS Trust set up a pilot project to reduce the turnaround time of fundus fluorescein angiograms to the Ophthalmology Department. Quality management tools were used to analyse current photographic practices and develop more efficient methods of service delivery. The improved service to the Ophthalmology Department demonstrates the value of quality management in developing medical photography services at Peterborough Hospitals.

  16. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    Science.gov (United States)

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched based on key words and phrases over a five-century time frame. Application of the Ngram Viewer to drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, historical epidemiology of drug use and misuse, and adoption and regulation of drug technologies.

  17. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery place ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements, one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis, are reviewed, and their execution in a combined data flow/visualization environment is outlined.

  18. Are Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    OpenAIRE

    Javad Moghri; Ali Akbari Sari; Mehdi Yousefi; Hasan Zahmatkesh; Ranjbar Mohammad Ezzatabadi; Pejman Hamouzadeh; Satar Rezaei; Jamil Sadeghifar

    2013-01-01

    Abstract Background Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well-known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven, creating a risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods Three hundred nurses were assigned into study ...

  19. Why do we live for much less than 100 years? A fluid mechanics view and approach

    Science.gov (United States)

    Messaris, Gerasimos A. T.; Hadjinicolaou, Maria; Karahalios, George T.

    2017-08-01

    Blood flow in arteries induces shear stresses on the arterial walls. The present work is motivated by the implications of low shear stress for the human arterial system and its effect on the duration of a subject's life. Low and/or bidirectional wall shear stress stiffens the arterial wall and, in synergy with the fluctuating tissue stress due to the fluctuating blood pressure, activates the mechanism of aging. If the shear stress were not low and/or bidirectional, and if it did not contribute to local endothelium dysfunction, the tissue stress alone would take more than 100 yr to cause a failure of the human arterial system. Applying the s-n diagram (tissue stress against the number of cycles to failure) to determine the fatigue life of the aorta, for example, we find that in the absence of other pathogenic factors, for a tissue stress 1.2 times larger than the tissue stress of a non-stiff aorta, the potential 100 yr of life are reduced to nearly 80 yr. Calculating the rate of variation of a subject's tissue stress with time may lead to a possible prognosis of the evolution of wall stiffness and its impact on that subject's arterial aging. Further patient-specific in vivo mechanistic studies, complemented by molecular imaging, are needed to contribute to the formation of a database from which improved models describing the evolution of arterial stiffness can be developed. Accordingly, the degree of stiffness of the aorta, compared with existing data from a corresponding database, may provide information about the degree of fatigue of the aortic wall and its possible future behavior, and lead to patient-adapted medical treatment as a form of preventive medication.
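
    The quoted reduction (a 1.2-fold tissue-stress increase cutting the potential 100 yr to about 80 yr) is consistent with a Basquin-type power-law s-n relation. The sketch below assumes a hypothetical form N = N_ref * (S/S_ref)^(-b), which is not stated in the abstract, and recovers the exponent implied by the quoted figures:

```python
import math

def basquin_exponent(n1_years, n2_years, stress_ratio):
    """Solve N2/N1 = (S2/S1)^(-b) for b (Basquin-type s-n relation)."""
    return -math.log(n2_years / n1_years) / math.log(stress_ratio)

def fatigue_life(n_ref_years, stress_ratio, b):
    """Fatigue life when the tissue stress is scaled by stress_ratio."""
    return n_ref_years * stress_ratio ** (-b)

# Exponent implied by the abstract's figures: 100 yr -> ~80 yr at 1.2x stress.
b = basquin_exponent(100.0, 80.0, 1.2)
print(round(b, 2))                              # ~1.22
print(round(fatigue_life(100.0, 1.2, b), 1))    # recovers 80.0 yr
```

    With the exponent in hand, the same relation can be evaluated at other hypothetical stress ratios to see how sensitively fatigue life depends on wall stiffening.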

  20. Applying knowledge translation tools to inform policy: the case of mental health in Lebanon.

    Science.gov (United States)

    Yehia, Farah; El Jardali, Fadi

    2015-06-06

    Many reform efforts in health systems fall short because the use of research evidence to inform policy remains scarce. In Lebanon, one in four adults suffers from a mental illness, yet access to mental healthcare services in primary healthcare (PHC) settings is limited. Using an "integrated" knowledge framework to link research to action, this study examines the process of influencing the mental health agenda in Lebanon through the application of Knowledge Translation (KT) tools and the use of a KT Platform (KTP) as an intermediary between researchers and policymakers. This study employed the following KT tools: 1) development of a policy brief to address the lack of access to mental health services in PHC centres, 2) semi-structured interviews with 10 policymakers and key informants, 3) convening of a national policy dialogue, 4) evaluation of the policy brief and dialogue, and 5) a post-dialogue survey. Findings from the key informant interviews and a comprehensive synthesis of evidence were used to develop a policy brief which defined the problem and presented three elements of a policy approach to address it. This policy brief was circulated to 24 participants prior to the dialogue to inform the discussion. The policy dialogue validated the evidence synthesized in the brief, whereby integrating mental health into PHC services was the element most supported by evidence as well as participants. The post-dialogue survey showed that, in the following 6 months, several implementation steps were taken by stakeholders, including establishing a national taskforce, training PHC staff, and updating the national essential drug list to include psychiatric medications. Relationships among policymakers, researchers, and stakeholders were strengthened as they conducted their own workshops and meetings after the dialogue to further discuss implementation, and their awareness about and demand for KT tools increased. This case study showed that the use of KT tools in Lebanon to

  1. Atomic Force Microscopy as a Tool for Applied Virology and Microbiology

    Science.gov (United States)

    Zaitsev, Boris

    2003-12-01

    The atomic force microscope (AFM) can be used for the simple and fast solution of many applied biological problems. This paper surveys results obtained with the atomic force microscope SolverP47BIO (NT-MDT, Russia) at the State Research Center of Virology and Biotechnology "Vector". The AFM has been used: - in applied virology, for counting viral particles and examining virus-cell interaction; - in microbiology, for measurement and detection of bacterial spores and cells; - in biotechnology, for control of biotechnological processes and evaluation of particle-size distributions for viral and bacterial diagnostic assays. The main advantages of AFM in applied research are the simplicity of sample preparation and the short examination time.

  2. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented for standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process, which involved more than 100 employees, with one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implementing regional and corporate procedures focused on the main operational processes. (author)

  3. Applying CRISPR-Cas9 tools to identify and characterize transcriptional enhancers.

    Science.gov (United States)

    Lopes, Rui; Korkmaz, Gozde; Agami, Reuven

    2016-09-01

    The development of the CRISPR-Cas9 system triggered a revolution in the field of genome engineering. Initially, the use of this system was focused on the study of protein-coding genes but, recently, a number of CRISPR-Cas9-based tools have been developed to study non-coding transcriptional regulatory elements. These technological advances offer unprecedented opportunities for elucidating the functions of enhancers in their endogenous context. Here, we discuss the application, current limitations and future development of CRISPR-Cas9 systems to identify and characterize enhancer elements in a high-throughput manner.

  4. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing an urban area from a sound standpoint is the evaluation of its soundscape. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended as a tool for comprehensive urban soundscape evaluation. Because of the great complexity of the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented to develop the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified).
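
    The classification step described above can be illustrated with a toy linear SVM. This is only a minimal sketch: it uses a pure-Python, Pegasos-style subgradient trainer on two invented, mean-centred soundscape descriptors, standing in for the SVM/SMO models of the study, which used real acoustical and perceptual features:

```python
import random

# Minimal linear SVM trained with Pegasos-style subgradient descent.
# The two features are hypothetical soundscape descriptors, mean-centred
# so that a separating hyperplane through the origin suffices
# (e.g. sound level minus its mean, traffic-noise ratio minus its mean).

def train_svm(data, labels, lam=0.01, epochs=100, seed=0):
    rng = random.Random(seed)
    w = [0.0] * len(data[0])
    t = 0
    idx = list(range(len(data)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)
            x, y = data[i], labels[i]
            margin = y * sum(wj * xj for wj, xj in zip(w, x))
            # Subgradient step on the regularised hinge loss.
            w = [wj * (1 - eta * lam) for wj in w]
            if margin < 1:
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Invented centred descriptors: "quiet" (-1) vs "traffic-dominated" (+1).
X = [(-0.15, -0.25), (-0.10, -0.20), (-0.20, -0.30),
     (0.15, 0.25), (0.20, 0.35), (0.10, 0.30)]
y = [-1, -1, -1, 1, 1, 1]
w = train_svm(X, y)
print(predict(w, (-0.18, -0.22)), predict(w, (0.12, 0.28)))  # -1 1
```

    A production classifier would instead use an SMO-based library implementation and many more acoustic features; the point here is only the shape of the train/predict workflow.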

  5. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording the researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation, manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or nonverbal behavior.

  6. Deriving efficient policy portfolios promoting sustainable energy systems-Case studies applying Invert simulation tool

    Energy Technology Data Exchange (ETDEWEB)

    Kranzl, Lukas; Stadler, Michael; Huber, Claus; Haas, Reinhard [Energy Economics Group, Vienna University of Technology, Gusshausstrasse 28/29/373-2A, 1040 Vienna (Austria); Ragwitz, Mario; Brakhage, Anselm [Fraunhofer Institute for Systems and Innovation Research, Breslauer Strasse 48, D-76139 Karlsruhe (Germany); Gula, Adam; Figorski, Arkadiusz [Faculty of Fuels and Energy, AGH University of Science and Technology, Al. Mickiewicza 30, PL-30-059 Krakow (Poland)

    2006-12-15

    Within recent years, energy policies have imposed a number of targets at European and national level for the rational use of energy (RUE), renewable energy sources (RES) and related CO₂ reductions. As a result, a wide variety of policy instruments is currently implemented, and hence the question arises: how can these instruments be designed so as to reach the maximum policy target with the minimum public money spent? The objective of this paper is to derive a methodology for obtaining efficient policy portfolios promoting sustainable energy systems, depending on the policy target, and to show corresponding results from case studies in Austria, Germany and Poland. The investigations were carried out with the Invert simulation tool, a computer model developed for simulating the impacts of various promotion schemes for renewable and efficient energy systems. With this tool, the CO₂ reductions and related public expenses were calculated for various policy mixes. In the building-related energy sector, it turned out that in all investigated regions support schemes for supply-side measures are the most cost-efficient instruments. However, their potential is restricted, and for achieving higher levels of CO₂ reduction, promotion of demand-side measures is indispensable. The paper shows that for a comprehensive comparison of policy portfolios there are always two dimensions to be considered: efficiency and effectiveness. The more effective a scheme, i.e. the higher its implementation rate, the more essential the efficiency criterion becomes. (author)
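
    The efficiency-versus-effectiveness trade-off described above can be sketched as a greedy portfolio builder: instruments are ranked by public expense per tonne of CO₂ abated and used up to their (limited) potential until the target is met. All instrument names and figures below are invented for illustration; they are not outputs of the Invert tool:

```python
# Greedy cost-efficiency sketch: cheap supply-side instruments are
# exhausted first; once their limited potential is used up, more
# expensive demand-side measures are needed to reach the target.

instruments = [
    # (name, abatement potential in kt CO2, public cost in EUR per tonne)
    ("supply-side support (biomass boilers)", 120, 15.0),
    ("supply-side support (solar thermal)",    80, 25.0),
    ("demand-side support (insulation)",      300, 60.0),
]

def build_portfolio(instruments, target_kt):
    plan, total_cost = [], 0.0
    remaining = target_kt
    for name, potential, cost in sorted(instruments, key=lambda i: i[2]):
        if remaining <= 0:
            break
        used = min(potential, remaining)
        plan.append((name, used))
        total_cost += used * 1000 * cost  # kt -> t
        remaining -= used
    return plan, total_cost

plan, cost = build_portfolio(instruments, target_kt=250)
for name, kt in plan:
    print(f"{name}: {kt} kt")
print(f"total public expense: {cost / 1e6:.1f} M EUR")
```

    With a 250 kt target, the two cheap supply-side schemes (200 kt combined) are not enough, so the costly demand-side instrument must cover the rest, mirroring the paper's finding that supply-side support is cost-efficient but capacity-limited.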

  7. Exploration tools for drug discovery and beyond: applying SciFinder to interdisciplinary research.

    Science.gov (United States)

    Haldeman, Margaret; Vieira, Barbara; Winer, Fred; Knutsen, Lars J S

    2005-06-01

    Chemists have long recognized the value of online databases for surveying the literature of their field. Chemical Abstracts Service (CAS) databases covering almost a century's worth of journal articles and patent documents are among the best known and widely used for searching information on compounds. Today's research presents a new challenge, however, as the boundaries of chemistry and biological sciences overlap increasingly. This trend is especially true in the drug discovery field where published findings relating to both chemical and biological entities and their interactions are examined. CAS has expanded its resources to meet the requirements of the new, interdisciplinary challenges faced by today's researchers. This is evident both in the content of CAS databases, which have been expanded to include more biology-related information, and in the technology of the search tools now available to researchers on their desktop. It is the integration of content and search-and-retrieval technology that enables new insights to be made in the vast body of accumulated information. CAS's SciFinder is a widely used research tool for this purpose.

  8. Orymold: ontology based gene expression data integration and analysis tool applied to rice

    Directory of Open Access Journals (Sweden)

    Segura Jordi

    2009-05-01

    Full Text Available Abstract Background Integration and exploration of data obtained from genome-wide monitoring technologies has become a major challenge for many bioinformaticians and biologists due to its heterogeneity and high dimensionality. A widely accepted approach to these issues has been the creation and use of controlled vocabularies (ontologies). Ontologies allow for the formalization of domain knowledge, which in turn enables generalization in the creation of querying interfaces as well as in the integration of heterogeneous data, providing both human- and machine-readable interfaces. Results We designed and implemented a software tool that allows investigators to create their own semantic model of an organism and to use it to dynamically integrate expression data obtained from DNA microarrays and other probe-based technologies. The software provides tools to use the semantic model to postulate and validate hypotheses on the spatial and temporal expression and function of genes. In order to illustrate the software's use and features, we used it to build a semantic model of rice (Oryza sativa) and integrated experimental data into it. Conclusion In this paper we describe the development and features of a flexible software application for dynamic gene expression data annotation, integration, and exploration called Orymold. Orymold is freely available for non-commercial users from http://www.oryzon.com/media/orymold.html

  9. Enabling to Apply XP Process in Distributed Development Environments with Tool Support

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansari

    2012-07-01

    Full Text Available Evaluation of the XP methodology, in both academic and industrial settings, has shown very good results when it is applied to small or medium co-located working groups. In this paper, we describe an approach that overcomes the XP constraint of collocation by introducing a process-support environment (called M.P.D.X.P) that helps software development teams and solves the problems that arise when XP is carried out by distributed teams.

  10. δ18O record and temperature change over the past 100 years in ice cores on the Tibetan Plateau

    Institute of Scientific and Technical Information of China (English)

    YAO; Tandong; GUO; Xuejun; Lonnie; Thompson; DUAN; Keqin; WANG; Ninglian; PU; Jianchen; XU; Baiqing; YANG; Xiaoxin; SUN; Weizhen

    2006-01-01

    The 213 m ice core from the Puruogangri Ice Field on the Tibetan Plateau facilitates the study of the regional temperature changes with its δ18O record of the past 100 years. Here we combine information from this core with that from the Dasuopu ice core (from the southern Tibetan Plateau), the Guliya ice core (from the northwestern Plateau) and the Dunde ice core (from the northeastern Plateau) to learn about the regional differences in temperature change across the Tibetan Plateau. The δ18O changes vary with region on the Plateau, the variations being especially large between South and North and between East and West. Moreover, these four ice cores present increasing δ18O trends, indicating warming on the Tibetan Plateau over the past 100 years. A comparative study of Northern Hemisphere (NH) temperature changes, the δ18O-reflected temperature changes on the Plateau, and available meteorological records show consistent trends in overall warming during the past 100 years.
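
    An increasing δ18O trend of the kind described above is typically quantified with an ordinary least-squares slope. The following self-contained sketch runs on synthetic data (a prescribed warming-like trend plus an oscillation), not on measurements from any of the ice cores named in the record:

```python
import math

# Ordinary least-squares trend estimation on a synthetic δ18O series:
# a prescribed +0.01 per-mil/yr trend plus a bounded oscillation.

def linear_trend(years, values):
    """Return (slope, intercept) of the least-squares line."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (v - my) for x, v in zip(years, values))
    slope = sxy / sxx
    return slope, my - slope * mx

years = list(range(1900, 2000))
d18o = [-14.0 + 0.01 * (y - 1900) + 0.2 * math.sin(y / 5.0) for y in years]

slope, _ = linear_trend(years, d18o)
print(f"trend: {slope * 100:.2f} per mil per century")
```

    The estimated slope recovers the built-in trend to within the distortion introduced by the oscillation; on real cores, significance testing and decadal smoothing would be applied before interpreting the slope as warming.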

  11. Teaching Strategies to Apply in the Use of Technological Tools in Technical Education

    Directory of Open Access Journals (Sweden)

    Olga Arranz García

    2014-09-01

    Full Text Available The emergence of new technologies in education is changing the way educational processes are organized. Teachers are not unaffected by these changes and must employ new strategies to adapt their teaching methods to the new circumstances. One of these adaptations is framed in virtual learning, where learning management systems have proved to be a very effective means within the learning process. In this paper we aim to show teachers in engineering schools how to use the different technological tools present in a virtual platform in an appropriate way. Thus, in the experimental framework we show the results of analysing two data samples obtained before and after the implementation of the European Higher Education Area, which can be extrapolated for innovative application to learning techniques.

  12. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. The study concludes that the CBRM supports the decision-making process of applying and augmenting organizational knowledge, provides a new angle from which to tackle strategic management issues within the manufacturing system of a business operation, and explores a new proposition within strategic manufacturing management.

  13. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Institute of Scientific and Technical Information of China (English)

    Yong Chen

    2007-01-01

    Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  14. 100 years of superconductivity

    CERN Multimedia

    Globe Info

    2011-01-01

    Public lecture by Philippe Lebrun, who works at CERN on applications of superconductivity and cryogenics for particle accelerators. He was head of CERN’s Accelerator Technology Department during the LHC construction period. Centre culturel Jean Monnet, route de Gex Tuesday 11 October from 8.30 p.m. to 10.00 p.m. » Suitable for all – Admission free - Lecture in French » Number of places limited For further information: +33 (0)4 50 42 29 37

  15. 100 years of radar

    CERN Document Server

    Galati, Gaspare

    2016-01-01

    This book offers fascinating insights into the key technical and scientific developments in the history of radar, from the first patent, taken out by Hülsmeyer in 1904, through to the present day. Landmark events are highlighted and fascinating insights provided into the exceptional people who made possible the progress in the field, including the scientists and technologists who worked independently and under strict secrecy in various countries across the world in the 1930s and the big businessmen who played an important role after World War II. The book encourages multiple levels of reading. The author is a leading radar researcher who is ideally placed to offer a technical/scientific perspective as well as a historical one. He has taken care to structure and write the book in such a way as to appeal to both non-specialists and experts. The book is not sponsored by any company or body, either formally or informally, and is therefore entirely unbiased. The text is enriched by approximately three hundred ima...

  16. Neutron tomography of particulate filters: a non-destructive investigation tool for applied and industrial research

    Energy Technology Data Exchange (ETDEWEB)

    Toops, Todd J., E-mail: toopstj@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Bilheux, Hassina Z.; Voisin, Sophie [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Gregor, Jens [University of Tennessee, Knoxville, TN (United States); Walker, Lakeisha; Strzelec, Andrea; Finney, Charles E.A.; Pihl, Josh A. [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2013-11-21

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables neutron-based imaging is the ability of some elements to absorb or scatter neutrons while other elements allow neutrons to pass through with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are weak, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). This effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.

  17. Applying an ethical decision-making tool to a nurse management dilemma.

    Science.gov (United States)

    Toren, Orly; Wagner, Nurith

    2010-05-01

    This article considers ethical dilemmas that nurse managers may confront and suggests an ethical decision-making model that could be used as a tool for resolving such dilemmas. The focus of the article is on the question: Can nurse managers choose the ethically right solution in conflicting situations when nurses' rights collide with patients' rights to quality care in a world of cost-effective and economic constraint? Managers' responsibility is to ensure and facilitate a safe and ethical working environment in which nurses are able to give quality care to their patients. In nursing it is frequently declared that managers' main obligations are to patients' needs and their rights to receive quality care. However, managers' ethical responsibilities are not only to patients but also to the nurses working in their institution. This article describes a real (but disguised) situation from an Israeli health care context to illustrate the dilemmas that may arise. The question is posed of whether nurse managers can maintain patients' and nurses' rights and, at the same time, fulfill their obligation to the conflicting demands of the organization. The article also offers a way to solve conflict by using an ethical decision-making model.

  18. Applying CBR to machine tool product configuration design oriented to customer requirements

    Science.gov (United States)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.
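
    The grey correlation (grey relational) analysis used above to evaluate similar cases can be sketched as follows: each candidate case is scored against an ideal reference case, and the case with the highest grey relational grade is selected. The machine-tool attributes and values below are invented for illustration:

```python
# Grey relational analysis (GRA): score candidate cases against an
# ideal reference case; a higher grade means a closer overall match.

def grey_relational_grades(reference, cases, rho=0.5):
    """Return one grey relational grade per case (higher = more similar)."""
    # Normalise each attribute to [0, 1] across the reference and all cases.
    cols = list(zip(reference, *cases))
    lo = [min(c) for c in cols]
    span = [max(c) - min(c) or 1.0 for c in cols]
    norm = lambda row: [(v - l) / s for v, l, s in zip(row, lo, span)]
    ref = norm(reference)
    deltas = [[abs(r - v) for r, v in zip(ref, norm(case))] for case in cases]
    dmin = min(min(d) for d in deltas)
    dmax = max(max(d) for d in deltas)
    # Grey relational coefficient per attribute, averaged into a grade.
    grade = lambda d: sum((dmin + rho * dmax) / (di + rho * dmax)
                          for di in d) / len(d)
    return [grade(d) for d in deltas]

# Hypothetical attributes: (spindle power kW, max speed krpm, accuracy score).
reference = (15.0, 12.0, 0.95)        # requirements-derived ideal case
cases = [(14.0, 11.5, 0.90),          # close match
         (7.5, 6.0, 0.70)]            # poor match
g = grey_relational_grades(reference, cases)
print(g[0] > g[1])  # the closer case gets the higher grade
```

    In the paper's workflow this scoring would be applied to the shortlist retrieved by the self-organizing map, with the distinguishing coefficient rho (0.5 here, a conventional default) controlling how sharply the grades separate.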

  20. The potential of social entrepreneurship: conceptual tools for applying citizenship theory to policy and practice.

    Science.gov (United States)

    Caldwell, Kate; Harris, Sarah Parker; Renko, Maija

    2012-12-01

    Contemporary policy encourages self-employment and entrepreneurship as a vehicle for empowerment and self-sufficiency among people with disabilities. However, such encouragement raises important citizenship questions concerning the participation of people with intellectual and developmental disabilities (IDD). As an innovative strategy for addressing pressing social and economic problems, "social entrepreneurship" has become a phrase that is gaining momentum in the IDD community--one that carries with it a very distinct history. Although social entrepreneurship holds the potential to be an empowering source of job creation and social innovation, it also has the potential to be used to further disenfranchise this marginalized population. It is crucial that in moving forward society takes care not to perpetuate existing models of oppression, particularly in regard to the social and economic participation of people with IDD. The conceptual tools addressed in this article can inform the way that researchers, policymakers, and practitioners approach complex issues, such as social entrepreneurship, to improve communication among disciplines while retaining an integral focus on rights and social justice by framing this issue within citizenship theory.

  1. Applied Circular Dichroism: A Facile Spectroscopic Tool for Configurational Assignment and Determination of Enantiopurity

    Directory of Open Access Journals (Sweden)

    Macduff O. Okuom

    2015-01-01

Full Text Available In order to determine if electronic circular dichroism (ECD) is a good tool for the qualitative evaluation of absolute configuration and enantiopurity in the absence of chiral high performance liquid chromatography (HPLC), ECD studies were performed on several prescription and over-the-counter drugs. Cotton effects (CE) were observed for both S and R isomers between 200 and 300 nm. For the drugs examined in this study, the S isomers showed a negative CE, while the R isomers displayed a positive CE. The ECD spectra of both enantiomers were nearly mirror images, with the amplitude proportional to the enantiopurity. Plotting the differential extinction coefficient (Δε) versus enantiopurity at the wavelength of maximum amplitude yielded linear standard curves with coefficients of determination (R2) greater than 97% for both isomers in all cases. As expected, Equate, Advil, and Motrin, each containing a racemic mixture of ibuprofen, yielded no chiroptical signal. ECD spectra of Suphedrine and Sudafed revealed that each of them is rich in 1S,2S-pseudoephedrine, while analysis of the Equate vapor inhaler showed it to be rich in R-methamphetamine.
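The linear standard curve described above is ordinary least-squares fitting of Δε against enantiopurity. A minimal sketch of that fit and its coefficient of determination (the function name and toy values are illustrative, not the paper's data):

```python
import numpy as np

def calibration(ee, d_eps):
    """Least-squares line through (enantiopurity, delta-epsilon) points.

    Returns slope, intercept and the coefficient of determination R^2."""
    ee = np.asarray(ee, dtype=float)
    d_eps = np.asarray(d_eps, dtype=float)
    slope, intercept = np.polyfit(ee, d_eps, 1)
    pred = slope * ee + intercept
    ss_res = ((d_eps - pred) ** 2).sum()        # residual sum of squares
    ss_tot = ((d_eps - d_eps.mean()) ** 2).sum()
    return slope, intercept, 1 - ss_res / ss_tot
```

An unknown sample's enantiopurity can then be read off the fitted line from its measured Δε.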

  2. A practical guide to applying lean tools and management principles to health care improvement projects.

    Science.gov (United States)

    Simon, Ross W; Canacari, Elena G

    2012-01-01

    Manufacturing organizations have used Lean management principles for years to help eliminate waste, streamline processes, and cut costs. This pragmatic approach to structured problem solving can be applied to health care process improvement projects. Health care leaders can use a step-by-step approach to document processes and then identify problems and opportunities for improvement using a value stream process map. Leaders can help a team identify problems and root causes and consider additional problems associated with methods, materials, manpower, machinery, and the environment by using a cause-and-effect diagram. The team then can organize the problems identified into logical groups and prioritize the groups by impact and difficulty. Leaders must manage action items carefully to instill a sense of accountability in those tasked to complete the work. Finally, the team leaders must ensure that a plan is in place to hold the gains.

  3. Can the FAST and ROSIER adult stroke recognition tools be applied to confirmed childhood arterial ischemic stroke?

    Directory of Open Access Journals (Sweden)

    Babl Franz E

    2011-10-01

Full Text Available Abstract Background Stroke recognition tools have been shown to improve diagnostic accuracy in adults. Development of a similar tool in children is needed to reduce lag time to diagnosis. A critical first step is to determine whether adult stroke scales can be applied in childhood stroke. Our objective was to assess the applicability of adult stroke scales in childhood arterial ischemic stroke (AIS). Methods Children aged 1 month to … Results 47 children with AIS were identified; 34 had anterior, 12 had posterior, and 1 child had both anterior and posterior circulation infarcts. Median age was 9 years and 51% were male. Median time from symptom onset to ED presentation was 21 hours, but one third of children presented within 6 hours. The most common presenting stroke symptoms were arm weakness (63%), face weakness (62%), leg weakness (57%), speech disturbance (46%) and headache (46%). The most common signs were arm (61%), face (70%) or leg weakness (57%) and dysarthria (34%). 36 (78%) of children had at least one positive variable on FAST and 38 (81%) had a positive score of ≥1 on the ROSIER scale. Positive scores were less likely in children with posterior circulation stroke. Conclusion The presenting features of pediatric stroke appear similar to adult strokes. Two adult stroke recognition tools have fair to good sensitivity in radiologically confirmed childhood AIS but require further development and modification. Specificity of the tools also needs to be determined in a prospective cohort of children with stroke and non-stroke brain attacks.

  4. Ecoinformatics for integrated pest management: expanding the applied insect ecologist's tool-kit.

    Science.gov (United States)

    Rosenheim, Jay A; Parsa, Soroush; Forbes, Andrew A; Krimmel, William A; Law, Yao Hua; Segoli, Michal; Segoli, Moran; Sivakoff, Frances S; Zaviezo, Tania; Gross, Kevin

    2011-04-01

    Experimentation has been the cornerstone of much of integrated pest management (IPM) research. Here, we aim to open a discussion on the possible merits of expanding the use of observational studies, and in particular the use of data from farmers or private pest management consultants in "ecoinformatics" studies, as tools that might complement traditional, experimental research. The manifold advantages of experimentation are widely appreciated: experiments provide definitive inferences regarding causal relationships between key variables, can produce uniform and high-quality data sets, and are highly flexible in the treatments that can be evaluated. Perhaps less widely considered, however, are the possible disadvantages of experimental research. Using the yield-impact study to focus the discussion, we address some reasons why observational or ecoinformatics approaches might be attractive as complements to experimentation. A survey of the literature suggests that many contemporary yield-impact studies lack sufficient statistical power to resolve the small, but economically important, effects on crop yield that shape pest management decision-making by farmers. Ecoinformatics-based data sets can be substantially larger than experimental data sets and therefore hold out the promise of enhanced power. Ecoinformatics approaches also address problems at the spatial and temporal scales at which farming is conducted, can achieve higher levels of "external validity," and can allow researchers to efficiently screen many variables during the initial, exploratory phases of research projects. Experimental, observational, and ecoinformatics-based approaches may, if used together, provide more efficient solutions to problems in pest management than can any single approach, used in isolation.

  5. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Science.gov (United States)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapotranspiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  6. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
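Pynsim's core ideas, autonomous nodes and institutions that each act per time step plus engines that operate over the whole network, can be sketched in a few lines. This is a hypothetical mini-framework echoing those concepts, not the real Pynsim API; class names and behaviours are illustrative only:

```python
# Illustrative mini multi-agent framework (not the actual Pynsim API).
class Node:
    def __init__(self, name):
        self.name = name
        self.storage = 0.0

    def setup(self, t):
        # Autonomous per-timestep behaviour of this agent, e.g. inflow.
        self.storage += 1.0

class Institution:
    """A grouping of nodes (or other institutions) with its own logic."""
    def __init__(self, name, nodes):
        self.name, self.nodes = name, nodes

    def setup(self, t):
        pass  # e.g. an allocation policy applied over member nodes

class Engine:
    """A sub-model acting over the network each timestep."""
    def run(self, network, t):
        for n in network.nodes:
            n.storage *= 0.9  # e.g. system-wide losses

class Network:
    def __init__(self, nodes, institutions, engines):
        self.nodes, self.institutions, self.engines = nodes, institutions, engines

    def simulate(self, timesteps):
        for t in range(timesteps):
            for agent in self.nodes + self.institutions:
                agent.setup(t)        # each agent executes independently
            for engine in self.engines:
                engine.run(self, t)   # network-wide decisions
```

A driver would construct the network from Hydra-managed data and call `simulate` over the planning horizon.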

  7. Prediction of permafrost distribution on the Qinghai-Tibet Plateau in the next 50 and 100 years

    Institute of Scientific and Technical Information of China (English)

    NAN Zhuotong; LI Shuxun; CHENG Guodong

    2005-01-01

The Intergovernmental Panel on Climate Change (IPCC) reported in 2001 that the Earth's air temperature would rise by 1.4-5.8℃, with an average of 2.5℃, by the year 2100. Chinese regional climate model results also showed that the air temperature on the Qinghai-Tibet Plateau (QTP) would increase by 2.2-2.6℃ in the next 50 years. A numerical permafrost model was used to predict the changes of permafrost distribution on the QTP over the next 50 and 100 years under two climatic warming scenarios, i.e. 0.02℃/a, the lower value of IPCC's estimation, and 0.052℃/a, the higher value predicted by Qin et al. Simulation results show that (i) in the case of a 0.02℃/a air-temperature rise, the permafrost area on the QTP will shrink by about 8.8% in the next 50 years, and high-temperature permafrost with a mean annual ground temperature (MAGT) higher than -0.11℃ may turn into seasonally frozen soils. In the next 100 years, permafrost with MAGT higher than -0.5℃ will disappear and the permafrost area will shrink by up to 13.4%. (ii) In the case of a 0.052℃/a air-temperature rise, the permafrost area on the QTP will be reduced by about 13.5% after 50 years. More remarkable degradation will take place after 100 years, when the permafrost area will be reduced by about 46%. Permafrost with MAGT higher than -2℃ will turn into seasonally frozen soils and even unfrozen soils.
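The two warming scenarios translate into simple linear temperature increments over each horizon; a quick arithmetic check (this is back-of-the-envelope bookkeeping, not the numerical permafrost model used in the study):

```python
# Linear warming implied by each scenario: delta_T = rate * years.
rates = (0.02, 0.052)   # degrees C per year: IPCC lower bound; Qin et al. upper
horizons = (50, 100)    # years

warming = {(r, h): r * h for r in rates for h in horizons}
for (r, h), dT in sorted(warming.items()):
    print(f"{r} C/a over {h:>3} years -> +{dT:.1f} C")
```

So the scenarios span roughly +1℃ to +5.2℃ of cumulative warming, which frames the 8.8% to 46% range of simulated permafrost shrinkage.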

  8. Snooker: a structure-based pharmacophore generation tool applied to class A GPCRs.

    Science.gov (United States)

    Sanders, Marijn P A; Verhoeven, Stefan; de Graaf, Chris; Roumen, Luc; Vroling, Bas; Nabuurs, Sander B; de Vlieg, Jacob; Klomp, Jan P G

    2011-09-26

    G-protein coupled receptors (GPCRs) are important drug targets for various diseases and of major interest to pharmaceutical companies. The function of individual members of this protein family can be modulated by the binding of small molecules at the extracellular side of the structurally conserved transmembrane (TM) domain. Here, we present Snooker, a structure-based approach to generate pharmacophore hypotheses for compounds binding to this extracellular side of the TM domain. Snooker does not require knowledge of ligands, is therefore suitable for apo-proteins, and can be applied to all receptors of the GPCR protein family. The method comprises the construction of a homology model of the TM domains and prioritization of residues on the probability of being ligand binding. Subsequently, protein properties are converted to ligand space, and pharmacophore features are generated at positions where protein ligand interactions are likely. Using this semiautomated knowledge-driven bioinformatics approach we have created pharmacophore hypotheses for 15 different GPCRs from several different subfamilies. For the beta-2-adrenergic receptor we show that ligand poses predicted by Snooker pharmacophore hypotheses reproduce literature supported binding modes for ∼75% of compounds fulfilling pharmacophore constraints. All 15 pharmacophore hypotheses represent interactions with essential residues for ligand binding as observed in mutagenesis experiments and compound selections based on these hypotheses are shown to be target specific. For 8 out of 15 targets enrichment factors above 10-fold are observed in the top 0.5% ranked compounds in a virtual screen. Additionally, prospectively predicted ligand binding poses in the human dopamine D3 receptor based on Snooker pharmacophores were ranked among the best models in the community wide GPCR dock 2010.
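The enrichment factors reported for the virtual screen (above 10-fold in the top 0.5% of ranked compounds) follow the standard definition: the active rate in the top fraction divided by the active rate in the whole library. A minimal sketch (function name and toy data are illustrative):

```python
def enrichment_factor(ranked_labels, fraction=0.005):
    """Enrichment factor at a given fraction of a ranked screen.

    ranked_labels: 1 for active, 0 for inactive, ordered by screening score
    (best first). Returns (hit rate in top fraction) / (overall hit rate)."""
    n_top = max(1, round(len(ranked_labels) * fraction))
    top = ranked_labels[:n_top]
    overall_rate = sum(ranked_labels) / len(ranked_labels)
    return (sum(top) / n_top) / overall_rate
```

With 10 actives in a 1000-compound library all ranked first, the top 0.5% contains only actives and the enrichment factor is 100.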

  9. Applying Multiple Data Collection Tools to Quantify Human Papillomavirus Vaccine Communication on Twitter.

    Science.gov (United States)

    Massey, Philip M; Leader, Amy; Yom-Tov, Elad; Budenz, Alexandra; Fisher, Kara; Klassen, Ann C

    2016-12-05

,425/75,393, 27.09% and 6477/25,110, 25.79%, respectively), compared with only 11.5% of negative tweets (5647/48,940). Twitter, as a means to communicate important health information, is a growing area of research in public health. Understanding the content and implications of conversations that form around HPV vaccination on social media can aid health organizations and health-focused Twitter users in creating a meaningful exchange of ideas and in having a significant impact on vaccine uptake. This area of research is inherently interdisciplinary, and this study supports this movement by applying public health, health communication, and data science approaches to extend methodologies across fields.

  10. How phenobarbital revolutionized epilepsy therapy: the story of phenobarbital therapy in epilepsy in the last 100 years.

    Science.gov (United States)

    Yasiry, Zeid; Shorvon, Simon D

    2012-12-01

Phenobarbital (phenobarbitone) was first used as an antiepileptic drug 100 years ago, in 1912. This article tells the story of the discovery of its antiepileptic action, its early development, and the subsequent course of its clinical use over the 100-year period. The side effects, pharmacokinetics, and misuse of barbiturates are considered, along with the more recent clinical trials and the drug's current clinical utilization. The introduction of controlled drug regulations, the comparative cost of phenobarbital, and its inclusion on the World Health Organization (WHO) essential drug list are discussed. It is one of the few drugs on the formulary in 1912 that is still listed today, and remarkably its efficacy in epilepsy has not been significantly bettered. The current recommendation by the WHO is that phenobarbital should be offered as the first option for therapy for convulsive epilepsy in adults and children if availability can be ensured. This is rated as a strong recommendation because of the proven efficacy and low cost of phenobarbital, and despite its perceived side-effect profile and the practical problems of access. Whether this recommendation puts "a hierarchy on the brain," as has been suggested, is arguable. Much still needs to be learned about the drug's effects, and the issues raised by phenobarbital have lessons for all antiepileptic drug therapy.

  11. Development of a CSP plant energy yield calculation tool applying predictive models to analyze plant performance sensitivities

    Science.gov (United States)

    Haack, Lukas; Peniche, Ricardo; Sommer, Lutz; Kather, Alfons

    2017-06-01

At early project stages, the main CSP plant design parameters such as turbine capacity, solar field size, and thermal storage capacity are varied during the techno-economic optimization to determine the most suitable plant configurations. In general, a typical meteorological year with at least hourly time resolution is used to analyze each plant configuration. Different software tools are available to simulate the annual energy yield. Software tools offering a thermodynamic modeling approach of the power block and the CSP thermal cycle, such as EBSILONProfessional®, allow a flexible definition of plant topologies. In EBSILON, the thermodynamic equilibrium for each time step is calculated iteratively (quasi-steady state), which requires approximately 45 minutes to process one year with hourly time resolution. For better representation of gradients, a 10-min time resolution is recommended, which increases processing time by a factor of 5. Therefore, for analyzing the large number of plant sensitivities required during the techno-economic optimization procedure, the detailed thermodynamic simulation approach becomes impracticable. Suntrace has developed an in-house CSP simulation tool (CSPsim), based on EBSILON and applying predictive models, to approximate the CSP plant performance for central receiver and parabolic trough technology. CSPsim increases the speed of energy yield calculations by a factor of ≥35 and has automated the simulation run of all predefined design configurations in sequential order during the optimization procedure. To develop the predictive models, multiple linear regression techniques and Design of Experiment methods are applied. The annual energy yield and derived LCOE calculated by the predictive model deviate by less than ±1.5% from the thermodynamic simulation in EBSILON and effectively identify the optimal range of main design parameters for further, more specific analysis.
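The predictive models above are built with multiple linear regression over designed experiments. A minimal ordinary-least-squares surrogate sketch (NumPy only; the parameter names and toy data are illustrative, not CSPsim's actual regressors):

```python
import numpy as np

def fit_surrogate(X, y):
    """Fit an OLS surrogate: annual yield ~ intercept + design parameters.

    X: (n_samples, n_params) matrix of design-point settings
       (e.g. turbine capacity, solar field size, storage hours).
    y: annual energy yields from the detailed thermodynamic runs."""
    A = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    """Evaluate the fitted surrogate at new design points."""
    A = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    return A @ coef
```

Once fitted on a handful of detailed simulations, the surrogate evaluates thousands of candidate configurations at negligible cost, which is what makes the sequential optimization sweep tractable.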

  12. Effects of 100 years wastewater irrigation on resistance genes, class 1 integrons and IncP-1 plasmids in Mexican soil

    Directory of Open Access Journals (Sweden)

    Sven eJechalke

    2015-03-01

Full Text Available Long-term irrigation with untreated wastewater can lead to an accumulation of antibiotic substances and antibiotic resistance genes in soil. However, little is known so far about the effects of wastewater applied for decades on the abundance of IncP-1 plasmids and class 1 integrons, which may contribute to the accumulation and spread of resistance genes in the environment, or about their correlation with heavy metal concentrations. Therefore, a chronosequence of soils that were irrigated with wastewater from zero to 100 years was sampled in the Mezquital Valley in Mexico in the dry season. The total community DNA was extracted and the absolute and relative abundance (relative to 16S rRNA genes) of antibiotic resistance genes (tet(W), tet(Q), aadA), class 1 integrons (intI1), quaternary ammonium compound resistance genes (qacE+qacEΔ1) and IncP-1 plasmids (korB) were quantified by real-time PCR. Except for intI1 and qacE+qacEΔ1, the abundances of the selected genes were below the detection limit in non-irrigated soil. Confirming the results of a previous study, the absolute abundance of 16S rRNA genes in the samples increased significantly over time (linear regression model, p < 0.05), suggesting an increase in bacterial biomass due to repeated irrigation with wastewater. Correspondingly, all tested antibiotic resistance genes as well as intI1 and korB increased significantly in abundance over the period of 100 years of irrigation. In parallel, concentrations of the heavy metals Zn, Cu, Pb, Ni, and Cr increased significantly. However, no significant positive correlations were observed between the relative abundance of the selected genes and years of irrigation, indicating no enrichment in the soil bacterial community due to repeated wastewater irrigation or due to a potential co-selection by increasing concentrations of heavy metals.

  13. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope as well as scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase when system specifications are limited, but the available research budget needs to be established and defined. Due to their specificity, for vehicles such as reusable launchers with a manned capability, a lack of historical data implies that using either the classic heuristic approach such as parametric cost estimation based on underlying CERs, or the analogy approach, is therefore, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. 
Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation

  14. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    Energy Technology Data Exchange (ETDEWEB)

    Gustafsson, David; Jansson, Per-Erik [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Land and Water Resources Engineering; Gaerdenaes, Annemieke [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Soil Sciences; Eckersten, Henrik [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Crop Production Ecology

    2006-12-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem.

  15. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    NARCIS (Netherlands)

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline to the guide Within our training programmes on local management of agrobiodiversity, participatory crop improvement and the support of local seed supply participatory tools get ample attention. Tools are dealt with theoretically, are practised in class situations, but are also applied in

  17. A centennial to celebrate : energy and minerals science and technology 100 years of excellence : improving the quality of life of Canadians through natural resources

    Energy Technology Data Exchange (ETDEWEB)

    Udd, J.; Reeve, D.

    2007-07-01

    The year 2007 marked the 100th anniversary of Natural Resources Canada's (NRCan) contribution to science and technology excellence in energy and minerals. This publication discussed the 100 years of excellence of the energy and minerals science and technology sector. It discussed the history of Natural Resources Canada, with reference to the early years; first fuel testing efforts; first World War; the 1920s and 1930s; second World War; post-war years; the 1970s and 1980s; and the 1990s to the present. The publication discussed the creation of the Canada Centre for Mineral and Energy Technology (CANMET) as well as some current NRCan science and technology activities, such as alternative energy programs; energy efficiency for buildings, industries and communities; clean coal; oil sands tailings and water management; community energy systems; renewable energy efficient technology projects (RET) such as RETscreen; hybrid scoop; the anti-vibration rock drill handle; mine waste management; and green mines-green energy. Other NRCan science and technology programs that were presented in the publication included materials technology laboratory relocation; corrosion management tools for the oil and gas pipeline industry; lightweight magnesium engine cradle; mine environment neutral drainage program; metallurgical processing; counter-terrorism; and clean energy. figs.

  18. CoSMoS Southern California v3.0 Phase 1 (100-year storm) storm hazard projections

    Science.gov (United States)

    Barnard, Patrick; Erikson, Li; Foxgrover, Amy; O'Neill, Andrea; Herdman, Liv

    2016-01-01

    The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (100s of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios. CoSMoS v3.0 for Southern California shows projections for future climate scenarios (sea-level rise and storms) to provide emergency responders and coastal planners with critical storm-hazards information that can be used to increase public safety, mitigate physical damages, and more effectively manage and allocate resources within complex coastal settings. Phase I data for Southern California include flood-hazard information for the coast from the Mexican Border to Pt. Conception for a 100-year storm scenario and sea-level rise 0 - 2 m. Changes from previous data releases may be reflected in some areas. Data are complete for the information presented but are considered preliminary; changes may be reflected in the full data release (Phase II) in summer 2016.

  19. Applying decision trial and evaluation laboratory as a decision tool for effective safety management systems in aviation transport

    Directory of Open Access Journals (Sweden)

    Ifeanyichukwu Ebubechukwu Onyegiri

    2016-10-01

    Full Text Available In recent years, in the aviation industry, the weak engineering controls and lapses associated with safety management systems (SMSs are responsible for the seemingly unprecedented disasters. A previous study has confirmed the difficulties experienced by safety managers with SMSs and the need to direct research to this area of investigation for more insights and progress in the evaluation and maintenance of SMSs in the aviation industry. The purpose of this work is to examine the application of Decision Trial and Evaluation Laboratory (DEMATEL to the aviation industry in developing countries with illustration using the Nigerian aviation survey data for the validation of the method. The advantage of the procedure over other decision making methods is in its ability to apply feedback in its decision making. It also affords us the opportunity of breaking down the complex aviation SMS components and elements which are multi-variate in nature through the analysis of the contributions of the diverse system criteria from the perspective of cause and effects, which in turn yields easier and yet more effective aviation transportation accident pre-corrective actions. In this work, six revised components of an SMS were identified and DEMATEL was applied to obtain their direct and indirect impacts and influences on the overall SMS performance. Data collection was by the survey questionnaire, which served as the initial direct-relation matrix, coded in Matlab software for establishing the impact relation map (IRM. The IRM was then plotted in MS Excel spread-sheet software. From our results, safety structure and regulation has the highest impact level on an SMS with a corresponding positive relation level value. In conclusion, the results agree with those of previous researchers that used grey relational analysis. Thus, DEMATEL serves as a great tool and resource for the safety manager.
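The DEMATEL procedure sketched in the abstract is compact matrix algebra: normalize the survey-derived direct-relation matrix, compute the total-relation matrix T = D(I - D)^(-1), and derive cause/effect indices from its row and column sums. A minimal sketch (normalizing by the largest row sum is one common convention; the example matrix is illustrative, not the Nigerian survey data):

```python
import numpy as np

def dematel(A):
    """DEMATEL analysis of a direct-relation matrix A.

    Returns (T, prominence, relation) where T is the total-relation
    matrix, prominence = r + c ranks overall importance, and
    relation = r - c separates cause (positive) from effect (negative)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # Normalize by the largest row sum so the Neumann series converges.
    D = A / A.sum(axis=1).max()
    T = D @ np.linalg.inv(np.eye(n) - D)  # total-relation matrix D(I-D)^-1
    r = T.sum(axis=1)                     # influence each factor exerts
    c = T.sum(axis=0)                     # influence each factor receives
    return T, r + c, r - c
```

Plotting prominence against relation yields the impact relation map (IRM) described in the study; factors with the highest prominence and a positive relation, such as safety structure and regulation, emerge as root causes.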

  20. The protection of Canfranc International Railway Station against natural risks. Analysis and evaluation of its effectiveness 100 years later.

    Science.gov (United States)

    Fabregas, S.; Hurtado, R.; Mintegui, J.

    2012-04-01

    In the late 19th and early 20th centuries, the international railway station of Canfranc, "Los Arañones", was built in the Central Pyrenees of Huesca, Spain, on the border between France and Spain. Soon after construction of the huge station (250 m long) began, it was found that natural hazards such as flash floods, landslides, falling blocks and avalanches affected it and compromised the safety of users and infrastructure. Hydrological restoration works were quickly carried out in the gorge basins of "Los Arañones" to reduce the combined residual risks. Longitudinal and transversal dams were built against floods, and a large reforestation effort protected against falling blocks, erosion and flooding; against avalanches, stone walls were built, as well as benches of grit, snow rakes, and "empty dams", experimental structures created to dissipate the energy of the avalanche in the track zone, which do not exist anywhere else in the world. All the works were carried out mainly by hand, with materials such as stone, cement and iron. Over 2,500,000 holes were dug for planting more than 15 different species of trees, and more than 400,000 tons of stone were moved to build more than 12 different kinds of control measures. It is essential to emphasize the empirical nature of these works and Canfranc's function as a "laboratory or field-test site", with most of its structures still effective 100 years after their construction. The works accounted for about 30% of the total cost of the station in the early 20th century; obtaining "equivalent protection" with current technology would require an investment of around 100 million euros today. It is also necessary to validate the current effectiveness of these works, their maintenance, and the protective role of the forest.

  1. 100-Year Floodplains, NC Floodplain Mapping Program data, Published in 2007, 1:12000 (1in=1000ft) scale, Iredell County GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from LIDAR information as of 2007. It is described as 'NC...

  2. 100-Year Floodplains, FEMA Floodway and Flood Boundary Maps, Published in 2005, 1:24000 (1in=2000ft) scale, Lafayette County Land Records.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2005. It is described as 'FEMA...

  3. 100-Year Floodplains, FEMA Flood Zones, Published in 2010, 1:2400 (1in=200ft) scale, Effingham County Board Of Commissioners.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale, was produced all or in part from Published Reports/Deeds information as of 2010. It is...

  4. 100-Year Floodplains, Flood plains from FEMA, Published in 2003, 1:600 (1in=50ft) scale, Town of Cary NC.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from LIDAR information as of 2003. It is described as 'Flood...

  5. 100-Year Floodplains, FEMA DFIRM preliminary map out now, to be published in 2009, Published in 2009, 1:12000 (1in=1000ft) scale, Brown County, WI.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'FEMA...

  6. 100-Year Floodplains, Data provided by FEMA and WI DNR, Published in 2009, 1:2400 (1in=200ft) scale, Dane County Land Information Office.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale as of 2009. It is described as 'Data provided by FEMA and WI DNR'. Data by this publisher...

  7. The story of the Hawaiian Volcano Observatory -- A remarkable first 100 years of tracking eruptions and earthquakes

    Science.gov (United States)

    Babb, Janet L.; Kauahikaua, James P.; Tilling, Robert I.

    2011-01-01

    The year 2012 marks the centennial of the Hawaiian Volcano Observatory (HVO). With the support and cooperation of visionaries, financiers, scientists, and other individuals and organizations, HVO has successfully achieved 100 years of continuous monitoring of Hawaiian volcanoes. As we celebrate this milestone anniversary, we express our sincere mahalo—thanks—to the people who have contributed to and participated in HVO’s mission during this past century. First and foremost, we owe a debt of gratitude to the late Thomas A. Jaggar, Jr., the geologist whose vision and efforts led to the founding of HVO. We also acknowledge the pioneering contributions of the late Frank A. Perret, who began the continuous monitoring of Kīlauea in 1911, setting the stage for Jaggar, who took over the work in 1912. Initial support for HVO was provided by the Massachusetts Institute of Technology (MIT) and the Carnegie Geophysical Laboratory, which financed the initial cache of volcano monitoring instruments and Perret’s work in 1911. The Hawaiian Volcano Research Association, a group of Honolulu businessmen organized by Lorrin A. Thurston, also provided essential funding for HVO’s daily operations starting in mid-1912 and continuing for several decades. Since HVO’s beginning, the University of Hawaiʻi (UH), called the College of Hawaii until 1920, has been an advocate of HVO’s scientific studies. We have benefited from collaborations with UH scientists at both the Hilo and Mānoa campuses and look forward to future cooperative efforts to better understand how Hawaiian volcanoes work. The U.S. Geological Survey (USGS) has operated HVO continuously since 1947. Before then, HVO was under the administration of various Federal agencies—the U.S. Weather Bureau, at the time part of the Department of Agriculture, from 1919 to 1924; the USGS, which first managed HVO from 1924 to 1935; and the National Park Service from 1935 to 1947. For 76 of its first 100 years, HVO has been

  8. GEodesy Tools for Societal Issues (GETSI): Undergraduate curricular modules that feature geodetic data applied to critical social topics

    Science.gov (United States)

    Douglas, B. J.; Pratt-Sitaula, B.; Walker, B.; Miller, M. S.; Charlevoix, D.

    2014-12-01

    The GETSI project is a three-year NSF funded project to develop and disseminate teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (http://serc.carleton.edu/getsi). GETSI was born out of requests from geoscience faculty for more resources with which to educate future citizens and future geoscience professionals on the power and breadth of geodetic methods to address societally relevant topics. Development of the first two modules started at a February 2014 workshop and initial classroom testing begins in fall 2014. The Year 1 introductory module "Changing Ice and Sea Level" includes geodetic data such as gravity, satellite altimetry, and GPS time series. The majors-level Year 1 module is "Imaging Active Tectonics", in which students analyze InSAR and LiDAR data to assess infrastructure vulnerability to demonstrably active faults. Additional resources such as animations and interactive data tools are also being developed. The full modules will take about two weeks of class time; module design will permit portions of the material to be used as individual projects or assignments of shorter duration. Ultimately a total of four modules will be created and disseminated, two each at the introductory and majors levels. GETSI is working in tight partnership with the Science Education Resource Center's (SERC) InTeGrate project on module development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. This will allow for an optimized module development process based on successful practices defined by these earlier efforts.

  9. A sampler of useful computational tools for applied geometry, computer graphics, and image processing: foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  10. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Science.gov (United States)

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly noxious when new restoration treatments are needed and the best choice of restoration protocols has to be taken. The presence of protective compounds on stone surfaces may be detected in laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use only to a restricted number of samples and sampling sites. On the contrary, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and so particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in short times, many spectra and compositional maps at relatively low costs. In this study we analyzed a number of stone samples (Carrara Marble and biogenic calcarenites - "Lecce Stone" and "Maastricht Stone"-) after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon based polymers) of common use in conservation-restoration practice. The hyperspectral device used for this purpose was ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum). 
The reflectance spectra so far obtained in

  11. Simulation of 20-channel, 50-GHz, Si3N4-based arrayed waveguide grating applying three different photonics tools

    Science.gov (United States)

    Gajdošová, Lenka; Seyringer, Dana

    2017-02-01

    We present the design and simulation of a 20-channel, 50-GHz, Si3N4-based AWG using three different commercial photonics tools, namely PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc., and RSoft from Synopsys Inc. For this purpose we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations under the same calculation conditions. These AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The outputs of all simulations, the transmission characteristics, were used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results show very good correlation between the tools and are comparable to the design parameters in the AWG-Parameters tool.

  12. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Eric Moyer

    2016-04-01

    Full Text Available Network analysis can make variant analysis better. Existing tools like HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  13. Tackling Wicked Problems : The Development of a New Decision-Making Tool, Applied to the Estonian Oil Shale Conundrum

    OpenAIRE

    Spaulding, Jeannette

    2014-01-01

    Wicked problems are a special subset of particularly complex issues that current problem-solving tools fail to fully address. Because of this deficiency, a new tool for evaluating and resolving wicked problems must be developed. Theories such as anti-positivism and systems thinking are explored in order to understand the nature of wicked problems, which are often defined by the involvement of multiple stakeholders as well as non-linear interrelations between various elements of the problem. Al...

  14. Applying Total Quality Management Tools Using QFD at Higher Education Institutions in Gulf Area (Case Study: ALHOSN University

    Directory of Open Access Journals (Sweden)

    Adnan Al-Bashir

    2016-07-01

    Full Text Available The quality of human resources plays a key role in the growth and development of societies, and it can be enriched by the high-quality education provided by higher education institutions. Higher education institutions are thus an important sector of any society, since they shape the overall quality of human lives. This research investigates the application of Total Quality Management (TQM) tools at higher education institutions, specifically at ALHOSN University. In this study, five tools were implemented at ALHOSN University's engineering college: Quality Function Deployment, Affinity Diagrams, Tree Diagrams, Pareto Charts, and Fishbone Diagrams. The research reveals that the implementation of TQM tools is of great benefit to higher education institutions, as it uncovered many areas of potential improvement as well as the main causes of some of the problems the Faculty of Engineering is facing. It also shows that implementing TQM tools in higher education institution systems enhances the performance of such institutions.
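
    Of the five tools listed, the Pareto chart is the most mechanical to reproduce: tally problem causes, sort them in descending order, and accumulate percentages until the "vital few" emerge. A minimal sketch with invented complaint categories (not the study's data):

    ```python
    def pareto(causes):
        """Rank causes by frequency and return (cause, count, cumulative %)
        rows -- the tabulation behind a Pareto chart."""
        total = sum(causes.values())
        rows, cum = [], 0
        for cause, count in sorted(causes.items(), key=lambda kv: -kv[1]):
            cum += count
            rows.append((cause, count, round(100 * cum / total, 1)))
        return rows

    # Hypothetical complaint tallies for a faculty; the "vital few" are the
    # causes up to roughly the 80% cumulative line.
    rows = pareto({"late feedback": 40, "unclear syllabus": 25,
                   "room scheduling": 20, "lab equipment": 10, "other": 5})
    ```

    Plotting the counts as bars with the cumulative percentage as a line gives the familiar chart; the same tabulation also feeds the fishbone analysis of the leading causes.
    
    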

  15. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    NARCIS (Netherlands)

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist, oceanographer, and Officer of the Order of Oranje Nassau, was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of rece

  16. Experiences and Results of Applying Tools for Assessing the Quality of a mHealth App Named Heartkeeper.

    Science.gov (United States)

    Martínez-Pérez, Borja; de la Torre-Díez, Isabel; López-Coronado, Miguel

    2015-11-01

    Currently, many incomplete mobile apps can be found in commercial stores: apps with bugs or low quality that need serious improvement. The aim of this paper is to use two different tools to assess the quality of Heartkeeper, a mHealth app for the self-management of heart diseases by the patients themselves. The first tool measures compliance with the Android guidelines given by Google, and the second measures the users' Quality of Experience (QoE). The results obtained indicated that Heartkeeper follows the Android guidelines in many cases, especially in its structure, and offers a satisfactory QoE for its users, with special mention of aspects such as the learning curve, availability and appearance. As a result, Heartkeeper proved to be a satisfactory app from the point of view of both Google's guidelines and the users. The conclusion is that tools of this type, which measure the quality of an app, can be very useful for developers in finding aspects that need improvement before releasing their apps. Doing so would dramatically decrease the number of low-quality applications released, so these techniques are strongly recommended to all app developers.

  17. MATLAB-Simulink tool development for process identification and tuning tool applied to a HDT pilot plant; Desenvolvimento de ferramenta MATLAB-Simulink para identificacao de processos e sintonia de malhas aplicada a uma planta piloto HDT

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Viviane; Araujo, Ofelia; Vaz Junior, Carlos Andre [Universidade Federal, Rio de Janeiro, RJ (Brazil). Escola de Quimica]. E-mail: vfonseca@eq.ufrj.br

    2003-07-01

    This research presents a process identification and PID tuning tool applied to an HDT pilot plant located at the School of Chemistry of the Federal University of Rio de Janeiro, Brazil, in collaboration with the petrochemical company COPENE. MATLAB and its Simulink library are used to build the tool, performing system optimization and process simulation, respectively. Plant input and output data are used for open-loop process identification, estimating the transfer functions by the least squares method. The control-loop tuning for the reactor pressure is determined from the transfer-function parameters estimated in the identification step, based on the ITAE method. All obtained information is accepted only if the convergence criterion is achieved. The tuning results yield a successful pressure control response, and the tool could be used with other chemical processes. (author)
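
    The open-loop identification step described here, estimating transfer-function parameters from plant input/output records by least squares, can be sketched with a first-order discrete-time (ARX) model. The plant coefficients and the noiseless synthetic data below are invented for illustration; the actual tool fits continuous transfer functions from HDT plant records:

    ```python
    import numpy as np

    def identify_arx(u, y):
        """Least-squares fit of a first-order ARX model
        y[k] = a*y[k-1] + b*u[k-1] from input u and output y records."""
        Phi = np.column_stack([y[:-1], u[:-1]])          # regressor matrix
        theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
        return theta                                      # [a, b]

    # Synthetic open-loop data from a known plant (a = 0.8, b = 0.5).
    rng = np.random.default_rng(0)
    u = rng.standard_normal(200)
    y = np.zeros(200)
    for k in range(1, 200):
        y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

    a, b = identify_arx(u, y)   # recovers the plant coefficients
    ```

    The identified model can then feed a tuning rule such as ITAE, exactly as the note describes for the reactor pressure loop.
    
    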

  18. A Simulation Tool for Steady State Thermal Performance Applied to the SPL Double-Walled Tube RF Power Coupler

    CERN Document Server

    Bonomi, R

    2014-01-01

    This note reports on the study carried out to design a tool for evaluating the steady-state thermal performance of the RF power coupler inside the SPL cryostat. To reduce the amount of heat penetrating the helium bath in which the cavity is placed, the main coupler is actively cooled by an adequate flow rate of helium gas. Knowledge of the temperature profiles and the overall thermal performance of the power coupler is fundamental for estimating the total heat load budget of the cryostat.

  19. Applied acoustics concepts, absorbers, and silencers for acoustical comfort and noise control alternative solutions, innovative tools, practical examples

    CERN Document Server

    Fuchs, Helmut V

    2013-01-01

    The author gives a comprehensive overview of materials and components for noise control and acoustical comfort. Sound absorbers must meet acoustical and architectural requirements, which fibrous or porous material alone can meet. Basics and applications are demonstrated, with representative examples for spatial acoustics, free-field test facilities and canal linings. Acoustic engineers and construction professionals will find some new basic concepts and tools for developments in order to improve acoustical comfort. Interference absorbers, active resonators and micro-perforated absorbers of different materials and designs complete the list of applications.

  20. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    Energy Technology Data Exchange (ETDEWEB)

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is essential to the successful planning and implementation of future agency actions. Lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  1. Population Dynamics P system (PDP) models: a standardized protocol for describing and applying novel bio-inspired computing tools.

    Science.gov (United States)

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J

    2013-01-01

    Today, the volume of data and knowledge of processes necessitates more complex models that integrate all available information. This handicap has been overcome thanks to technological advances in both software and hardware. The computational tools available today have allowed the development of a new family of models, known as computational models. Describing these models is difficult, as they cannot be expressed analytically, and it is therefore necessary to create protocols that serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modelling complex problems; they are characterized by the ability to work in parallel (simultaneously interrelating different processes), are modular, and have high computational efficiency. The difficulty of describing these models, however, requires a protocol to unify their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for population dynamics and ecological process studies, briefly discussing their potential applicability for simulating complex ecosystem dynamics.

  2. Development of a cost efficient methodology to perform allocation of flammable and toxic gas detectors applying CFD tools

    Energy Technology Data Exchange (ETDEWEB)

    Storch, Rafael Brod; Rocha, Gean Felipe Almeida [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Nalvarte, Gladys Augusta Zevallos [Det Norske Veritas (DNV), Novik (Norway)

    2012-07-01

    This paper presents a computational procedure, developed by DNV, for the allocation and quantification of flammable and toxic gas detectors. The proposed methodology applies Computational Fluid Dynamics (CFD) simulations, together with the operational and safety characteristics of the analyzed region, to assess the optimal number of toxic and flammable gas detectors and their optimal locations. A probabilistic approach is also used for the flammable gas detectors, applying the DNV software ThorEXPRESSLite, following NORSOK Z013 Annex G and the methods presented in HUSER et al. 2000 and HUSER et al. 2001. A DNV-developed program, DetLoc, runs the above procedure iteratively, leading to automatic calculation of the location and number of gas detectors. The main advantage of this methodology is its independence from human interaction in the detector allocation, leading to placement that is more precise and free of human judgment. Thus, a reproducible allocation is generated when comparing several different analyses, and consistent application of global criteria is guaranteed across different regions in the same project. A case study applying the proposed methodology is presented. (author)

  3. The Black Top Hat function applied to a DEM: A tool to estimate recent incision in a mountainous watershed (Estibère Watershed, Central Pyrenees)

    Science.gov (United States)

    Rodriguez, Felipe; Maire, Eric; Courjault-Radé, Pierre; Darrozes, José

    2002-03-01

    The Top Hat Transform is a grey-level image analysis tool for extracting peaks and valleys from a non-uniform background. It can be applied to a grey-level Digital Elevation Model (DEM), and is used here to quantify the volume of recently incised material in a Pyrenean mountain watershed. A grey-level closing operation applied to the present-day DEM gives a new image called the "paleo" DEM. The Black Top Hat function then subtracts the present-day DEM from the "paleo" DEM, yielding a new DEM representing all valleys whose sizes range between the size of the structuring element and zero, as no threshold is used. The incised volume is derived directly from the subtraction of the two DEMs. The geological significance of the quantitative results is discussed.
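
    A minimal numeric sketch of the Black Top Hat operation on a DEM, using brute-force grey-level morphology in pure NumPy; the window size and the toy elevations are assumptions, not the study's data:

    ```python
    import numpy as np

    def _filter(img, size, func):
        """Brute-force moving-window max/min filter with edge replication."""
        pad = size // 2
        p = np.pad(img, pad, mode="edge")
        out = np.empty_like(img)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j] = func(p[i:i + size, j:j + size])
        return out

    def black_top_hat(dem, size=3):
        """Grey-level closing (dilation then erosion) builds the 'paleo'
        surface that fills valleys narrower than the structuring element;
        subtracting the present-day DEM leaves the incised material."""
        paleo = _filter(_filter(dem, size, np.max), size, np.min)
        return paleo - dem

    # Toy DEM: a one-cell-wide valley incised 10 m into a flat 100 m surface.
    dem = np.full((5, 5), 100.0)
    dem[:, 2] = 90.0
    incision = black_top_hat(dem)   # 10.0 along the valley, 0.0 elsewhere
    ```

    Summing the positive cells of `incision` (times the cell area) gives the incised-volume estimate described in the abstract.
    
    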

  4. Poor reliability between Cochrane reviewers and blinded external reviewers when applying the Cochrane risk of bias tool in physical therapy trials.

    Directory of Open Access Journals (Sweden)

    Susan Armijo-Olivo

    Full Text Available OBJECTIVES: To test the inter-rater reliability of the RoB tool applied to Physical Therapy (PT) trials by comparing ratings from Cochrane review authors with those of blinded external reviewers. METHODS: Randomized controlled trials (RCTs) in PT were identified by searching the Cochrane Database of Systematic Reviews for meta-analyses of PT interventions. RoB assessments were conducted independently by 2 reviewers blinded to the RoB ratings reported in the Cochrane reviews. Data on RoB assessments from Cochrane reviews and other characteristics of reviews and trials were extracted. Consensus assessments between the two reviewers were then compared with the RoB ratings from the Cochrane reviews. Agreement between Cochrane and blinded external reviewers was assessed using weighted kappa (κ). RESULTS: In total, 109 trials included in 17 Cochrane reviews were assessed. Inter-rater reliability on the overall RoB assessment between Cochrane review authors and blinded external reviewers was poor (κ = 0.02, 95% CI: [-0.06, 0.06]). Inter-rater reliability on individual domains of the RoB tool was poor (median κ = 0.19), ranging from κ = -0.04 ("Other bias") to κ = 0.62 ("Sequence generation"). There was also no agreement (κ = -0.29, 95% CI: [-0.81, 0.35]) in the overall RoB assessment at the meta-analysis level. CONCLUSIONS: Risk of bias assessments of RCTs using the RoB tool are not consistent across different research groups. Poor agreement was demonstrated not only at the trial level but also at the meta-analysis level. These results have implications for decision making, since different recommendations can be reached depending on the group analyzing the evidence. Improved guidelines to consistently apply the RoB tool, and revisions to the tool for different health areas, are needed.
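
    The agreement statistic used throughout, weighted kappa for ordinal risk-of-bias ratings, can be computed directly from two raters' score vectors. The toy ratings below are invented, not the review's data:

    ```python
    import numpy as np

    def weighted_kappa(r1, r2, n_cat, weights="linear"):
        """Weighted Cohen's kappa for two raters' ordinal scores in
        {0, ..., n_cat-1}; 1 = perfect agreement, 0 = chance-level."""
        O = np.zeros((n_cat, n_cat))
        for a, b in zip(r1, r2):
            O[a, b] += 1
        O /= O.sum()
        # Expected (chance) agreement from the raters' marginal distributions.
        E = np.outer(O.sum(axis=1), O.sum(axis=0))
        i, j = np.indices((n_cat, n_cat))
        W = np.abs(i - j) if weights == "linear" else (i - j) ** 2
        return 1 - (W * O).sum() / (W * E).sum()

    # Toy RoB ratings: 0 = low, 1 = unclear, 2 = high risk of bias.
    cochrane = [0, 0, 1, 2, 1, 0, 2, 1]
    external = [0, 1, 1, 2, 0, 0, 2, 2]
    kappa = weighted_kappa(cochrane, external, 3)
    ```

    Weighting by the distance between categories penalizes a low-vs-high disagreement more than a low-vs-unclear one, which is why weighted kappa is preferred for ordinal scales like RoB ratings.
    
    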

  5. Selection of key ambient particulate variables for epidemiological studies - applying cluster and heatmap analyses as tools for data reduction.

    Science.gov (United States)

    Gu, Jianwei; Pitz, Mike; Breitner, Susanne; Birmili, Wolfram; von Klot, Stephanie; Schneider, Alexandra; Soentgen, Jens; Reller, Armin; Peters, Annette; Cyrys, Josef

    2012-10-01

    The success of epidemiological studies depends on the use of appropriate exposure variables. The purpose of this study is to extract a relatively small selection of variables characterizing ambient particulate matter from a large measurement data set. The original data set comprised 96 particulate matter variables measured continuously since 2004 at an urban background aerosol monitoring site in the city of Augsburg, Germany. Many of the original variables were derived from the particle size distribution (PSD) measured across the particle diameter range 3 nm to 10 μm, including size-segregated particle number, length, surface and mass concentrations. The data set was complemented by integral aerosol variables measured by independent instruments, including black carbon, sulfate, particle active surface concentration and particle length concentration. Such a large number of measured variables obviously cannot be used simultaneously in health effect analyses. The aim of this study is therefore a pre-screening and selection of the key variables to be used as input in forthcoming epidemiological studies. We present two methods of parameter selection and apply them to data from the two-year period 2007-2008. First, we used agglomerative hierarchical clustering to find groups of similar variables; in total, we selected 15 key variables from 9 clusters, which are recommended for epidemiological analyses. Second, we applied a two-dimensional visualization technique called "heatmap" analysis to the Spearman correlation matrix, by which 12 key variables were selected. Moreover, the positive matrix factorization (PMF) method was applied to the PSD data to characterize possible particle sources, and correlations between the variables and PMF factors were used to interpret the meaning of the cluster and heatmap analyses.
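
    A hedged sketch of the cluster-based data-reduction step: group variables by agglomerative clustering on a Spearman-correlation distance and keep one representative per cluster. The variable names, the linkage method, and the cut threshold are illustrative assumptions, not the study's exact protocol:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform
    from scipy.stats import spearmanr

    def select_key_variables(X, names, threshold=0.4):
        """Cluster the columns of X on distance 1 - |Spearman rho| and keep
        the first variable of each cluster as its representative."""
        rho = spearmanr(X).correlation          # pairwise rank correlations
        dist = 1 - np.abs(rho)
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        labels = fcluster(Z, t=threshold, criterion="distance")
        keep = {}
        for name, lab in zip(names, labels):
            keep.setdefault(lab, name)          # first member per cluster
        return sorted(keep.values())

    # Synthetic data: two nearly identical variables plus one independent one.
    rng = np.random.default_rng(1)
    base = rng.standard_normal(200)
    X = np.column_stack([base,
                         base + 0.01 * rng.standard_normal(200),
                         rng.standard_normal(200)])
    keys = select_key_variables(X, ["PN_ultrafine", "PN_total", "sulfate"])
    ```

    With real monitoring data, domain knowledge (rather than list order) would pick the cluster representative, mirroring the 15-from-9-clusters selection reported above.
    
    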

  6. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices......: it is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future...

  7. X-ray fluorescence - a non-destructive tool in investigation of Czech fine and applied art objects

    Science.gov (United States)

    Trojek, T.; Musílek, L.

    2017-08-01

    A brief review of the application of X-ray fluorescence analysis (XRFA) to fine and applied arts related to Czech cultural heritage is presented. The Department of Dosimetry and Application of Ionising Radiation of CTU-FNSPE has used XRFA, in collaboration with various Czech institutions dealing with cultural history, on many kinds of artefacts (e.g., Roman and medieval brass, gemstones and noble metals from the sceptre of one of the faculties of Charles University in Prague, millefiori beads, etc.). In some cases, a combination of various other techniques alongside XRFA was used to enhance our knowledge of a measured object.

  8. Lead-time reduction utilizing lean tools applied to healthcare: the inpatient pharmacy at a local hospital.

    Science.gov (United States)

    Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed

    2010-01-01

    The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating non-value-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aiming at reducing the wasted time associated with drug dispensing at an inpatient pharmacy at a local hospital. Thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of > 45% in the drug dispensing cycle time.

  9. Applying TRIZ and Fuzzy AHP Based on Lean Production to Develop an Innovative Design of a New Shape for Machine Tools

    Directory of Open Access Journals (Sweden)

    Ho-Nien Hsieh

    2015-03-01

    Full Text Available Companies face cutthroat competition and are forced to continuously outperform their competitors. To strengthen their position in a competitive world, organizations must embrace new ideals such as innovation. Today, innovative design in the development of new products has become a core value in most companies, and innovation is recognized as the main driving force in the market. This work applies the Russian theory of inventive problem solving (TRIZ) and the fuzzy analytic hierarchy process (FAHP) to design a new shape for machine tools. TRIZ offers several concepts and tools to facilitate concept creation and problem solving, while FAHP is employed as a decision-support tool that can adequately represent qualitative and subjective assessments in a multiple-criteria decision-making environment. In the machine tools industry, this is the first study to develop an innovative design under the concept of lean production. We used TRIZ to propose principles relevant to the shape's innovative design, and FAHP to evaluate and select the best feasible alternative from independent factors in a multiple-criteria decision-making environment. The contribution of this research is a scientific method, based on the lean production concept, for designing a new product and improving the old design process.

  10. 100 years of Planck's quantum

    CERN Document Server

    Duck, Ian M

    2000-01-01

    This invaluable book takes the reader from Planck's discovery of the quantum in 1900 to the most recent interpretations and applications of nonrelativistic quantum mechanics. The introduction of the quantum idea leads off the prehistory of quantum mechanics, featuring Planck, Einstein, Bohr, Compton, and de Broglie's immortal contributions. Their original discovery papers are featured with explanatory notes and developments in Part 1. The invention of matrix mechanics and quantum mechanics by Heisenberg, Born, Jordan, Dirac, and Schrödinger is presented next, in Part 2. Following that, in Part 3,

  11. 100 Years of Chinese Education

    Institute of Scientific and Technical Information of China (English)

    ZHANG ZHIPING

    2010-01-01

    This year marks the 100th anniversary of the foundation of the Department of Chinese Language and Literature (DOCLL) of Peking University. On March 31, 1910, the Imperial Capital University (the predecessor of today's Peking University) was established, with a Chinese literature school as an independent teaching unit.


  13. 100 years of radionuclide metrology.

    Science.gov (United States)

    Judge, S M; Arnold, D; Chauvenet, B; Collé, R; De Felice, P; García-Toraño, E; Wätjen, U

    2014-05-01

    The discipline of radionuclide metrology at national standards institutes started in 1913 with the certification by Curie, Rutherford and Meyer of the first primary standards of radium. In early years, radium was a valuable commodity and the aim of the standards was largely to facilitate trade. The focus later changed to providing standards for the new wide range of radionuclides, so that radioactivity could be used for healthcare and industrial applications while minimising the risk to patients, workers and the environment. National measurement institutes responded to the changing demands by developing new techniques for realising primary standards of radioactivity. Looking ahead, there are likely to be demands for standards for new radionuclides used in nuclear medicine, an expansion of the scope of the field into quantitative imaging to facilitate accurate patient dosimetry for nuclear medicine, and an increasing need for accurate standards for radioactive waste management and nuclear forensics.

  14. FEMA 100 year Flood Data

    Data.gov (United States)

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  15. Inspecting what you expect: Applying modern tools and techniques to evaluate the effectiveness of household energy interventions

    Science.gov (United States)

    Pillarisetti, Ajay

    Exposure to fine particles (PM2.5) resulting from solid fuel use for household energy needs - including cooking, heating, and lighting - is one of the leading causes of ill-health globally, responsible for approximately 4 million premature deaths and 84 million lost disability-adjusted life years. The well-established links between cooking and ill-health are modulated by complex social, behavioral, technological, and environmental issues that pose unique challenges to efforts that seek to reduce this large health burden. Despite growing interest in the field - and numerous technical solutions that, in the laboratory at least, reduce emissions of harmful air pollutants from solid fuel combustion - there exists a need for refined tools, models, and techniques (1) for measuring environmental pollution in households using solid fuel, (2) for tracking adoption of interventions, and (3) for estimating the potential health benefits attributable to an intervention. Part of the need for higher spatial and temporal resolution data on particle concentrations and dynamics is being met by low-cost sensing platforms that provide large amounts of time-resolved data on critical parameters of interest, including PM2.5 concentrations and time-of-use metrics for heat-generating appliances, like stoves. Use of these sensors can result in non-trivial challenges, including those related to data management and analysis, and field logistics, but also enables novel lines of inquiry and insight. Chapter 2 presents a long-term deployment of real-time PM2.5 sensors in rural, solid-fuel-using kitchens, specifically seeking to evaluate how well commonly measured 24- or 48-hour samples represent long-term means. While short-term measures were poor predictors of long-term means, the dataset enabled evaluation of numerous sampling strategies - including sampling once per week, month, or season - that had much lower errors and higher probabilities of estimating the true mean.
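The sampling-strategy comparison can be illustrated with a small simulation. Everything here is synthetic (the lognormal concentration model, the 500-trial Monte Carlo, and the 30-day "months" are all assumptions, not the study's data); it only shows why spreading the same few sampling days across the year beats one short consecutive sample.

```python
# Synthetic comparison of two sampling strategies for estimating a long-term
# kitchen PM2.5 mean: one random 2-day sample vs. one random day per month.
import numpy as np

rng = np.random.default_rng(1)
days = 365
daily = rng.lognormal(mean=5.0, sigma=0.8, size=days)  # synthetic daily means, ug/m3
true_mean = daily.mean()

def rel_error(estimate):
    return abs(estimate - true_mean) / true_mean

errs_single, errs_monthly = [], []
for _ in range(500):
    # Strategy A: a single 2-day (48-hour) sample at a random start day
    start = rng.integers(0, days - 1)
    errs_single.append(rel_error(daily[start:start + 2].mean()))
    # Strategy B: one random day in each of twelve 30-day blocks
    picks = rng.integers(0, 30, size=12) + 30 * np.arange(12)
    errs_monthly.append(rel_error(daily[picks].mean()))

print(round(float(np.mean(errs_single)), 3), round(float(np.mean(errs_monthly)), 3))
```

Averaged over many trials, the once-per-month scheme has a markedly lower relative error than the single 48-hour sample, mirroring the chapter's finding.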

  16. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    Science.gov (United States)

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the parties involved. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's overall outcomes, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis on the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
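The AHP machinery behind such priority rankings can be sketched briefly. The comparison matrix below is hypothetical, not the study's judgment data; the sketch shows the standard steps of deriving priority weights from the principal eigenvector and checking Saaty's consistency ratio (the "inconsistency" the abstract mentions reducing).

```python
# Minimal AHP sketch on an invented 3-criterion pairwise comparison matrix:
# priority weights from the principal eigenvector, plus the consistency ratio
# (CR < 0.1 is the conventional acceptability threshold).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])   # reciprocal judgment matrix (hypothetical)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority weights, sum to 1

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index for n criteria
cr = ci / ri                          # consistency ratio
print(np.round(w, 3), round(float(cr), 3))
```

Reclassifying objects into groups, as the project did, shrinks each comparison matrix and thus tends to lower the consistency ratio of the participants' judgments.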

  17. Network Analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish Water Management System.

    Science.gov (United States)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-03-30

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. This study aims at using Network Analysis (NA) to move such "ecosystem perspective" from theory into practice. A Danish municipal Water Management System (WMS) is used as a case study to test the NA methodology and to discuss its generic applicability. We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of system-level performance for seven different network configurations illustrating past conditions (2004-2008) and future scenarios (2015 and 2020). We also computed the same indices for 24 other human systems and for 12 ecosystems, by using information from the existing scientific literature on NA. The comparison of these results reveals that the WMS is similar to the other human systems and that human systems generally differ from ecosystems. The WMS is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future performance of the WMS are marginal. We argue that future interventions should create alternative pathways for reusing rainwater within the WMS, increasing its potential to withstand the occurrence of flooding. We discuss advantages, limitations, and general applicability of NA as a tool for assessing environmental sustainability in human systems.
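Two of the system-level indices used in this style of ecological network analysis can be computed directly from a flow matrix. The toy matrix below is invented, not the WMS data; Total System Throughput is a typical extensive index and Average Mutual Information a typical intensive one.

```python
# Two system-level network indices on a toy flow matrix (illustrative values):
# Total System Throughput (extensive) and Average Mutual Information (intensive).
import numpy as np

# T[i, j] = flow from compartment i to compartment j (e.g., m^3/yr)
T = np.array([[0.0, 80.0, 0.0],
              [0.0, 0.0, 60.0],
              [20.0, 0.0, 0.0]])

tst = T.sum()                        # Total System Throughput
p = T / tst                          # joint flow probabilities
pi = p.sum(axis=1, keepdims=True)    # row (outflow) marginals
pj = p.sum(axis=0, keepdims=True)    # column (inflow) marginals
with np.errstate(divide="ignore", invalid="ignore"):
    terms = np.where(p > 0, p * np.log2(p / (pi * pj)), 0.0)
ami = terms.sum()                    # Average Mutual Information (bits)
print(float(tst), round(float(ami), 3))
```

A rigid, almost linear network such as this one yields a high AMI (highly constrained, efficient flows), which is exactly the efficiency-versus-resilience trade-off the study highlights for the WMS.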

  18. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    Science.gov (United States)

    Seaquist, J. W.; Li Johansson, Emma; Nicholas, Kimberly A.

    2014-11-01

    Global land acquisitions, often dubbed ‘land grabbing’, are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network has very few trading cliques and is therefore characterized by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling.
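The two network measures central to this analysis, betweenness (connectivity provided by hub countries) and degree assortativity (the disassortative import/export pattern), can be sketched with networkx on an invented mini trade network. Country codes and edges here are hypothetical, not the study's dataset.

```python
# Toy directed land-trade network: edge u -> v means u acquires land from v.
# Countries and edges are invented for illustration.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("CHN", "BRA"), ("GBR", "BRA"), ("USA", "BRA"),  # hub exporter/re-trader
    ("BRA", "ARG"), ("BRA", "PRY"),
    ("CHN", "ETH"), ("GBR", "SDN"),
])

# Betweenness centrality: which country carries the shortest trading paths?
bc = nx.betweenness_centrality(G)
hub = max(bc, key=bc.get)

# Degree assortativity: negative values mean high-degree nodes link to
# low-degree ones -- the "disassortative" pattern the study reports.
r = nx.degree_assortativity_coefficient(G)
print(hub, round(r, 3))
```

In this toy graph "BRA" sits on every shortest path from the big importers to its own export partners, so it dominates betweenness, and the mix of high-degree importers trading with single-partner exporters drives the assortativity coefficient negative.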

  19. Simple model of dissolved oxygen consumption in a bay within high organic loading: an applied remediation tool.

    Science.gov (United States)

    Ahumada, Ramón; Vargas, José; Pagliero, Liliana

    2006-07-01

    San Vicente Bay is a shallow coastal embayment in Central Chile with multiple uses, one of which is receiving wastewater from industrial fisheries, steel mill effluents, and domestic sewage. A simulation model was developed and applied to dissolved oxygen consumption by organic residues released into this embayment. Three compartments were established as a function of depth, circulation, and outfall location. The model compartments had different volumes, and their oxygen saturation value was used as a baseline. The parameters (a) BOD5 of the industrial and urban effluents, (b) oxygen demand by organic sediments, (c) respiration, (d) photosynthesis, and (e) re-aeration were included in the model. Iteration results of the model showed severe alterations in Compartment 1, with a decrease of 65% in the oxygen below saturation. Compartment 2 showed a small decline (10%) and Compartment 3 did not show apparent changes in oxygen values. The measure recommended for remediation was to decrease the BOD5 loading by 30% in the affected sector. Iteration of the model for 200 h following this recommendation produced an increase in saturation to 60% (5 ml O2 L(-1)), which suggested an improvement of the environmental conditions.
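The paper's three-compartment model is not reproduced here, but the kind of oxygen balance it iterates can be sketched for a single compartment: first-order BOD decay consumes oxygen while re-aeration pushes the deficit back toward saturation. All rate constants and the O2-per-BOD yield below are invented.

```python
# Minimal single-compartment dissolved-oxygen balance (illustrative constants):
# BOD decays at first order and consumes O2; re-aeration restores the deficit.

o2_sat = 5.0   # saturation DO, ml O2/L (the baseline value used in the study)
k_bod = 0.05   # 1/h, BOD decay rate (hypothetical)
k_rea = 0.02   # 1/h, re-aeration rate (hypothetical)
yield_o2 = 0.1 # ml O2 consumed per unit of BOD decayed (hypothetical)
bod = 8.0      # organic load (hypothetical units)
o2 = o2_sat
dt = 1.0       # h, forward-Euler time step

for _ in range(200):                 # iterate 200 h, as in the paper's runs
    decay = k_bod * bod
    bod -= decay * dt
    o2 += (k_rea * (o2_sat - o2) - yield_o2 * decay) * dt
    o2 = max(o2, 0.0)

print(round(o2, 2), round(bod, 4))
```

Over the 200 h run the load is almost fully oxidized and the oxygen sag recovers back toward saturation, the same qualitative behaviour the remediation scenario produced.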

  20. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Energy Technology Data Exchange (ETDEWEB)

    Végh, J., E-mail: janos.vegh@ec.europa.eu [Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Pós, I., E-mail: pos@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Horváth, Cs., E-mail: csaba.horvath@energia.mta.hu [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, P.O. Box 49 (Hungary); Kálya, Z., E-mail: kalyaz@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Parkó, T., E-mail: parkot@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Ignits, M., E-mail: ignits@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary)

    2015-10-15

    Between 2003 and 2007 the Hungarian Paks NPP performed a large modernization project to upgrade its VERONA core monitoring system. The modernization work resulted in a state-of-the-art system that was able to support the reactor thermal power increase to 108% by more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008); the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting a large fraction of the original application software to the new environment; implementation of a new human-system interface; and, last but not least, introduction of new reactor physics calculations. The basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. The new calculations also provided much finer spatial resolution, both in terms of axial node numbers and within the fuel assemblies. The new system was able to model accurately the fuel applied during the first phase of the power increase, but it was not tailored to determine the effects of burnable absorbers such as gadolinium. However, in the second phase of the power increase process the application of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy), therefore the off-line and on-line VERONA reactor physics models had to be further modified to be able to handle the new fuel according to the accuracy requirements. In the present paper first a brief overview of the system version (V6.0) commissioned after the first modernization step is outlined; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for the new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  1. Undergraduate teaching modules featuring geodesy data applied to critical social topics (GETSI: GEodetic Tools for Societal Issues)

    Science.gov (United States)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Charlevoix, D. J.; Miller, M. M.

    2015-12-01

    The GETSI project, funded by NSF TUES, is developing and disseminating teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (serc.carleton.edu/getsi). It is a collaboration among UNAVCO (NSF's geodetic facility), Mt San Antonio College, and Indiana University. GETSI was initiated after requests by geoscience faculty for geodetic teaching resources for introductory and majors-level students. Full modules take two weeks, but module subsets can also be used. Modules are developed and tested by two co-authors and also tested in a third classroom. GETSI is working in partnership with the Science Education Resource Center's (SERC) InTeGrate project on development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. Two GETSI modules are being published in October 2015. "Ice mass and sea level changes" includes geodetic data from GRACE, satellite altimetry, and GPS time series. "Imaging Active Tectonics" has students analyzing InSAR and LiDAR data to assess infrastructure earthquake vulnerability. Another three modules are in testing during fall 2015 and will be published in 2016. "Surface process hazards" investigates mass wasting hazard and risk using LiDAR data. "Water resources and geodesy" uses GRACE, vertical GPS, and reflection GPS data to have students investigate droughts in California and the High Great Plains. "GPS, strain, and earthquakes" helps students learn about infinitesimal and coseismic strain through analysis of horizontal GPS data and includes an extension module on the Napa 2014 earthquake. In addition to teaching resources, the GETSI project is compiling recommendations on successful development of geodesy curricula. The chief recommendations so far are the critical importance of including scientific experts in the authorship team and investing significant resources in

  2. History of views on the relative positions of Antarctica and South America: A 100-year tango between Patagonia and the Antarctic Peninsula

    Science.gov (United States)

    Miller, H.

    2007-01-01

    Discussion of continental drift around Antarctica began nearly 100 years ago. While the Gondwana connections of Antarctica to Africa and Australia have been well defined for decades, the relative pre-drift positions of the Antarctic Peninsula and Patagonia continue to be subjects of controversy. Older reconstructions, which showed a paleo-position of the Peninsula crossing over continental crust of the Falkland Plateau, or even South Africa or Patagonia, are now out of consideration. But contradictory opinions remain over the relative paleo-position of the Peninsula as a more or less straight prolongation of the Patagonian Andes, versus a position parallel to Patagonia along the Pacific coast. Geological arguments can be found for both opinions, but geophysical observations on the adjacent ocean floors, particularly the evolution of the Weddell Sea crust, support the latter reconstruction.

  3. CoSMoS Southern California v3.0 Phase 1 (100-year storm) flood hazard projections: Los Angeles, San Diego and Orange counties

    Science.gov (United States)

    Barnard, Patrick; Erikson, Li; Foxgrover, Amy; O'Neill, Andrea; Herdman, Liv

    2015-01-01

    The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (100s of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios. CoSMoS v3.0 for Southern California shows projections for future climate scenarios (sea-level rise and storms) to provide emergency responders and coastal planners with critical storm-hazards information that can be used to increase public safety, mitigate physical damages, and more effectively manage and allocate resources within complex coastal settings. Phase I data for Southern California include flood-hazard information for the coast from the Mexican Border to Pt. Conception for a 100-year storm scenario. Data are complete for the information presented but are considered preliminary; changes may be reflected in the full data release (Phase II) in summer 2016.

  4. Barriers and facilitators for implementing a new screening tool in an emergency department: A qualitative study applying the Theoretical Domains Framework.

    Science.gov (United States)

    Kirk, Jeanette W; Sivertsen, Ditte M; Petersen, Janne; Nilsen, Per; Petersen, Helle V

    2016-10-01

    The aim was to identify the factors that were perceived as most important as facilitators or barriers to the introduction and intended use of a new tool in the emergency department among nurses and a geriatric team. A high incidence of functional decline after hospitalisation for acute medical illness has been shown in the oldest patients and those who are physically frail. In Denmark, more than 35% of older medical patients acutely admitted to the emergency department are readmitted within 90 days after discharge. A new screening tool for use in the emergency department aiming to identify patients at particularly high risk of functional decline and readmission was developed. Qualitative study based on semistructured interviews with nurses and a geriatric team in the emergency department and semistructured single interviews with their managers. The Theoretical Domains Framework guided data collection and analysis. Content analysis was performed whereby new themes and themes already existing within each domain were described. Six predominant domains were identified: (1) professional role and identity; (2) beliefs about consequences; (3) goals; (4) knowledge; (5) optimism and (6) environmental context and resources. The content analysis identified three themes, each containing two subthemes. The themes were professional role and identity, beliefs about consequences and preconditions for a successful implementation. Two different cultures were identified in the emergency department. These cultures applied to different professional roles and identity, different actions and sense making and identified how barriers and facilitators linked to the new screening tool were perceived. The results show that different cultures exist in the same local context and influence the perception of barriers and facilitators differently. These cultures must be identified and addressed when implementation is planned. © 2016 The Authors. Journal of Clinical Nursing Published by John

  5. Geographic information systems and applied spatial statistics are efficient tools to study Hansen's disease (leprosy) and to determine areas of greater risk of disease.

    Science.gov (United States)

    Queiroz, José Wilton; Dias, Gutemberg H; Nobre, Maurício Lisboa; De Sousa Dias, Márcia C; Araújo, Sérgio F; Barbosa, James D; Bezerra da Trindade-Neto, Pedro; Blackwell, Jenefer M; Jeronimo, Selma M B

    2010-02-01

    Applied Spatial Statistics used in conjunction with geographic information systems (GIS) provide an efficient tool for the surveillance of diseases. Here, using these tools we analyzed the spatial distribution of Hansen's disease in an endemic area in Brazil. A sample of 808 selected from a universe of 1,293 cases was geocoded in Mossoró, Rio Grande do Norte, Brazil. Hansen's disease cases were not distributed randomly within the neighborhoods, with higher detection rates found in more populated districts. Cluster analysis identified two areas of high risk, one with a relative risk of 5.9 (P = 0.001) and the other 6.5 (P = 0.001). A significant relationship between the geographic distribution of disease and the social economic variables indicative of poverty was observed. Our study shows that the combination of GIS and spatial analysis can identify clustering of transmissible disease, such as Hansen's disease, pointing to areas where intervention efforts can be targeted to control disease.

  6. The development of vat dyes over 100 years (to be continued) [还原染料百年发展史话(待续)]

    Institute of Scientific and Technical Information of China (English)

    陈荣圻

    2015-01-01

    The first vat dye (Vat Dye RSN) was synthesized and produced by BASF in 1901, more than a hundred years ago; counting from BASF's synthetic indigo of 1897, the history is longer still. Vat dyes are expensive because of their complex chemical structures and long synthesis processes, which generate large amounts of waste ("three wastes") that are difficult to treat. However, their brilliant colors and high color density cannot be matched by other dyes for cotton. Beyond printing and dyeing, some vat dyes can be converted by pigmentation into high-grade organic pigments, and certain varieties extend into high-tech fields such as optical physics and electrochemistry (liquid crystals, photoconductive materials), making them indispensable functional materials with renewed relevance.

  7. Reconstruction of soil moisture for the past 100 years in eastern Siberia by using δ13C of larch tree rings

    Science.gov (United States)

    Tei, Shunsuke; Sugimoto, Atsuko; Yonenobu, Hitoshi; Yamazaki, Takeshi; Maximov, Trofim C.

    2013-07-01

    A stable carbon isotope ratio (δ13C) chronology for the past 100 years was developed from larch tree rings in eastern Siberia (near Yakutsk, 62°14'N, 129°37'E) to reconstruct past soil moisture water equivalent (SWE). Based on correlation analyses between SWE and tree ring δ13C, we developed a linear regression model for SWE in the late growing period (LGP: 15 July to 31 August) using annual tree ring δ13C, calculated from the combination of latewood in a current year and earlywood in the following year, and then reconstructed SWE (LGP) for 1908-2007. The reconstructed SWE was compared with factors such as the output of a land surface model, annual precipitation, and the Palmer Drought Severity Index for July. The reconstruction appears reasonable and shows large variation, including repeated occurrences of severe drought and an unprecedented high soil moisture event in 2006-2007 during the past 100 years. It also captured a documented record of severe drought in the 1940s. Despite the generally good performance of the reconstruction, by the 1930s the estimated SWE was higher than that expected from the annual precipitation. Tree ring width and δ13C were negatively correlated in most periods. However, the negative correlation was weaker for the period from 1919 to 1925, when relatively low air temperatures were observed. This result suggests that the rate of photosynthesis, together with the degree of stomatal opening, also affected the tree ring δ13C during cool periods.
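The reconstruction rests on an ordinary least-squares regression of SWE on tree-ring δ13C over a calibration period; the sketch below fits and applies such a model on invented numbers (the study's calibration data are not reproduced here).

```python
# Calibrate a linear SWE ~ d13C model on a hypothetical instrumental period,
# then apply it to a "pre-instrumental" ring value. Numbers are invented.
import numpy as np

d13c = np.array([-26.1, -25.4, -24.9, -25.8, -24.5, -26.4, -25.0, -24.2])  # permil
swe  = np.array([ 95.0,  78.0,  64.0,  88.0,  55.0, 103.0,  70.0,  48.0])  # mm

slope, intercept = np.polyfit(d13c, swe, 1)   # least-squares calibration line
r = np.corrcoef(d13c, swe)[0, 1]              # calibration correlation

# apply the model to one "pre-instrumental" annual ring d13C value
swe_hat = slope * (-25.6) + intercept
print(round(float(slope), 1), round(float(r), 2), round(float(swe_hat), 1))
```

The negative slope encodes the physical expectation: drier conditions close stomata, enrich ring cellulose in 13C (less negative δ13C), and correspond to lower soil moisture.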

  8. Changes in C37 alkenones flux on the eastern continental shelf of the Bering Sea: the record of Emiliania huxleyi bloom over the past 100 years

    Science.gov (United States)

    Harada, N.; Sato, M.; Okazaki, Y.; Oguri, K.; Tadai, O.; Saito, S.; Konno, S.; Jordan, R. W.; Katsuki, K.; Shin, K.; Narita, H.

    2008-12-01

    Flourishes of coccolithophores can be detected by ocean color imagery with data from the satellite-borne Sea-viewing Wide Field-of-view Sensor (SeaWiFS), launched in 1997. Thus, temporally and spatially large-scale blooms of Emiliania huxleyi (E. huxleyi) have been distinguished annually on the eastern continental shelf of the Bering Sea since 1997. In 1997, a combination of atmospheric mechanisms produced summer weather anomalies such as calm winds, clear skies, and warm air temperature over the Bering Sea, and the weather anomalies caused depletion of the subpycnocline nutrient reservoir (Napp and Hunt, 2001). After depletion of nitrate and silicate, a sustained (more than 4-month-long) bloom of E. huxleyi was observed (Stockwell et al., 2001). Because of the speed and magnitude with which parts of the Bering Sea ecosystem responded to changes in atmospheric factors (Napp and Hunt, 2001), and because a bloom of the coccolithophorid Coccolithus pelagicus has also been detected in the northeastern Atlantic Ocean off Iceland every year since 1997 (Ostermann, 2001), the appearance of an E. huxleyi bloom in the Bering Sea could be related to atmospherically forced decadal oscillations or global factors. We have investigated the spatial expansion and temporal development of the E. huxleyi bloom on the continental shelf of the Bering Sea by using a biomarker of E. huxleyi, the C37 alkenone flux recorded in the sediments during the past 100 years. As a result, the E. huxleyi bloom has been prominent since the 1970s at the latest, within the last 100 years. In this presentation, we will discuss the relationship between the E. huxleyi bloom and the activity of the Aleutian Low, and also changes in diatom assemblages. References: Napp and Hunt, 2001, Fish Oceanogr., 10, 61-68. Ostermann, 2001, WHOI annual report, pp. 17-18. Stockwell et al., 2001, Fish Oceanogr., 10, 99-116.

  9. A 100-Year Retrospective Landscape-Level Carbon Budget for the Sooke Lake Watershed, British Columbia: Constraining Estimates of Terrestrial to Aquatic DOC Transfers.

    Science.gov (United States)

    Trofymow, J. A.; Smiley, B. P. K.

    2014-12-01

    To address how natural disturbance, forest harvest, and deforestation from reservoir creation affect landscape-level carbon (C) budgets, a retrospective C budget for the 8500 ha Sooke watershed from 1911 - 2012 was developed using historic spatial inventory and disturbance data. Data were input to a spatially explicit version of the Carbon Budget Model-Canadian Forest Sector (CBM-CFS3), an inventory-based C budget model used to simulate forest C dynamics at multiple scales. In 1911 the watershed was dominated by mature/old Douglas-fir forests with aboveground biomass C (ABC) of 262 Mg C/ha and net ecosystem production (NEP) of 0.63 Mg C/ha/yr. Land was cleared around Sooke Lake, a dam was built, and the lake expanded from 370 to 450 ha in 1915, 610 ha in 1970, 670 ha in 1980 and 810 ha in 2002. Along with deforestation, fires and localized harvest occurred from 1920 - 1940, reducing ABC to 189 Mg C/ha, with NEP varying from -1.63 to 0.13 Mg C/ha/yr. Distributed harvest occurred 1954 - 1998, with a minimum ABC of 148 Mg C/ha in 1991. By 2012 ABC (177 Mg C/ha) and NEP (2.29 Mg C/ha/yr) had increased. Over 100 years, 2430 ha of forest were cut and replanted and 640 ha were deforested. CBM-CFS3 includes transfers of dissolved organic C (DOC) to aquatic systems; however, data have not been available to parameterize DOC flux. DOC fluxes are modelled as a fraction of decay loss from humified soil C, with a default of 100% of losses to CO2 and 0% to DOC. Stream flow and [DOC] data from 1996 - 2012 for 3 watershed catchments, Rithet, Judge and Council, were used to estimate annual DOC fluxes. Rithet, Judge and Council differed both in area % disturbed (logging or fire) over 100 years (39%, 93%, 91%) and in area % mature/old forest (>80 yrs in 2012) (67%, 56%, 21%). DOC flux for Rithet and Judge ranged from 0.037 - 0.057 Mg C/ha/yr; Council averaged 0.017 Mg C/ha/yr. The low DOC fluxes at Council were likely due to the influence of a small lake in the catchment. Constraining CBM-CFS3 to observed DOC fluxes required

  10. Quantification of uncertainties in the 100-year flow at an ungaged site near a gaged station and its application in Georgia

    Science.gov (United States)

    Cho, Huidae; Bones, Emma

    2016-08-01

    The Federal Emergency Management Agency has introduced the concept of the "1-percent plus" flow to incorporate various uncertainties in estimation of the 100-year or 1-percent flow. However, to the best of the authors' knowledge, no clear directions for calculating the 1-percent plus flow have been defined in the literature. Although information about standard errors of estimation and prediction is provided along with the regression equations that are often used to estimate the 1-percent flow at ungaged sites, uncertainty estimation becomes more complicated when there is a nearby gaged station because regression flows and the peak flow estimate from a gage analysis should be weighted to compute the weighted estimate of the 1-percent flow. In this study, an equation for calculating the 1-percent plus flow at an ungaged site near a gaged station is analytically derived. Also, a detailed process is introduced for calculating the 1-percent plus flow for an ungaged site near a gaged station in Georgia as an example and a case study is performed. This study provides engineers and practitioners with a method that helps them better assess flood risks and develop mitigation plans accordingly.
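    The weighting of the regression estimate with the gage-based estimate mentioned above can be sketched as follows. This is a minimal illustration of standard inverse-variance weighting in log space, not the 1-percent plus equation analytically derived in the paper; the function name and the assumption of independent log-space standard errors are illustrative.

```python
import math

def weighted_flow(q_gage, se_gage, q_reg, se_reg):
    """Combine a gage-based and a regression-based 100-year flow estimate
    by inverse-variance weighting in log space.

    se_gage, se_reg: standard errors of the log-flow estimates,
    assumed independent (an illustrative assumption).
    Returns the weighted flow and its log-space standard error."""
    v_g, v_r = se_gage ** 2, se_reg ** 2
    # each estimate is weighted by the inverse of its variance
    log_w = (math.log(q_gage) * v_r + math.log(q_reg) * v_g) / (v_g + v_r)
    v_w = v_g * v_r / (v_g + v_r)  # variance of the weighted log estimate
    return math.exp(log_w), math.sqrt(v_w)
```

    The weighted estimate always falls between the two inputs, and its standard error is smaller than either, which is the motivation for weighting whenever a gaged station is nearby.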

  11. Intraspecific variation in fine root respiration and morphology in response to in situ soil nitrogen fertility in a 100-year-old Chamaecyparis obtusa forest.

    Science.gov (United States)

    Makita, Naoki; Hirano, Yasuhiro; Sugimoto, Takanobu; Tanikawa, Toko; Ishii, Hiroaki

    2015-12-01

    Soil N fertility has an effect on belowground C allocation, but the physiological and morphological responses of individual fine root segments to variations in N availability under field conditions are still unclear. In this study, the direction and magnitude of the physiological and morphological function of fine roots in response to variable in situ soil N fertility in a forest site were determined. We measured the specific root respiration (Rr) rate, N concentration and morphology of fine root segments with 1-3 branching orders in a 100-year-old coniferous forest of Chamaecyparis obtusa. Higher soil N fertility induced higher Rr rates, root N concentration, and specific root length (SRL), and lower root tissue density (RTD). In all fertility levels, the Rr rates were significantly correlated positively with root N and SRL and negatively with RTD. The regression slopes of respiration with root N and RTD were significantly higher along the soil N fertility gradient. Although no differences in the slopes of Rr and SRL relationship were found across the levels, there were significant shifts in the intercept along the common slope. These results suggest that a contrasting pattern in intraspecific relationships between specific Rr and N, RTD, and SRL exists among soils with different N fertility. Consequently, substantial increases in soil N fertility would exert positive effects on organ-scale root performance by covarying the Rr, root N, and morphology for their potential nutrient and water uptake.

  12. Organochlorine pesticides (OCPs) in wetland soils under different land uses along a 100-year chronosequence of reclamation in a Chinese estuary

    Science.gov (United States)

    Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Wang, Junjing; Gao, Zhaoqin; Zhang, Guangliang

    2015-12-01

    Soil profiles were collected at a depth of 30 cm in ditch wetlands (DWs), riverine wetlands (RiWs) and reclaimed wetlands (ReWs) along a 100-year chronosequence of reclamation in the Pearl River Delta. In total, 16 OCPs were measured to investigate the effects of wetland reclamation and reclamation history on OCP levels. Our results showed that average ∑DDTs, HCB, MXC, and ∑OCPs were higher in surface soils of DWs compared to RiWs and ReWs. Both D30 and D20 soils contained the highest ∑OCP levels, followed by D40 and D100 soils; lower ∑OCP levels occurred in D10 soils. Higher ∑OCP levels were observed in the younger RiWs than in the older ones, and surface soils exhibited higher ∑OCP concentrations in the older ReWs compared with younger ReWs. The predominant percentages of γ-HCH in ∑HCHs (>42%) and aldrin in ∑DRINs (>46%) in most samples reflected the recent use of lindane and aldrin. The presence of dominant DDT isomers (p,p’-DDE and p,p’-DDD) indicated the historical input of DDT and significant aerobic degradation of the compound. Generally, DW soils had a higher ecotoxicological risk of OCPs than RiW and ReW soils, and the top 30 cm soils had higher ecotoxicological risks of HCHs than of DDTs.

  13. The beginnings of Orthopedic Surgery at the Mayo Clinic: A Review of the First Orthopedic Patients who Presented Over 100 Years Ago.

    Science.gov (United States)

    Camp, Christopher L; Morrey, Bernard F; Trousdale, Robert T

    2016-01-01

    Formalized training in the specialty of orthopedic surgery began at the Mayo Clinic nearly 100 years ago, and treatment of patients with musculoskeletal injuries and disease began even earlier. A robust historical patient database provides the opportunity to review the first recorded orthopedic cases at our institution, which date back to 1907. The first 400 sequential medical charts of the Mayo Clinic's patient record database were comprehensively reviewed in order to identify the first documented orthopedic cases. Of the first 400 patients reviewed, 15 (4%) received specific orthopedic diagnoses. All presented during a three-week period in 1907, and they traveled from all over the region for evaluation. The diagnoses included skeletal tuberculosis (n=6), traumatic fracture (n=3), osteomyelitis (n=2), syphilitic pathologic fracture (n=1), syphilitic ostitis of the tibia and radius (n=1), painful flat foot (n=1), and Morton's toe (n=1). Included with the records are patient demographics, diagnoses, symptoms, physical examination findings, radiograph reports, operative reports, and detailed drawings of symptomatology. Although the technology and science have advanced since the early practice of orthopedic surgery over a century ago, we consider ourselves to be merely an extension of those who established the field before us. Just as the past relies on the future for the continuation of what it began so many years ago, we rely on our founders for the groundwork that they laid in creating this field of surgical medicine.

  14. Assessment and remediation of a historical pipeline release : tools, techniques and technologies applied to in-situ/ex-situ soil and groundwater remediation

    Energy Technology Data Exchange (ETDEWEB)

    Reid, N. [EBA Engineering Consultants Ltd., Calgary, AB (Canada); Kohlsmith, B. [Kinder Morgan Canada Inc., Calgary, AB (Canada)

    2008-07-01

    Tools, techniques, and technologies applied to in-situ/ex-situ soil and groundwater remediation were presented as part of the assessment and remediation of a historical pipeline release. The presentation covered the initial assessment, remediation of hydrophobic soils, re-assessment, site-specific criteria, a remediation trial involving bioventing and chemical oxidation, and full-scale remediation. The pipeline release occurred in the summer of 1977. The event was followed by a complete surface remediation, with a significant amount of topsoil being removed and replaced. In 2004, a landowner complained of poor crop growth in four patches near the area of the historical spill. An initial assessment was undertaken and several photographs were presented. It was concluded that a comprehensive assessment set the base for a careful staged approach to the remediation of the site, including the establishment of site-specific criteria. The process was made possible by a high level of communication between all stakeholders, and the most appropriate solution for the site was realized. figs.

  15. Application of a stent splint to protect intraoral organs from radiation injury in a 97-year-old patient with multiple oral cancers who survived beyond 100 years of age

    Energy Technology Data Exchange (ETDEWEB)

    Yanagisawa, Shigetaka; Kawamura, Tetsuo; Shimizu, Masatsugu; Aoki, Hirooki; Mizuki, Harumi; Ashizawa, Akira (Oita Medical Coll., Hasama (Japan))

    1989-06-01

    Radiation therapy has been used with increasing frequency in recent years in the management of oral cancers in patients of advanced age. In such cases, good care must be taken to maintain the oral health of patients undergoing cancerocidal doses of radiation therapy. Using splints as tissue displacers during radiation, we were able to treat a 99-year-old female patient without serious radiation sequelae, and she survived beyond 100 years of age. When she first visited us at 97 years of age, the primary lesions located on the left upper lip, nose, and upper and lower gums were diagnosed histologically as multiple verrucous carcinoma. Seventeen months after the first radiotherapy to the lip, nose and upper jaw, we planned a second course of radiotherapy for the recurrent tumor of the lower gum. In order to eliminate or minimize side effects of the second irradiation on the contiguous intraoral organs, we devised a splint to keep the tongue and upper gum away from the radiation field. The splint, serving as a tissue displacer, was made of heat-cured acrylic resin and divided into two pieces shaped like full dentures without artificial teeth. They were applied to the upper and lower jaws; the lower one had a large wing to exclude the tongue from the irradiation field. After the splint was set, the patient clenched slightly with the aid of a chin cap. We were then able to complete the radiotherapy with 10 MV X-rays at 40 Gy as scheduled, without serious trouble. (author).

  16. Central control of information transmission through the intraspinal arborizations of sensory fibers examined 100 years after Ramón y Cajal.

    Science.gov (United States)

    Rudomin, Pablo

    2002-01-01

    About 100 years ago, Santiago Ramón y Cajal reported that sensory fibers entering the spinal cord have ascending and descending branches, and that each of them sends collaterals to the gray matter where they have profuse ramifications. To him this was a fundamental discovery, and he proposed that the intraspinal branches of the sensory fibers were "centripetal conductors by which sensory excitation is propagated to the various neurons in the gray matter". In addition, he assumed that "conduction of excitation within the intraspinal arborizations of the afferent fibers would be proportional to the diameters of the conductors", and that excitation would preferentially flow through the coarsest branches. The invariability of some elementary reflexes, such as the knee jerk, would be the result of a long history of plastic adaptations and natural selection of the safest neuronal organizations. There is now evidence suggesting that in the adult cat, the intraspinal branches of sensory fibers are not hard-wired routes that diverge excitation to spinal neurons in an invariable manner, but rather dynamic pathways where excitation flow can be centrally addressed to reach specific neuronal targets. This central control of information flow is achieved by means of specific sets of GABAergic interneurons that produce primary afferent depolarization (PAD) via axo-axonic synapses and reduce transmitter release (presynaptic inhibition). The PAD produced by single GABAergic interneurons, or by small groups of them, in group I muscle afferents can remain confined to some sets of intraspinal arborizations of the afferent fibers and not spread to nearby collaterals. In muscle spindle afferents this local character of PAD allows cutaneous and descending inputs to differentially inhibit the PAD in segmental and ascending collaterals of individual fibers, which may be an effective way to decouple the information flow arising from common sensory inputs.
This feature appears to play an important role

  17. From Rail-Oriented to Automobile-Oriented Urban Development and Back. 100 Years of Paradigm Change and Transport Policy in Berlin

    Directory of Open Access Journals (Sweden)

    Friedemann Kunst

    2016-10-01

    Transport and its side effects are major problems in rapidly growing cities. Car traffic dominates these cities and pollutes the environment without being able to sufficiently secure the mobility of the urban population and goods. A paradigm shift in urban and transport policy will be necessary to change this situation. In spite of its different development dynamics, Berlin is an interesting example for discussing development strategies for rapidly growing cities, because in the course of more than 100 years a twofold paradigm shift has occurred in the city both conceptually and practically: Berlin shifted from a city dominated by rail traffic to an automobile-oriented city, and has then gradually transformed back into a city in which an intertwined system of public and non-motorized individual means of transport secures the mobility of the urban population. The interdependencies on the conceptual level between urban planning and transport policies, as well as on a practical level between urban structures and transport systems, can be studied using the example of Berlin. Experiences with the implementation of automobile-oriented planning and the special conditions in the first decade after reunification led to protests, reflection, and a revision of the transport policy. A strategically designed process of integrated planning has brought about a trend reversal and steered the development of transport in the direction of clearly formulated sustainability-oriented objectives. In this process, the reintegration of transport and spatial planning and a reorganization of institutional structures at the administrative level were of particular importance. Compact, rail-oriented settlement structures like those in the metropolitan region of Berlin make it easier to dispense with automobiles than sprawled structures. The residual role that qualitatively improved automobiles will take in the cities of the future will have to be determined by research and

  18. Changes in stable isotopes, lignin-derived phenols, and fossil pigments in sediments of Lake Biwa, Japan: implications for anthropogenic effects over the last 100 years.

    Science.gov (United States)

    Hyodo, Fujio; Tsugeki, Narumi; Azuma, Jun-Ichi; Urabe, Jotaro; Nakanishi, Masami; Wada, Eitaro

    2008-09-15

    We measured stable nitrogen (N) and carbon (C) isotope ratios, lignin-derived phenols, and fossil pigments in sediments of known ages to elucidate the historical changes in the ecosystem status of Lake Biwa, Japan, over the last 100 years. Stable N isotope ratios and algal pigments in the sediments increased rapidly from the early 1960s to the 1980s, and then remained relatively constant, indicating that eutrophication occurred in the early 1960s but ceased in the 1980s. Stable C isotope ratios of the sediment increased from the 1960s, but decreased after the 1980s to the present. This decrease in stable C isotope ratios after the 1980s could not be explained by annual changes in either terrestrial input or algal production. However, when the C isotope ratios were corrected for the Suess effect (the shift toward more negative isotopic values in atmospheric CO2 caused by fossil fuel burning), the corrected values showed a trend consistent with the other biomarkers and the monitoring data. The trend was also mirrored by the relative abundance of lignin-derived phenols, a unique organic tracer of material originating from terrestrial plants, which decreased in the early 1960s and recovered to some degree in the 1980s. We detected no notable difference in the composition of lignin phenols, suggesting that the terrestrial plant composition did not change markedly. However, we found that the lignin accumulation rate increased around the 1980s. These results suggest that although eutrophication has stabilized since the 1980s, allochthonous organic matter input has changed in Lake Biwa over the past 25 years.
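    The Suess-effect correction described above amounts to removing the fossil-fuel-driven depletion of atmospheric CO2 δ13C from the measured sediment values. A minimal sketch; the atmospheric values below are illustrative placeholders, not the study's data:

```python
def suess_corrected(d13c_sample, d13c_atm_year, d13c_atm_ref):
    """Correct a measured delta13C value (permil) for the Suess effect by
    subtracting the change in atmospheric CO2 delta13C since a
    pre-industrial reference era."""
    return d13c_sample - (d13c_atm_year - d13c_atm_ref)

# Illustrative values only: atmosphere near -6.4 permil pre-industrially,
# near -7.8 permil by the 1990s.
corrected = suess_corrected(-25.0, -7.8, -6.4)  # -23.6 permil
```

    With the roughly 1.4 permil atmospheric decline removed, a sediment record that appears to become lighter after the 1980s can instead show the trend reported above.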

  19. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Rong [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); School of Nature Conservation, Beijing Forestry University, Beijing 100083 (China); Bai, Junhong, E-mail: junhongbai@163.com [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China)

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. - Highlights: • Heavy metals in ditch wetland accumulated with increasing reclamation history. • Heavy metals exist in the Fe–Mn oxides and residual fractions in both wetlands. • Cd posed a medium to high environmental risk while low risk for other metals. • Long reclamation history caused lower BCFs and TFs in DWs and higher levels in RWs. • RW soils contained more heavy metals in the carbonate
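    The risk assessment code (RAC) used above rates a metal by the share of its total concentration held in the most mobile (exchangeable plus carbonate-bound) fractions of the sequential extraction. A minimal sketch assuming the commonly cited RAC thresholds; concentrations may be in any consistent unit, and the function is illustrative rather than the authors' implementation:

```python
def rac_class(exchangeable, carbonate, total):
    """Risk Assessment Code: percent of the total metal held in the
    exchangeable and carbonate-bound fractions, mapped to a risk category
    using the commonly cited thresholds (<1, 1-10, 11-30, 31-50, >50 %)."""
    pct = 100.0 * (exchangeable + carbonate) / total
    if pct < 1.0:
        return pct, "no risk"
    if pct <= 10.0:
        return pct, "low risk"
    if pct <= 30.0:
        return pct, "medium risk"
    if pct <= 50.0:
        return pct, "high risk"
    return pct, "very high risk"
```

    For example, a Cd sample with 25% of its total in the mobile fractions would be rated a medium risk, consistent with the medium-to-high Cd risks reported above.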

  20. Upwelling and anthropogenic forcing on phytoplankton productivity and community structure changes in the Zhejiang coastal area over the last 100 years

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanshan; XING Lei; ZHANG Hailong; FENG Xuwen; YANG Haili; ZHAO Meixun

    2014-01-01

    Phytoplankton productivity and community structure in marginal seas have been altered significantly during the past three decades, but it is still a challenge to distinguish the forcing mechanisms between climate change and anthropogenic activities. High time-resolution biomarker records of two 210Pb-dated sediment cores (#34: 28.5°N, 122.272°E; CJ12-1269: 28.8619°N, 122.5153°E) from the Min-Zhe coastal mud area were compared to reveal changes of phytoplankton productivity and community structure over the past 100 years. Phytoplankton productivity started to increase gradually from the 1970s and increased rapidly after the late 1990s at Site #34; it started to increase gradually from the middle 1960s and increased rapidly after the late 1980s at Site CJ12-1269. Productivity of Core CJ12-1269 was higher than that of Core #34. Phytoplankton community structure variations displayed opposite patterns in the two cores. The decreasing D/B (dinosterol/brassicasterol) ratio of Core #34 since the 1960s revealed increased diatom contribution to total productivity. In contrast, the increasing D/B ratio of Core CJ12-1269 since the 1950s indicated increased dinoflagellate contribution to total productivity. Both the productivity increase and the increased dinoflagellate contribution in Core CJ12-1269 since the 1950-1960s were mainly caused by anthropogenic activities, as the location was closer to the Changjiang River Estuary, with higher nutrient concentrations and decreasing Si/N ratios. However, the increased diatom contribution in Core #34 is proposed to be caused by increased coastal upwelling, with higher nutrient concentrations and higher Si/N ratios.

  1. A double-blind, randomized study to assess the validity of applied kinesiology (AK) as a diagnostic tool and as a nonlocal proximity effect.

    Science.gov (United States)

    Schwartz, Stephan A; Utts, Jessica; Spottiswoode, S James P; Shade, Christopher W; Tully, Lisa; Morris, William F; Nachman, Ginette

    2014-01-01

    Klinkoski and Leboeuf (1990), which considered 50 papers published between 1981 and 1987 by the International College of Applied Kinesiology, and the survey by Hall, Lewith, Brien, and Little (2008), which used standard evaluation criteria for research methodology [the quality assessment tool for studies of diagnostic accuracy included in systematic reviews (QUADAS), Standards for Reporting of Diagnostic Studies (STARD), JADAD, and Consolidated Standards of Reporting Trials (CONSORT)], as well as six prior non-clinical studies by Radin (1984), Quintanar and Hill (1988), Braud (1989), Arnett et al. (1999), Ludtke (2001), and Kendler and Keating (2003), all together suggest the following: the research published by the Applied Kinesiology field itself is not to be relied upon, and in the experimental studies that do meet accepted standards of science, Applied Kinesiology has not demonstrated that it is a useful or reliable diagnostic tool upon which health decisions can be based.

  2. Indications of progressive desiccation of the Transvaal Lowveld over the past 100 years, and implications for the water stabilization programme in the Kruger National Park

    Directory of Open Access Journals (Sweden)

    U. De V. Pienaar

    1985-12-01

    All available rainfall statistics recorded for the Kruger National Park area since 1907, coupled with an analysis of all the historical climatological data on hand, appear to confirm the quasi-twenty-year oscillation in the precipitation pattern for the summer rainfall area, first pointed out by Tyson & Dyer (1975). The dendrochronological data obtained by Hall (1976) from a study of growth rings of a very old yellowwood tree (Podocarpus falcatus) in Natal also appear to indicate a superimposed, long-term (80-100 years) pattern of alternating below-average and above-average rainfall periods. The historical data relating to climate in the park during the past century or two seem to bear out such a pattern. If this can be confirmed, it will be an enormous aid not only in wildlife-management planning, but also to agriculturists, demographic planners and others. It would appear that the long, relatively dry rainfall period of 1860-1970, with its concomitant progressive desiccation of the area in question, has passed over into the next above-average rainfall era. This does not mean that there will be no further cataclysmic droughts during future rainfall trough periods. It is therefore wise to plan ahead to meet such contingencies. The present water distribution pattern in the park (natural plus artificial water) is conspicuously still well below that which pertained during dry seasons at the turn of the century, when the Sabi and Shingwedzi game reserves were proclaimed. It is the declared policy of the National Parks Board of Trustees to simulate natural regulating mechanisms as closely as possible. In consequence the artificial water-for-game program is a long way from completion.
The large numbers of game animals in the park (including dominant species such as elephant Loxodonta africana and buffalo Syncerus caffer) can no longer migrate out of the area to escape natural catastrophes (such as the crippling droughts of 1911-1917, the

  3. Uncertainty Analysis of Climate Warming During the Last 100 Years%近百年气候变暖的不确定性分析

    Institute of Scientific and Technical Information of China (English)

    赵宗慈; 王绍武; 罗勇; 江滢

    2009-01-01

    As mentioned in the IPCC report, the linear trend of the global annual mean surface air temperature during the last 100 years (1906 to 2005) is 0.74℃ (in the range of 0.56 to 0.92℃). The warming trend in China is 0.53~0.86℃ during the same period. But it should be emphasized that there are some uncertainties and gaps in the information about the recent climate warming, especially in some special regions such as China. The uncertainties of climate warming in China come both from the lack of observed data during the first half of the 20th century and from the urbanization process (heat island effects) during the second half of the 20th century, which might have contributed 25% of the total warming. The climate warming in China during the last 50 years is contributed not only by human activity, but also by urbanization, natural periodicity and decadal variability, dimming and brightening of the solar radiation, as well as other forcing factors such as solar activity, volcanic activity and interactions inside the climate system. The fractional uncertainties of future climate predictions and projections vary due to internal variability of the global climate system, climate model uncertainty and scenario uncertainty. The global climate models do not have adequately fine resolutions due to the lack of high-speed computers, leading to unrealistic simulations by the models, especially at regional scales. The parameterizations of the physical, chemical and biological processes in the global climate system are complicated, and the present climate models can hardly describe those processes, interactions and feedback mechanisms, such as cloud-aerosol-radiation feedbacks. The future climate changes will be caused by both natural and anthropogenic forcing factors. It is difficult to predict solar and volcanic activities over a long period, and future human activities are provided as scenarios, not real human emissions. Therefore, the reliability of climate

  4. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. The users have to choose the best suitable for their application. Here a simple rule applies: The best available simulation tool is the tool the user is already used to (provided, it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  5. Performance-driven design with the support of digital tools: Applying discrete event simulation and space syntax on the design of the emergency department

    Directory of Open Access Journals (Sweden)

    David Morgareidge

    2014-09-01

    This case study demonstrates that DES and SSA are effective tools for facilitating decision-making related to design, reducing capital and operational costs, and improving organizational performance. DES focuses on operational processes and care flow. SSA complements DES with its strength in linking space to human behavior. Combining both tools can lead to high-performance ED design and can extend to broad applications in health care.

  6. The Screening Tool of Feeding Problems Applied to Children (STEP-CHILD): Psychometric Characteristics and Associations with Child and Parent Variables

    Science.gov (United States)

    Seiverling, Laura; Hendy, Helen M.; Williams, Keith

    2011-01-01

    The present study evaluated the 23-item Screening Tool for Feeding Problems (STEP; Matson & Kuhn, 2001) with a sample of children referred to a hospital-based feeding clinic to examine the scale's psychometric characteristics and then demonstrate how a children's revision of the STEP, the STEP-CHILD is associated with child and parent variables.…

  7. 手持式电动工具用轴承性能浅析%Brief Performance Analysis on the Bearings Applied to Hand-held Motor-operated Electric Tools

    Institute of Scientific and Technical Information of China (English)

    张永恩; 方承志; 宋贵州; 李兴林

    2014-01-01

    Taking deep groove ball bearings as representative, and without considering tool assembly and other factors, this paper discusses, in combination with the relevant sections of the power tool safety standards, the main performance requirements of the bearings applied to hand-held motor-operated electric tools and their relationship with and effects on the safety standards. The content can also be referenced in technical design and quality appraisal.%以深沟球轴承为代表,在不考虑工具装配等因素的前提下,结合电动工具安全标准的相关章节,讨论并分析手持式电动工具用滚动轴承的主要性能和要求,可供设计选用和质量评价时参考。

  8. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    Science.gov (United States)

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to be connected in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking

  9. 100 years of occupational safety research: From basic protections and work analysis to a multilevel view of workplace safety and risk.

    Science.gov (United States)

    Hofmann, David A; Burke, Michael J; Zohar, Dov

    2017-03-01

    Starting with initiatives dating back to the mid-1800s, we provide a high-level review of the key trends and developments in the application of applied psychology to the field of occupational safety. Factory laws, basic worker compensation, and research on accident proneness comprised much of the early work. Thus, early research and practice very much focused on the individual worker, the design of their work, and their basic protection. Gradually and over time, the focus began to navigate further into the organizational context. One of the early efforts to broaden beyond the individual worker was a significant focus on safety-related training during the middle of the 20th century. Toward the latter years of the 20th century and continuing the move from the individual worker to the broader organizational context, there was a significant increase in leadership and organizational climate (safety climate) research. Ultimately, this resulted in the development of a multilevel model of safety culture/climate. After discussing these trends, we identify key conclusions and opportunities for future research. (PsycINFO Database Record

  10. LimTox: a web tool for applied text mining of adverse event and toxicity associations of compounds, drugs and genes.

    Science.gov (United States)

    Cañada, Andres; Capella-Gutierrez, Salvador; Rabal, Obdulia; Oyarzabal, Julen; Valencia, Alfonso; Krallinger, Martin

    2017-05-22

    A considerable effort has been devoted to retrieve systematically information for genes and proteins as well as relationships between them. Despite the importance of chemical compounds and drugs as a central bio-entity in pharmacological and biological research, only a limited number of freely available chemical text-mining/search engine technologies are currently accessible. Here we present LimTox (Literature Mining for Toxicology), a web-based online biomedical search tool with special focus on adverse hepatobiliary reactions. It integrates a range of text mining, named entity recognition and information extraction components. LimTox relies on machine-learning, rule-based, pattern-based and term lookup strategies. This system processes scientific abstracts, a set of full text articles and medical agency assessment reports. Although the main focus of LimTox is on adverse liver events, it enables also basic searches for other organ level toxicity associations (nephrotoxicity, cardiotoxicity, thyrotoxicity and phospholipidosis). This tool supports specialized search queries for: chemical compounds/drugs, genes (with additional emphasis on key enzymes in drug metabolism, namely P450 cytochromes-CYPs) and biochemical liver markers. The LimTox website is free and open to all users and there is no login requirement. LimTox can be accessed at: http://limtox.bioinfo.cnio.es. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    DEFF Research Database (Denmark)

    Soares, João; Valle, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the Smart Grid (SG) context to solve day-ahead energy resource scheduling considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application...... of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling, minimizing total operation costs from the aggregator point of view. A realistic mathematical formulation, considering the electric network...... constraints and V2G charging and discharging efficiencies, is presented. Full AC power flow calculation is included in the hybrid method to take the network constraints into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance...
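For readers unfamiliar with the particle swarm half of the hybrid, the following is a minimal, generic PSO sketch on an arbitrary cost function. It omits everything that makes the paper's method distinctive (the MILP step, network constraints, AC power flow), and all parameters are conventional textbook values, not the authors':

```python
import random

def pso_minimize(cost, dim, n_particles=30, iters=200, seed=7,
                 w=0.7, c1=1.5, c2=1.5, lo=-10.0, hi=10.0):
    """Plain particle swarm optimization: each particle is pulled toward
    its own best position (pbest) and the swarm's best (gbest)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Toy "operation cost": quadratic penalty around a hypothetical optimal dispatch.
best, best_cost = pso_minimize(lambda x: sum((xi - 3.0) ** 2 for xi in x), dim=4)
```

In the paper's setting the cost function would be the aggregator's total operation cost and the MILP stage would refine the swarm's solution; here the quadratic stand-in merely shows the update rule converging.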

  12. The ClearEarth Project: Preliminary Findings from Experiments in Applying the CLEARTK NLP Pipeline and Annotation Tools Developed for Biomedicine to the Earth Sciences

    Science.gov (United States)

    Duerr, R.; Thessen, A.; Jenkins, C. J.; Palmer, M.; Myers, S.; Ramdeen, S.

    2016-12-01

    The ability to quickly find, easily use and effortlessly integrate data from a variety of sources is a grand challenge in Earth sciences, one around which entire research programs have been built. A myriad of approaches to tackling components of this challenge have been demonstrated, often with some success. Yet finding, assessing, accessing, using and integrating data remains a major challenge for many researchers. A technology that has shown promise in nearly every aspect of the challenge is semantics. Semantics has been shown to improve data discovery, facilitate assessment of a data set, and through adoption of the W3C's Linked Data Platform to have improved data integration and use at least for data amenable to that paradigm. Yet the creation of semantic resources has been slow. Why? Amongst a plethora of other reasons, it is because semantic expertise is rare in the Earth and Space sciences; the creation of semantic resources for even a single discipline is labor intensive and requires agreement within the discipline; best practices, methods and tools for supporting the creation and maintenance of the resources generated are in flux; and the human and financial capital needed are rarely available in the Earth sciences. However, other fields, such as biomedicine, have made considerable progress in these areas. The NSF-funded ClearEarth project is adapting the methods and tools from these communities for the Earth sciences in the expectation that doing so will enhance progress and the rate at which the needed semantic resources are created. We discuss progress and results to date, lessons learned from this adaptation process, and describe our upcoming efforts to extend this knowledge to the next generation of Earth and data scientists.

  13. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    Science.gov (United States)

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom, with run times under 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf and the trailing shock from 1.2 to 0.87 psf, reductions of 41% and 28% respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that the lobe balance is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce

  14. [Leather bags production: organization study, general identification of hazards, biomechanical overload risk pre-evaluation using an easily applied evaluation tool].

    Science.gov (United States)

    Montomoli, Loretta; Coppola, Giuseppina; Sarrini, Daniela; Sartorelli, P

    2011-01-01

    Craft industries are the backbone of the Italian manufacturing system and in this sector the leather trade plays a crucial role. The aim of the study was to experiment with a risk pre-mapping data sheet in leather bag manufacture by analyzing the production cycle. The prevalence of biomechanical, organizational and physical factors was demonstrated in tanneries. With regard to chemical agents the lack of any priority of intervention could be due to the lack of information on the chemicals used. In the 2 enterprises that used mechanical processes the results showed different priorities for intervention and a different level of the extent of such intervention. In particular in the first enterprise biomechanical overload was a top priority, while in the second the results were very similar to those of the tannery. The analysis showed in both companies that there was a high prevalence of risk of upper limb biomechanical overload in leather bag manufacture. Chemical risk assessment was not shown as a priority because the list of chemicals used was neither complete nor sufficient. The risk pre-mapping data sheet allowed us to obtain a preliminary overview of all the major existing risks in the leather industry. Therefore the method can prove a useful tool for employers as it permits instant identification of priorities for intervention for the different risks.

  15. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    Science.gov (United States)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity for quantifying small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for determination of these drugs in a pharmaceutical dosage form.
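The analysis-of-mixtures idea reduces to solving a 2x2 linear system obtained from the Lambert-Beer law at the two wavelengths. A sketch, with made-up absorptivity values (the paper's calibration constants are not reproduced here):

```python
def two_component_conc(a259, a321, eps):
    """Solve the Lambert-Beer system for a binary mixture measured at
    259 nm and 321 nm (1 cm path assumed):
        A259 = e_bnz_259*c_bnz + e_itz_259*c_itz
        A321 = e_bnz_321*c_bnz + e_itz_321*c_itz
    `eps` holds the four absorptivities; the numbers below are
    invented for illustration, not the paper's calibration values."""
    (eb259, ei259), (eb321, ei321) = eps
    det = eb259 * ei321 - ei259 * eb321        # Cramer's rule, 2x2 system
    c_bnz = (a259 * ei321 - ei259 * a321) / det
    c_itz = (eb259 * a321 - a259 * eb321) / det
    return c_bnz, c_itz

eps = ((0.080, 0.010),   # hypothetical absorptivities at 259 nm
       (0.005, 0.060))   # ... and at 321 nm (mL ug^-1 cm^-1)

# Forward-simulate a mixture of 10 and 5 ug/mL, then recover it:
a259 = 0.080 * 10 + 0.010 * 5
a321 = 0.005 * 10 + 0.060 * 5
c_bnz, c_itz = two_component_conc(a259, a321, eps)
```

The method is well-conditioned only when the two spectra differ enough at the chosen wavelengths (the determinant is far from zero), which is why the authors pick one wavelength near each drug's absorbance maximum.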

  16. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations.

    Science.gov (United States)

    Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S

    2016-04-15

    The aim of this work was the development of an analytical procedure using spectrophotometry for simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity for quantifying small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for determination of these drugs in a pharmaceutical dosage form.

  17. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    Science.gov (United States)

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS), while the multivariate calibration methods are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.

  18. Ammonia synthesis catalyst 100 years: Practice, enlightenment and challenge

    Institute of Scientific and Technical Information of China (English)

    刘化章

    2014-01-01

    The ammonia synthesis catalyst invented by Haber and Bosch has reached its 100th anniversary. This review surveys the theoretical and practical development and achievements of ammonia synthesis catalysts and the insights they have yielded, and looks ahead to the catalyst's future and the new challenges it faces. Catalytic ammonia synthesis technology played a central role in the development of the chemical industry during the 20th century. Over the past 100 years, ammonia synthesis catalysts have passed through several stages of development, including Fe3O4-based fused iron catalysts, Fe1-xO-based fused iron catalysts and ruthenium-based catalysts, together with the discovery of the cobalt-molybdenum bimetallic nitride system. Practice shows that the ammonia synthesis catalyst has been the starting point and touchstone for much fundamental research in heterogeneous catalysis; no other reaction connects theory, model catalysts and experiment as ammonia synthesis does. The reaction remains an ideal model system for theoretical studies of heterogeneous catalysis, and understanding its mechanism and translating that understanding into mature technology has become a benchmark for progress in the field. This never-ending story is still not over. Beyond open questions about the elementary steps, the real structure and subnitrides, a new theoretical challenge for catalytic ammonia synthesis is the prediction of ammonia synthesis at room temperature and atmospheric pressure, including electrocatalytic and photocatalytic ammonia synthesis, chemical simulation of biological nitrogen fixation, and methods for activating the most stable small molecules in catalytic chemistry, dinitrogen among them.

  19. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Dina Aleksandrovna Kirillova

    2015-12-01

    The modern reform of Russian higher education involves implementing a competence-based approach, whose main idea is the practical orientation of education. Mathematics is a universal language for describing, modeling and studying phenomena and processes of different natures, so building a fund of assessment tools for mathematical disciplines around applied problems is a relevant task. The case method is an appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is to develop methodical materials for creating a fund of assessment tools based on case studies for the mathematical analysis course in the direction «Applied Mathematics and Computer Science». This aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the insufficient study of the theoretical foundations of applying this method to mathematical disciplines, including the lack of a theoretical basis for, and description of, the process of creating case problems for use in monitoring learning outcomes.

  20. Solar geometry tool applied to systems and bio-climatic architecture; Herramienta de geometria solar aplicada a sistemas y arquitectura bio-climatica

    Energy Technology Data Exchange (ETDEWEB)

    Urbano, Antonio; Matsumoto, Yasuhiro; Aguilar, Jaime; Asomoza Rene [CIMVESTAV-IPN, Mexico, D.F (Mexico)

    2000-07-01

    This article presents annual solar path charts as Cartesian graphs and explains their use, based on astronomical, geographical and site data. The charts indicate the hours of sunshine throughout the day, month and year for a latitude of 19 degrees north, together with hourly solar radiation values for the most important declinations of the year (the equinoxes, the solstices and the intermediate months). They help the user find the optimal location for solar equipment by evaluating obstacles in the surroundings and determining, on site, the shadows cast on solar equipment or buildings (by mountains, trees, buildings, windows, terraces, domes, etc.), the hours of sunshine, and the radiation required for the desired bio-climatic calculation. The work is a site-engineering tool for architects, designers, builders, planners, installers and energy auditors, among others, who require solar energy for any of its many applications.
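The quantities behind such solar path charts can be approximated with standard formulas. The sketch below uses Cooper's declination approximation and the sunset hour angle relation for the article's latitude of 19 degrees north; it is a generic textbook calculation, not the authors' tool:

```python
import math

def declination_deg(day_of_year):
    """Cooper's approximation for solar declination (degrees)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def solar_noon_elevation(lat_deg, day_of_year):
    """Sun elevation above the horizon at solar noon (degrees)."""
    return 90.0 - abs(lat_deg - declination_deg(day_of_year))

def day_length_hours(lat_deg, day_of_year):
    """Daylight hours from the sunset hour angle: cos(ws) = -tan(lat)*tan(decl);
    the hour angle sweeps 15 degrees per hour."""
    lat = math.radians(lat_deg)
    dec = math.radians(declination_deg(day_of_year))
    ws = math.degrees(math.acos(-math.tan(lat) * math.tan(dec)))
    return 2.0 * ws / 15.0

# Latitude 19 N (as in the article): June solstice (~day 172) vs December (~day 355)
june_elev = solar_noon_elevation(19.0, 172)
june_len = day_length_hours(19.0, 172)
dec_len = day_length_hours(19.0, 355)
```

Evaluating these over all hours and days of the year is exactly the information the article's Cartesian charts encode graphically.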

  1. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial share of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; however, without taking them into account the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, a method and tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete-event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)

  2. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Science.gov (United States)

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available.
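The core trace-back idea (which upstream stations could have supplied all affected outlets) can be illustrated with a few lines of graph code. This is a toy reachability intersection on an invented supply chain, not FoodChain-Lab's actual scoring algorithm, which also weighs cross-contamination, geography and network topology:

```python
def trace_back(deliveries, affected_outlets):
    """Given directed (supplier, receiver) deliveries, return the set of
    upstream stations that can reach EVERY affected outlet -- candidate
    common sources in a trace-back investigation."""
    # Reverse adjacency: receiver -> set of direct suppliers
    suppliers = {}
    for src, dst in deliveries:
        suppliers.setdefault(dst, set()).add(src)

    def upstream(node):
        seen, stack = set(), [node]
        while stack:
            for s in suppliers.get(stack.pop(), ()):
                if s not in seen:
                    seen.add(s)
                    stack.append(s)
        return seen

    common = None
    for outlet in affected_outlets:
        u = upstream(outlet)
        common = u if common is None else common & u
    return common or set()

# Hypothetical supply chain: producer P ships via distributors D1/D2 to shops;
# X is an unrelated supplier of D1.
deliveries = [("P", "D1"), ("P", "D2"), ("X", "D1"),
              ("D1", "shop1"), ("D2", "shop2"), ("D1", "shop3")]
candidates = trace_back(deliveries, ["shop1", "shop2"])  # only P reaches both
```

Intersecting upstream sets across outbreak locations is what lets multi-area investigations discriminate between implicated products when case data alone cannot.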

  3. Copper nanoparticles applied to the preconcentration and electrochemical determination of β-adrenergic agonist: an efficient tool for the control of meat production.

    Science.gov (United States)

    Regiart, Matías; Escudero, Luis A; Aranda, Pedro; Martinez, Noelia A; Bertolino, Franco A; Raba, Julio

    2015-04-01

    A novel method for preconcentration and electrochemical detection of zinterol in bovine urine samples was developed. In order to improve the limit of detection, the surface of a screen-printed carbon electrode was modified with electrodeposited copper metal nanoparticles. The electrodeposition was optimized experimentally using a central composite design (CCD) involving the variables precursor concentration, applied potential and time. The copper nanoparticles were characterized by transmission electron microscopy, scanning electron microscopy, cyclic voltammetry, and energy-dispersive X-ray spectroscopy. Mesoporous shuttle-like copper oxide nanoparticles were used in the preconcentration step to avoid interference from the many compounds present in the sample matrix. The optimal working conditions for the preconcentration approach were found by means of both two-level fractional factorial and CCD designs. The enhancement factor obtained for a sample volume of 30 mL was 35-fold. The calibration curve was linear between 0.5 and 45 ng mL(-1) and the limit of detection was 0.16 ng mL(-1). The intra- and inter-assay coefficients of variability were below 4% and 5%, respectively. Copyright © 2014 Elsevier B.V. All rights reserved.
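Limits of detection like the one reported here are commonly estimated from the calibration line (LOD = 3.3 × residual standard deviation / slope, an ICH-style estimate). Below is a generic pure-Python sketch with invented calibration data, not the study's measurements:

```python
def linear_fit(x, y):
    """Ordinary least squares fit y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def limit_of_detection(x, y):
    """LOD = 3.3 * s_residual / slope, where s_residual is the standard
    deviation of the calibration residuals (n - 2 degrees of freedom)."""
    a, b = linear_fit(x, y)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s = (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5
    return 3.3 * s / b

# Hypothetical calibration points (concentration in ng/mL vs response)
x = [0.5, 5, 15, 25, 35, 45]
y = [0.41, 3.9, 12.2, 19.8, 28.5, 36.1]
lod = limit_of_detection(x, y)
```

The same fit yields the linear range check and quantification limit (with 10 instead of 3.3) used when validating methods of this kind.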

  4. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Directory of Open Access Journals (Sweden)

    Armin A Weiser

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available.

  5. Applying Genetic Algorithms and RIA technologies to the development of Complex-VRP Tools in real-world distribution of petroleum products

    Directory of Open Access Journals (Sweden)

    Antonio Moratilla Ocaña

    2014-12-01

    Distribution problems have generated a large body of research and development covering the VRP and its many variants, but few investigations examine it as an information system, and far fewer consider how it should be addressed from a development and implementation point of view. This paper describes the characteristics of a real information system for fuel-distribution problems at country scale, joining VRP research and development based on genetic algorithms with the design of a web-based information system. A view of the traditional workflow in this area is given, together with the new approach on which the proposed system is based. Taking into account all the constraints in the field, the authors developed a web-based VRP solution using genetic algorithms and multiple web frameworks for each architecture layer, focusing on functionality and usability in order to minimize human error and maximize productivity. To achieve these goals, the authors used SmartGWT as a powerful web-based RIA single-page-application framework with Java integration, together with several server frameworks and OSS-based solutions, applied to the development of a very complex VRP system for a logistics operator of petroleum products.
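To make the genetic-algorithm component concrete, here is a deliberately tiny single-vehicle routing GA (elitist selection plus swap mutation only; production VRP solvers add crossover, capacity and time-window constraints). The distance matrix is hypothetical:

```python
import random

def route_cost(route, dist):
    """Total distance of a route given as an ordered list of node indices."""
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def ga_route(dist, start=0, pop_size=40, gens=150, seed=3):
    """Minimal elitist GA: keep the best half of the population each
    generation and refill it with swap-mutated copies of the survivors."""
    rng = random.Random(seed)
    stops = [i for i in range(len(dist)) if i != start]
    pop = [rng.sample(stops, len(stops)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: route_cost([start] + r, dist))
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)   # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda r: route_cost([start] + r, dist))
    return [start] + best

# Hypothetical symmetric distances between a depot (node 0) and four stops.
dist = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]
route = ga_route(dist)
```

The encoding (a permutation of stops per chromosome) and the fitness function are the two pieces a real multi-vehicle, multi-constraint system would generalize.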

  6. 100 years of Weyl’s law

    Directory of Open Access Journals (Sweden)

    Victor Ivrii

    2016-08-01

    We discuss the asymptotics of the eigenvalue counting function for partial differential operators and related expressions, paying the most attention to sharp asymptotics. We consider Weyl asymptotics, asymptotics with Weyl principal parts and correction terms, and asymptotics with non-Weyl principal parts. Semiclassical microlocal analysis, propagation of singularities and related dynamics play a crucial role. We start from the general theory, then consider Schrödinger and Dirac operators with a strong magnetic field and, finally, applications to the asymptotics of the ground-state energy of heavy atoms and molecules with or without a magnetic field.

  7. Dairy innovations over the past 100 Years.

    Science.gov (United States)

    Tunick, Michael H

    2009-09-23

    The dairy industry in the United States has undergone many changes over the past century. Adulteration and contamination of milk were rampant before the passage and enforcement of the Pure Food and Drug Act of 1906, and the introduction and eventual acceptance of certified and pasteurized milk have provided consumers with a consistently safe product. Homogenization and advances in the packaging and transport of milk gradually took hold, improving the milk supply. Other developments included the concentration of milk and whey, lactose-reduced milk, and the popularization of yogurt. Consumers have benefited from advances in butter packaging, low-fat ice cream, cheese manufacture, and yogurt technology, which has helped create the large demand for dairy products in the United States. Current trends and issues, including the increasing popularity of organic and artisanal products and the use of rBST, will shape the future of the dairy industry.

  8. Appraising Schumpeter's "Essence" after 100 years

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    by presenting it as only requiring the introduction of innovative entrepreneurs into the set-up of the Walrasian System. Actually, he could easily define the function of his type of entrepreneurs in this manner, but the analysis of the overall process of evolution required a radical reinterpretation...... of the system of general economic equilibrium. He thus made clear that he could not accept the standard interpretation of the quick Walrasian process of adaptation (tâtonnement). Instead, he saw the innovative transformation of routine behaviour as a relatively slow and conflict-ridden process...

  9. Spinoff 2003: 100 Years of Powered Flight

    Science.gov (United States)

    2003-01-01

    Today, NASA continues to reach milestones in space exploration with the Hubble Telescope, Earth-observing systems, the Space Shuttle, the Stardust spacecraft, the Chandra X-Ray Observatory, the International Space Station, the Mars rovers, and experimental research aircraft: these are only a few of the many initiatives that have grown out of NASA engineering know-how to drive the Agency's missions. The technical expertise gained from these programs has transferred into partnerships with academia, industry, and other Federal agencies, ensuring America stays capable and competitive. With Spinoff 2003, we once again highlight the many partnerships with U.S. companies that are fulfilling the 1958 Space Act stipulation that NASA's vast body of scientific and technical knowledge also benefit mankind. This year's issue showcases innovations such as the cochlear implant in health and medicine, a cockpit weather system in transportation, and a smoke mask benefiting public safety; many other products are featured in these disciplines, as well as in the additional fields of consumer/home/recreation, environment and resources management, computer technology, and industrial productivity/manufacturing technology. Also in this issue, we devote an entire section to NASA's history in the field of flight and showcase NASA's newest enterprise, dedicated to education. The Education Enterprise will provide unique teaching and learning experiences for students and teachers at all levels in science, technology, engineering, and mathematics. The Agency also is committed, as never before, to engaging parents and families through NASA's educational resources, content, and opportunities. NASA's catalyst to intensify its focus on teaching and learning springs from our mission statement: to inspire the next generation of explorers as only NASA can.

  10. 100-Year UPS, "Swifter" Olympics

    Institute of Scientific and Technical Information of China (English)

    Guo Yan; Yang Wei

    2007-01-01

    As the official logistics and express delivery sponsor of the 2008 Beijing Olympics, UPS will manage all logistical operations at the Olympic Test Events (formally known as the "Good Luck Beijing" events) and the actual Games, through which the majority of equipment used at the events will flow.

  11. 100 years of international arbitration and adjudication

    National Research Council Canada - National Science Library

    KEITH, Kenneth

    2014-01-01

    ...) - peaceful means of settlement of international disputes - processes for clarification and development of international law - Hague Academy on International Law - arbitration and adjudication within...

  12. After 100 years, is coevolution relevant?

    Science.gov (United States)

    Geral I. McDonald

    2011-01-01

    On the 100th anniversary of the introduction of Cronartium ribicola into western North America, it is fitting to assess the philosophical foundation of plant pathology and forest ecology. We should ask whether this foundation provides sufficient understanding of blister rust, other diseases of North American forests, and general forest ecology to ensure the application...

  13. [100 years of the Babinski sign].

    Science.gov (United States)

    Estañol Vidal, B; Huerta Díaz, E; García Ramos, G

    1997-01-01

    In 1896 Joseph Francois Felix Babinski described for the first time the phenomenon of the toes. In his first paper he simply described extension of all toes with noxious stimulation of the sole of the foot. It was not until 1898 that he specifically described the extension of the hallux with stimulation of the lateral border of the sole. Babinski was probably not aware at the time that E. Remak, a German physician, had previously described the sign. In his third paper of 1903 Babinski concludes that if other authors had described the abnormal reflex before him, they found it fortuitously and did not realize its semiologic value. Babinski probably discovered it by a combination of chance, careful observation and intuition. He also had in mind practical applications of the sign particularly in the differential diagnosis with hysteria and in medico-legal areas. Several of his observations and the physiopathological mechanism proposed by him are still valid today. He realized since 1896 that the Babinski reflex was part of the flexor reflex synergy. He observed that several patients during the first hours of an acute cerebral or spinal insult had absent extensor reflexes. He realized that most patients with the abnormal reflex had weakness of the toes and ankles. He found a lack of correlation between hyperactive myotatic reflexes and the presence of an upgoing hallux. He discovered that not all patients with hemiplegia or paraplegia had the sign. He thought erroneously that some normal subjects could have an upgoing toe. His dream of a practical application of the sign has been fully achieved. The motto of Babinski was Observatio summa lex. Perhaps there is no better dictum in clinical neurology.

  14. Appraising Schumpeter's "Essence" after 100 years

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    by presenting it as only requiring the introduction of innovative entrepreneurs into the set-up of the Walrasian System. Actually, he could easily define the function of his type of entrepreneurs in this manner, but the analysis of the overall process of evolution required a radical reinterpretation...

  15. Analysis of 100 Years of Curriculum Designs

    Science.gov (United States)

    Kelting-Gibson, Lynn

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational…

  16. Lurpak: Ready for another 100 years?

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    2001-01-01

    The Lur mark - the forerunner and very foundation of Lurpak butter - celebrates its 100th anniversary this year. That is an unusual and impressive lifetime for a consumer goods brand and something the Danish dairy sector can be proud of.

  17. The 100 year DASCH Transient Search

    Science.gov (United States)

    Miller, George F.; Grindlay, J. E.; Tang, S.; Los, E.

    2014-01-01

    The Digital Access to a Sky Century at Harvard (DASCH) project is currently digitizing the roughly 500,000 photographic plates maintained by the Harvard College Observatory. The Harvard plate collection covers each point of the sky roughly 500 to 3000 times from 1885 to 1992, with limiting magnitudes ranging from B=14-18 mag and photometric accuracy within ±0.1 mag. Production scanning (up to 400 plates/day) is proceeding in Galactic coordinates from the North Galactic Pole and is currently at roughly 50 degrees galactic latitude. The vastness of these data makes the DASCH project ideal to search for transient behavior. In particular, the large time base of the DASCH collection gives an unprecedented advantage when searching for outbursting systems with recurrence rates of decades or longer. These include recurrent novae, rare WZ Sge Cataclysmic Variables, blazars, X-Ray binaries, and supernovae in the Virgo Supercluster. We report here the discovery of previously unidentified stellar-like objects that underwent abnormally large (Δm=5-9) outbursts discovered with DASCH. We also report the discovery of outbursts from previously quiet AM CVn stars, as well as attempt to characterize their recurrence rates.
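
    The outburst amplitudes quoted above translate directly into flux ratios through the standard magnitude relation. As a quick sanity check (a generic astronomy formula, not part of the DASCH pipeline), Δm = 5-9 corresponds to brightness increases of roughly 100x to 4000x:

```python
def flux_ratio(delta_mag):
    """Brightness ratio implied by a magnitude change, from the
    definition m1 - m2 = -2.5 * log10(f1 / f2)."""
    return 10 ** (0.4 * delta_mag)

# Delta-m = 5 to 9 mag outbursts correspond to flux increases of
# roughly 100x to 4000x.
low, high = flux_ratio(5), flux_ratio(9)
```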

  18. 100 years of the physics of diodes

    Science.gov (United States)

    Zhang, Peng; Valfells, Ágúst; Ang, L. K.; Luginsland, J. W.; Lau, Y. Y.

    2017-03-01

    The Child-Langmuir Law (CL), discovered a century ago, gives the maximum current that can be transported across a planar diode in the steady state. As a quintessential example of the impact of space charge shielding near a charged surface, it is central to the studies of high current diodes, such as high power microwave sources, vacuum microelectronics, electron and ion sources, and high current drivers used in high energy density physics experiments. CL remains a touchstone of fundamental sheath physics, including contemporary studies of nanoscale quantum diodes and nano gap based plasmonic devices. Its solid state analog is the Mott-Gurney law, governing the maximum charge injection in solids, such as organic materials and other dielectrics, which is important to energy devices, such as solar cells and light emitting diodes. This paper reviews the important advances in the physics of diodes since the discovery of CL, including virtual cathode formation and extension of CL to multiple dimensions, to the quantum regime, and to ultrafast processes. We review the influence of magnetic fields, multiple species in bipolar flow, electromagnetic and time dependent effects in both short pulse and high frequency THz limits, and single electron regimes. Transitions from various emission mechanisms (thermionic-, field-, and photoemission) to the space charge limited state (CL) will be addressed, especially highlighting the important simulation and experimental developments in selected contemporary areas of study. We stress the fundamental physical links between the physics of beams to limiting currents in other areas, such as low temperature plasmas, laser plasmas, and space propulsion.
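
    The Child-Langmuir law itself is compact enough to evaluate directly. The sketch below (a minimal illustration, not code from the paper) computes the space-charge-limited current density of an ideal planar vacuum diode, J = (4ε₀/9)·√(2e/m)·V^(3/2)/d²:

```python
import math

EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19   # elementary charge, C
M_E = 9.1093837015e-31       # electron mass, kg

def child_langmuir(voltage_v, gap_m):
    """Space-charge-limited current density (A/m^2) of a planar vacuum
    diode: J = (4*eps0/9) * sqrt(2e/m) * V**1.5 / d**2."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage_v ** 1.5 / gap_m ** 2

# 1 kV across a 1 mm gap gives roughly 7.4e4 A/m^2 (about 7.4 A/cm^2)
j = child_langmuir(1e3, 1e-3)
```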

  19. Leadership: reflections over the past 100 years.

    Science.gov (United States)

    Gregoire, Mary B; Arendt, Susan W

    2014-05-01

    Leadership, viewed by the American Dietetic Association as the ability to inspire and guide others toward building and achieving a shared vision, is a much written-about topic. Research on leadership has addressed the topic using many different approaches, from a very simplistic definition of traits to a more complex process involving interactions, emotions, and learning. Thousands of books and papers have been published on the topic of leadership. This review paper will provide examples of the varying foci of the writings on this topic and includes references for instruments used to measure leadership traits and behaviors. Research is needed to determine effective strategies for preparing dietitians to be effective leaders and assume leadership positions. Identifying ways to help dietitians better reflect on their leadership experiences to enhance their learning and leadership might be one strategy to explore.

  20. Mendelism in human genetics: 100 years on.

    Science.gov (United States)

    Majumdar, Sisir K

    2003-01-01

    Genetics (from the Greek 'genes' = born) is a science without an objective past, but the genre of genetics has roamed the corridors of the human psyche since antiquity. Accounts of heritable deformities in humans often appear in myths and legends. The ancient Hindu caste system was based on the assumption that both desirable and undesirable traits are passed from generation to generation. In Babylonia, 60 birth defects were listed on clay tablets written around 5,000 years ago. The Jewish Talmud contains an accurate description of the inheritance of haemophilia, a human genetic disorder. The Upanisads (Vedanta, 800-200 BC) provide instructions for the choice of a wife, emphasizing that no heritable illness should be present and that the family should show evidence of good character for several preceding generations. These examples, indicating that heritable human traits played a significant role in social customs, are presented in this article.

  1. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  2. Structural engineering developments in power plant cooling tower construction. 100 years of natural draught cooling towers - from tower cooler to cooling tower

    Energy Technology Data Exchange (ETDEWEB)

    Damjakob, H.; Depe, T.; Vrangos, V. (Balcke-Duerr AG, Ratingen (Germany))

    1992-06-01

    Almost exactly 100 years ago, tower-type structures were first used to produce artificial draught for cooling purposes. The shell of these so-called tower coolers, today known as 'natural draught cooling towers', was from the outset the subject of multiple structural engineering developments in respect of design, material, construction and static calculation. These developments have been stimulated especially by the sudden increases in dimensions in power plant cooling tower applications and, more recently, by ecological requirements. (orig.).

  3. The extreme dry/wet events in northern China during the recent 100 years

    Institute of Scientific and Technical Information of China (English)

    马柱国; 丹利; 胡跃文

    2004-01-01

    Using monthly precipitation and monthly mean temperature, a surface humid index was proposed. Based on this index, the distribution characteristics of extreme dryness were analyzed. The results indicate an obvious increasing trend of extreme dryness in the central part of northern China and in northeastern China over the last 10 years, a high-frequency period of extreme dryness, whereas the preceding decades of the last 100 years were a low-frequency period in these regions. Compared with the temperature trend, the region of frequent extreme dryness is consistent with the warming trend in the same region.
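
    The abstract does not give the exact form of the index. One simple construction in this spirit (an illustrative assumption, not the authors' formula) expresses the relative moisture surplus or deficit from precipitation P and potential evapotranspiration PET, then flags unusually dry values:

```python
import statistics

def surface_humid_index(precip_mm, pet_mm):
    """Relative moisture surplus (>0) or deficit (<0) for a period,
    from precipitation P and potential evapotranspiration PET."""
    return (precip_mm - pet_mm) / pet_mm

def extreme_dry_flags(index_series, k=1.5):
    """Flag index values more than k standard deviations below the
    series mean as candidate 'extreme dryness' events."""
    mean = statistics.fmean(index_series)
    sd = statistics.stdev(index_series)
    return [h < mean - k * sd for h in index_series]

# A wet-ish toy record with one severe drought year
annual_index = [0.1] * 9 + [-1.0]
flags = extreme_dry_flags(annual_index)  # only the last year is flagged
```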

  4. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  5. Predicting pathogen transport and risk of infection from land-applied biosolids

    Science.gov (United States)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. 
For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the
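
    Risk characterization in QMRA tools of this kind commonly ends with a dose-response step. A minimal sketch (the standard exponential model, with purely illustrative parameter values, not numbers from SMART Biosolids) is:

```python
import math

def exponential_dose_response(dose, r):
    """Per-event probability of infection under the exponential
    dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk, events_per_year):
    """Combine independent exposure events into an annual risk."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Illustrative numbers only: 10 organisms ingested per event,
# r = 0.0042 (hypothetical pathogen parameter), 12 events per year
p_event = exponential_dose_response(dose=10, r=0.0042)
p_year = annual_risk(p_event, events_per_year=12)
```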

  6. 100 years of power plant technology - 100 years of material technology

    Energy Technology Data Exchange (ETDEWEB)

    Schoch, W.

    1983-07-01

    The introduction of the steam turbine demanded an abrupt further development of steam boiler technology and the design of new boilers. The difficulties which occurred in this process due to the lack of suitable steels are outlined in this paper; design and manufacture were nevertheless still not resolved satisfactorily. With the founding of the VGB, the operators endeavoured to find solutions. Further developments up to the technical maturity of present-day conventional and nuclear power plant technology are described.

  7. A summary review of the photos of the ethnic groups in Yunnan in the past 100 years

    Institute of Scientific and Technical Information of China (English)

    尹绍亭

    2015-01-01

    This paper gives a comprehensive review of the photos of the ethnic groups in Yunnan taken by Chinese and international ethnologists and anthropologists in the past 100 years. These photos are divided into three historical periods according to their characteristics. The paper discusses the theoretical orientations and cultural implications of the photos of each period, as well as the value, significance and weaknesses of these photos as precious data of visual anthropology.

  8. 1,3:2,4-Dibenzylidene-D-sorbitol (DBS) and its derivatives--efficient, versatile and industrially-relevant low-molecular-weight gelators with over 100 years of history and a bright future.

    Science.gov (United States)

    Okesola, Babatunde O; Vieira, Vânia M P; Cornwell, Daniel J; Whitelaw, Nicole K; Smith, David K

    2015-06-28

    Dibenzylidene-D-sorbitol (DBS) has been a well-known low-molecular-weight gelator of organic solvents for over 100 years. As such, it constitutes a very early example of a supramolecular gel--a research field which has recently developed into one of intense interest. The ability of DBS to self-assemble into sample-spanning networks in numerous solvents is predicated upon its 'butterfly-like' structure, whereby the benzylidene groups constitute the 'wings' and the sorbitol backbone the 'body'--the two parts representing the molecular recognition motifs underpinning its gelation mechanism, with the nature of solvent playing a key role in controlling the precise assembly mode. This gelator has found widespread applications in areas as diverse as personal care products and polymer nucleation/clarification, and has considerable potential in applications such as dental composites, energy technology and liquid crystalline materials. Some derivatives of DBS have also been reported which offer the potential to expand the scope and range of applications of this family of gelators and endow the nanoscale network with additional functionality. This review aims to explain current trends in DBS research, and provide insight into how, by combining a long history of application with modern methods of derivatisation and analysis, the future for this family of gelators is bright, with an increasing number of high-tech applications, from environmental remediation to tissue engineering, being within reach.

  9. Bank erosion history of a mountain stream determined by means of anatomical changes in exposed tree roots over the last 100 years (Bílá Opava River — Czech Republic)

    Science.gov (United States)

    Malik, Ireneusz; Matyja, Marcin

    2008-06-01

    The date of exposure of spruce roots as a result of bank erosion was investigated on the Bílá Opava River in the northeastern Czech Republic. Following the exposure of roots, wood cells in the tree rings divide into early wood and late wood. Root cells within the tree rings also become smaller and more numerous. These processes permit dating of the erosion episodes in which roots were exposed. Sixty root samples were taken from seven sampling sites selected on two riverbed reaches. The results of root exposure dating were compared to historical data on hydrological flooding. Using the root exposure dating method, several erosion episodes were recorded for the last 100 years. The greatest bank erosion was recorded as consequence of an extraordinary flood in July 1997. In the upper, rocky part of the valley studied, bank erosion often took place during large floods that occurred in the early 20th century. In the lower, alluvial part of the valley, erosion in the exposed roots was recorded only in 1973 and has been intensive ever since. It is suggested that banks in the lower part are more frequently undercut, which leads to the falling of trees within whose roots older erosion episodes were recorded. Locally, bank erosion is often intensified by the position of 1- to 2-m boulders in the riverbed, which direct water into the parts of the banks where erosion occurs. Selective bank erosion could be intensified by debris dams and hillslope material supply to the riverbed.

  10. Review and Preview on the Development of the World Women's 100 m Running in Nearly 100 Years

    Institute of Scientific and Technical Information of China (English)

    王刚; 辛飞庆; 辛飞兵

    2014-01-01

    This paper aims to understand and grasp the changes in the world women's 100 m running over nearly 100 years, and explores its current and future development trends using the methods of literature review, mathematical statistics and comparison. Results show that the Olympic Games women's record has been repeatedly refreshed, the United States' sprinters still lead the world, black sprinters hold the majority of Olympic women's 100 m titles, and performance has approached a limit determined by female physiological structure.

  11. Applied superconductivity

    CERN Document Server

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospec

  12. Correlation Analysis Between El Nino/La Nina Phenomenon During the Recent 100 Years and Beijing Climate

    Institute of Scientific and Technical Information of China (English)

    刘桂莲; 张明庆

    2001-01-01

    Results of the analysis suggest that during the recent 100 years there exists a strong correlation between the El Nino/La Nina phenomenon and Beijing's rainfall in summer (June-August), mean monthly maximum temperature (July) and mean monthly minimum temperature in winter (January). The El Nino phenomenon shows a negative correlation with the summer rainfall and the winter mean monthly minimum temperature, and a positive correlation with the summer mean monthly maximum temperature, producing reduced rainfall, a larger annual temperature range and a more continental climate. The La Nina phenomenon shows a positive correlation with the summer rainfall and the winter mean monthly minimum temperature, and a negative correlation with the summer mean monthly maximum temperature, bringing increased rainfall, a smaller annual temperature range and a less continental climate.
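
    The correlations reported above are presumably ordinary product-moment correlations between an ENSO index and Beijing climate series. A self-contained sketch of that calculation (with made-up toy numbers, not the study's data) is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up toy numbers: a hypothetical ENSO index against summer
# rainfall anomalies (mm), strongly negative by construction
enso = [1.2, 0.8, -0.5, -1.1, 0.3, 1.5, -0.9, -1.4]
rain = [-30, -18, 12, 25, -5, -40, 20, 33]
r = pearson_r(enso, rain)  # close to -1 for this toy data
```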

  13. The Causal Chain Analysis of Natural Factors for China Surface Temperature Variation during the Recent 100 Years

    Institute of Scientific and Technical Information of China (English)

    朱玉祥; 赵亮

    2014-01-01

    The attribution analysis of China surface temperature (TC) over the recent 100 years is carried out from the angle of an astronomical factor (sunspot number, SSN) and earth-movement factors (polar shift x direction and y direction) using the Granger causality test. The results are as follows: (1) SSN is not the Granger cause of TC at any lag from 1 to 11 years; (2) at a lag of 6 years, confidence is highest, and the polar shift x direction is the Granger cause of TC (87% confidence); (3) at a lag of 12 years, TC is the Granger cause of the polar shift y direction (86% confidence). The results suggest that changes in the polar shift x direction may lead to changes in China surface temperature about six years later, and that changes in TC may in turn influence the polar shift y direction.
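
    A Granger test of this kind compares a restricted autoregression of the target series on its own lags with an unrestricted one that adds lagged values of the candidate cause, via an F-statistic on the residual sums of squares. A compact pure-Python sketch (toy data and one lag only; the study used lags of 1-12 years) is:

```python
def ols_rss(X, y):
    """Residual sum of squares from ordinary least squares, solving the
    normal equations (X'X) beta = X'y by Gaussian elimination."""
    n, k = len(y), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        s = sum(A[r][c] * beta[c] for c in range(r + 1, k))
        beta[r] = (b[r] - s) / A[r][r]
    return sum((y[i] - sum(X[i][c] * beta[c] for c in range(k))) ** 2
               for i in range(n))

def granger_f(y, x, lag=1):
    """F-statistic testing whether lagged x improves prediction of y
    beyond y's own lags (large F => 'x Granger-causes y')."""
    n = len(y)
    rows = list(range(lag, n))
    Xr = [[1.0] + [y[t - j] for j in range(1, lag + 1)] for t in rows]
    Xu = [row + [x[t - j] for j in range(1, lag + 1)]
          for row, t in zip(Xr, rows)]
    yt = [y[t] for t in rows]
    rss_r, rss_u = ols_rss(Xr, yt), ols_rss(Xu, yt)
    df2 = len(yt) - (1 + 2 * lag)
    return ((rss_r - rss_u) / lag) / (rss_u / df2)

# Toy series in which x leads y by one step (plus a small trend so the
# unrestricted fit is not exact)
x = [0, 1, 0, 2, 1, 3, 0, 2, 1, 3, 2, 0, 1, 2, 3, 1, 0, 2, 3, 1]
y = [0.0] + [0.9 * x[t - 1] + 0.05 * t for t in range(1, len(x))]
f_stat = granger_f(y, x, lag=1)  # large: lagged x is informative
```

In practice one would use a library routine (e.g. a statsmodels Granger test) rather than hand-rolled OLS; the sketch only shows the mechanics.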

  14. How to apply clinical cases and medical literature in the framework of a modified "failure mode and effects analysis" as a clinical reasoning tool--an illustration using the human biliary system.

    Science.gov (United States)

    Wong, Kam Cheong

    2016-04-06

    Clinicians use various clinical reasoning tools such as Ishikawa diagram to enhance their clinical experience and reasoning skills. Failure mode and effects analysis, which is an engineering methodology in origin, can be modified and applied to provide inputs into an Ishikawa diagram. The human biliary system is used to illustrate a modified failure mode and effects analysis. The anatomical and physiological processes of the biliary system are reviewed. Failure is defined as an abnormality caused by infective, inflammatory, obstructive, malignancy, autoimmune and other pathological processes. The potential failures, their effect(s), main clinical features, and investigation that can help a clinician to diagnose at each anatomical part and physiological process are reviewed and documented in a modified failure mode and effects analysis table. Relevant medical and surgical cases are retrieved from the medical literature and weaved into the table. A total of 80 clinical cases which are relevant to the modified failure mode and effects analysis for the human biliary system have been reviewed and weaved into a designated table. The table is the backbone and framework for further expansion. Reviewing and updating the table is an iterative and continual process. The relevant clinical features in the modified failure mode and effects analysis are then extracted and included in the relevant Ishikawa diagram. This article illustrates an application of engineering methodology in medicine, and it sows the seeds of potential cross-pollination between engineering and medicine. Establishing a modified failure mode and effects analysis can be a teamwork project or self-directed learning process, or a mix of both. Modified failure mode and effects analysis can be deployed to obtain inputs for an Ishikawa diagram which in turn can be used to enhance clinical experiences and clinical reasoning skills for clinicians, medical educators, and students.
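
    The modified failure mode and effects analysis table described above is essentially a structured lookup from anatomical part or physiological process and failure mode to clinical features and investigations. A minimal data-structure sketch (two hypothetical rows for illustration, not the paper's 80-case table) is:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str            # anatomical part or physiological process
    failure: str              # e.g. obstructive, inflammatory, malignant
    effect: str               # downstream clinical consequence
    clinical_features: list   # main presenting features
    investigations: list      # tests that help confirm the diagnosis

# Hypothetical rows in the spirit of the biliary-system example
table = [
    FailureMode("common bile duct", "obstruction (gallstone)",
                "cholestasis and jaundice",
                ["RUQ pain", "dark urine", "pale stools"],
                ["liver function tests", "ultrasound"]),
    FailureMode("gallbladder", "inflammation (cholecystitis)",
                "local infection, risk of perforation",
                ["fever", "Murphy's sign", "RUQ tenderness"],
                ["ultrasound", "full blood count"]),
]

def candidates(table, feature):
    """Failure modes whose recorded clinical features include `feature`."""
    return [fm for fm in table if feature in fm.clinical_features]
```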

  15. Applying LEGO Mindstorms robots as a teaching tool in Agricultural Information and Automation education

    Institute of Scientific and Technical Information of China (English)

    冯雷; 郭亚芳; 王若青; 沈明卫; 何勇

    2012-01-01

    The objective is to present details of a LEGO Mindstorms robot design challenge arranged for agriculture and biosystems engineering students at Zhejiang University. As a new teaching tool in the course Agricultural Information and Automation, different groups of students built robotic agricultural machines using LEGO Mindstorms kits, with Robolab as the programming environment. A survey among 30 students showed that all students were challenged by the projects and were highly satisfied with the outcomes. They all strongly agreed that the projects were effective in helping them to work in teams, apply problem-solving techniques, and boost their programming and mechanical design skills.

  16. Oil exploration and production activities after the flexibilizing of the strategic state monopoly in Brazil: environmental control tools applied by governmental bodies

    Energy Technology Data Exchange (ETDEWEB)

    Malheiros, T.M.M. [IBAMA, Institut bresilien pour l' Environnement et les Ressources Naturelles Renouvelables, Rio de Janeiro, RI (Brazil); La Rovere, E.L. [Centro de Tecnologia, PPE/COPPE/UFRJ, Rio de Janeiro (Brazil)

    2000-07-01

    The goal of this paper is to discuss the environmental control tools applied by Brazilian governmental bodies to oil exploration and production activities after the flexibilizing of the strategic state monopoly in this sector. An analysis of the environmental control tools applied up to now by governmental bodies is needed due to the fast growth of these activities in the last few months and to the entrance of new players in this sector. This work presents the new scenario of the flexibilizing of the state oil monopoly in Brazil and the current situation of environmental control tools applied to oil exploration and production activities. It then offers proposals for changes in the environmental licensing procedures and for the adoption of environmental audits, aiming at an improved environmental control of these activities in the current Brazilian context. (authors)

  17. Applied mathematics

    CERN Document Server

    Logan, J David

    2013-01-01

    Praise for the Third Edition"Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and nat

  18. Applied Enzymology.

    Science.gov (United States)

    Manoharan, Asha; Dreisbach, Joseph H.

    1988-01-01

    Describes some examples of chemical and industrial applications of enzymes. Includes a background, a discussion of structure and reactivity, enzymes as therapeutic agents, enzyme replacement, enzymes used in diagnosis, industrial applications of enzymes, and immobilizing enzymes. Concludes that applied enzymology is an important factor in…

  19. Applied dynamics

    CERN Document Server

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The method of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  20. Four self-management assessment tools applied in the health education of CRF

    Institute of Scientific and Technical Information of China (English)

    寇琳; 官计; 张娅

    2012-01-01

    Objective: To study the feasibility and effectiveness of four self-management assessment tools applied in the health education of patients with chronic renal failure (CRF). Methods: According to the diagnostic criteria, 200 CRF patients in our hospital were selected and randomly divided into four groups; each group was given the same health education. Before and after the education, its effect was evaluated with four self-management assessment tools: the multidimensional health status-health function index scale (group I), the chronic disease self-management research assessment scale (group II), the Partners health scale (group III), and the inpatient activity self-rating scale (group IV). Results: For each group, the evaluation results before and after health education differed significantly (P<0.05). The test-retest reliabilities of groups I-IV were 0.75, 0.71, 0.82 and 0.88 respectively (P<0.05), and the Cronbach's alpha coefficients were 0.81, 0.71, 0.93 and 0.91 respectively (P<0.05). By comparison, the Partners health scale used in group III showed higher applicability and effectiveness in evaluating the effect of CRF health education. Conclusion: The Partners health scale shows good reliability and validity when applied in the health education of CRF patients, and can best guarantee the authenticity, accuracy, validity and reliability of the results; it is worth clinical promotion.
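
    The Cronbach's alpha coefficients reported above measure internal consistency. The standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), can be sketched as follows (toy data for illustration, not the study's scores):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items` is a list of k item-score lists, one score per respondent:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var_sum = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / statistics.variance(totals))

# Toy data: three items answered by five respondents
items = [[1, 2, 3, 4, 5],
         [2, 2, 3, 4, 4],
         [1, 3, 3, 4, 5]]
alpha = cronbach_alpha(items)  # high for this consistent toy data
```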

  1. Forums as tools for theoretical discussion in Statistics applied to Administration

    Directory of Open Access Journals (Sweden)

    Daielly Melina Nassif Mantovani

    2010-08-01

    Full Text Available This paper analyzes the use of the forum as a tool in the teaching of Statistics applied to Administration. A forum held in a hybrid (blended) course was studied, in which students were to discuss a case study they had to solve (research on the viability of a new business). Most enrolled students participated in the activity, probably because it was compulsory. Most messages were classified as relevant, correct and adequate, characterizing genuine debate. There was, however, a great deal of repetition, probably due to the large number of students and to participation being concentrated on the last day of the activity. There were few accesses after the discussion closed, which indicates that the content built was little used as a reference source. Overall, the experience was successful, indicating that this tool can effectively be used as a pedagogical strategy for teaching Statistics, provided that it is carefully planned.

  2. Micromachining with Nanostructured Cutting Tools

    CERN Document Server

    Jackson, Mark J

    2013-01-01

    The purpose of this brief is to explain how nanostructured tools can be used to machine materials at the microscale and how to apply such tools in micromachining applications. The book describes the application of nanostructured tools to machining engineering materials and includes methods for calculating basic features of micromachining. It explains the nature of contact between tools and workpieces to build a solid understanding of how nanostructured tools are made.

  3. Philosophical Foundation of Chinese Modernity and the Development of Chinese Philosophy in Recent 100 Years

    Institute of Scientific and Technical Information of China (English)

    沈清松

    2013-01-01

    Focusing on the problematics of Chinese modernity and its philosophical foundation, this paper divides the development of philosophy in China in the recent 100 years into four periods. In the first period, from 1911 to 1927, philosophers in China were the most enthusiastic in introducing Western modernity and its philosophical foundations from various forms and doctrines of modern Western philosophy. This period was the most progressive and, indeed, impacted all Chinese intellectuals at the time. In the second period, from 1928 to 1949, during the time of national construction and the Japanese invasion, philosophers stepped behind, serving as helpers in clarifying and articulating the Chinese spirit and Chinese subjectivity at the time of its awakening. This prepared the philosophical foundation of a Chinese model of modernity. In the third period, from 1949 to 1980, some major philosophical figures built philosophical systems synthesizing Western and Chinese philosophies. This was most precious, and also unique, in comparison with other disciplines in the humanities and the social sciences; these systems can be seen as diverse attempts to lay the philosophical foundation of Chinese modernity. Now, in the late fourth period, facing the challenges of globalization and postmodernity, Chinese philosophy should continue to explore conceptual systems, the lifeworld and Chinese spiritual resources in a cross-cultural and global context, pursuing philosophical reflection that is critical, rational and attentive to the whole.

  4. Applied Literature for Healing

    Directory of Open Access Journals (Sweden)

    Susanna Marie Anderson

    2014-11-01

    Full Text Available In this qualitative research study interviews conducted with elite participants serve to reveal the underlying elements that unite the richly diverse emerging field of Applied Literature. The basic interpretative qualitative method included a thematic analysis of data from the interviews yielding numerous common elements that were then distilled into key themes that elucidated the beneficial effects of engaging consciously with literature. These themes included developing a stronger sense of self in balance with an increasing connection with community; providing a safe container to engage challenging and potentially overwhelming issues from a stance of empowered action; and fostering a healing space for creativity. The findings provide grounds for uniting the work being done in a range of helping professions into a cohesive field of Applied Literature, which offers effective tools for healing, transformation and empowerment. Keywords: Applied Literature, Bibliotherapy, Poetry Therapy, Arts in Corrections, Arts in Medicine

  5. Geometric reasoning about assembly tools

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
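    The use-volume test described above can be sketched, under strong simplifying assumptions, as a collision check: a tool is applicable in an assembly state if its use volume, placed according to the placement constraint, intersects no occupied part volume. The axis-aligned boxes and the helper names below are hypothetical illustrations, not the paper's FINDPLACE implementation.

    ```python
    # Minimal sketch of a use-volume accessibility check (hypothetical).
    # Volumes are axis-aligned boxes: (xmin, ymin, zmin, xmax, ymax, zmax).

    def boxes_intersect(a, b):
        """True if two axis-aligned boxes overlap with positive volume."""
        return all(a[i] < b[i + 3] and b[i] < a[i + 3] for i in range(3))

    def translate(box, offset):
        """Place a box at the pose demanded by a placement constraint."""
        ox, oy, oz = offset
        return (box[0] + ox, box[1] + oy, box[2] + oz,
                box[3] + ox, box[4] + oy, box[5] + oz)

    def tool_applicable(use_volume, placement, occupied):
        """A tool applies iff its placed use volume is free of all parts."""
        placed = translate(use_volume, placement)
        return not any(boxes_intersect(placed, part) for part in occupied)

    # A 1x1x1 use volume placed above a 2x2x2 part is applicable;
    # placed overlapping the part, it is not.
    part = (0, 0, 0, 2, 2, 2)
    tool = (0, 0, 0, 1, 1, 1)
    print(tool_applicable(tool, (0, 0, 3), [part]))  # → True
    print(tool_applicable(tool, (0, 0, 1), [part]))  # → False
    ```

    The paper's framework is far more general (arbitrary use-volume geometry, symbolic placement constraints, pre-processing across assembly states), but the core query has this shape: place the volume, then test it for freeness.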

  6. Applied combustion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-12-31

    From the title, the reader is led to expect a broad practical treatise on combustion and combustion devices. Remarkably, for a book of modest dimension, the author is able to deliver. The text is organized into 12 chapters, broadly treating three major areas: combustion fundamentals -- introduction (Ch. 1), thermodynamics (Ch. 2), fluid mechanics (Ch. 7), and kinetics (Ch. 8); fuels -- coal, municipal solid waste, and other solid fuels (Ch. 4), liquid (Ch. 5) and gaseous (Ch. 6) fuels; and combustion devices -- fuel cells (Ch. 3), boilers (Ch. 4), Otto (Ch. 10), diesel (Ch. 11), and Wankel (Ch. 10) engines and gas turbines (Ch. 12). Although each topic could warrant a complete text on its own, the author addresses each of these major themes with reasonable thoroughness. Also, the book is well documented with a bibliography, references, a good index, and many helpful tables and appendices. In short, Applied Combustion does admirably fulfill the author's goal for a wide engineering science introduction to the general subject of combustion.

  7. Disclosure as a regulatory tool

    DEFF Research Database (Denmark)

    2006-01-01

    The chapter analyses how disclosure can be used as a regulatory tool and how it has been applied so far in the area of financial market law and consumer law.

  8. Nutrient Dynamics over the Past 100 Years and Its Restoration Baseline in Dianshan Lake

    Institute of Scientific and Technical Information of China (English)

    李小平; 陈小华; 董旭辉; 董志; 孙敦平

    2012-01-01

    The diatom assemblage shifted from species such as Cyclotella bodanica, C. ocelata, Achnanthes minutissima, Cocconeis placentula var. lineata, Cymbella sp., Fragilaria pinnata, F. brevistriata and F. construens var. venter to recent eutrophic species including Cyclostephanos dubius, C. atomus, Stephanodiscus minutulus, S. hantzschii and Aulacoseira alpigena. The epilimnetic TP over the past 100 years, reconstructed using an established diatom-TP transfer function, matches well with the monitoring TP where it exists. Based on the sedimentary nutrient characteristics and the diatom-reconstructed nutrient dynamics, we propose that the nutrient baseline for Dianshan Lake is 50-60 μg·L-1 for water TP concentration, and 500 mg·kg-1 and 550 mg·kg-1 for sedimentary TP and TN, respectively.
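    Diatom-TP transfer functions of the kind mentioned here are commonly built by weighted averaging: each taxon is assigned a TP optimum (the abundance-weighted mean TP of the calibration lakes in which it occurs), and past TP is reconstructed as the abundance-weighted mean of the optima of the taxa in each sediment sample. The calibration values and counts below are invented for illustration; this is not the paper's calibration set or model.

    ```python
    # Weighted-averaging (WA) transfer-function sketch; all numbers are invented.

    def wa_optima(calibration):
        """TP optimum per taxon: abundance-weighted mean TP over calibration lakes.

        calibration: list of (tp, {taxon: relative abundance}) pairs.
        """
        num, den = {}, {}
        for tp, counts in calibration:
            for taxon, ab in counts.items():
                num[taxon] = num.get(taxon, 0.0) + ab * tp
                den[taxon] = den.get(taxon, 0.0) + ab
        return {t: num[t] / den[t] for t in num}

    def wa_reconstruct(sample, optima):
        """Reconstructed TP: abundance-weighted mean of the taxon optima."""
        taxa = [t for t in sample if t in optima]
        total = sum(sample[t] for t in taxa)
        return sum(sample[t] * optima[t] for t in taxa) / total

    # Three hypothetical calibration lakes (TP in μg/L, percent abundances)
    calibration = [
        (20, {"C. bodanica": 60, "A. minutissima": 40}),
        (80, {"S. hantzschii": 70, "C. dubius": 30}),
        (50, {"C. bodanica": 20, "S. hantzschii": 50, "C. dubius": 30}),
    ]
    optima = wa_optima(calibration)

    # A eutrophic-looking core sample reconstructs to a high TP value
    core_sample = {"S. hantzschii": 55, "C. dubius": 35, "C. bodanica": 10}
    print(round(wa_reconstruct(core_sample, optima), 1))  # → 62.6
    ```

    Real applications add deshrinking regression and cross-validated error estimates, but the downcore logic is the same: as eutrophic taxa with high optima come to dominate, the reconstructed TP rises.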

  9. Applied biotechnology in nematology.

    Science.gov (United States)

    Caswell-Chen, E P; Williamson, V M; Westerdahl, B B

    1993-12-01

    During the past two decades, rapid advances in biotechnology and molecular biology have affected the understanding and treatment of human and plant diseases. The human and Caenorhabditis elegans genome-sequencing projects promise further techniques and results useful to applied nematology. Of course, biotechnology is not a panacea for nematological problems, but it provides many powerful tools that have potential use in applied biology and nematode management. The tools will facilitate research on a range of previously intractable problems in nematology, from identification of species and pathotypes to the development of resistant cultivars that have been inaccessible because of technical limitations. However, to those unfamiliar or not directly involved with the new technologies and their extensive terminology, the benefits of the advances in biotechnology may not be readily discerned. The sustainable agriculture of the future will require ecology-based management, and successful integrated nematode management will depend on combinations of control tactics to reduce nematode numbers. In this review we discuss how biotechnology may influence nematode management, define terminology relative to potential applications, and present current and future avenues of research in applied nematology, including species identification, race and pathotype identification, development of resistant cultivars, definition of nematode-host interactions, nematode population dynamics, establishment of optimal rotations, the ecology of biological control and development of useful biological control agents, and the design of novel nematicides.

  10. Applied ALARA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission: to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  11. Research on the Application of Refactoring Tools in Object-Oriented Programming Teaching

    Institute of Scientific and Technical Information of China (English)

    沈健

    2013-01-01

    Presents an approach in which several teaching cases are designed and code refactoring tools are applied to restructure and improve programs in object-oriented programming laboratory teaching. This method improves students' understanding of code reuse and software refactoring, helps them grasp object-oriented ideas, and strengthens their programming ability.
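    As a minimal illustration of the kind of teaching case meant here (invented, not taken from the paper), an "extract method" refactoring removes duplicated logic by pulling it into a shared, parameterized helper:

    ```python
    # Before: the discount expression is duplicated (a classic teaching case).
    def price_with_member_discount(price):
        return price - price * 0.10

    def price_with_coupon_discount(price):
        return price - price * 0.25

    # After an "extract method" refactoring: one helper, no duplication.
    def apply_discount(price, rate):
        """Shared helper extracted from the duplicated expressions above."""
        return price - price * rate

    def member_price(price):
        return apply_discount(price, 0.10)

    def coupon_price(price):
        return apply_discount(price, 0.25)

    print(member_price(100.0))  # → 90.0
    print(coupon_price(100.0))  # → 75.0
    ```

    IDE refactoring tools automate exactly this kind of transformation while preserving behavior, which is what lets students compare the before and after versions and see the value of reuse concretely.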

  12. 100 years' evolution of fisheries higher education and its strategic transformation in China

    Institute of Scientific and Technical Information of China (English)

    宁波

    2011-01-01

    In the early 20th century, in order to safeguard the country's maritime rights and interests and to develop the national fisheries industry, the Qing Dynasty government and later the government of the Republic of China, learning from the experiences of Japan, the United States and other countries, began to develop fisheries education in China. After the founding of new China, Shanghai Fisheries College and other fisheries colleges were founded in succession from 1952. In the 1950s and 1960s, drawing on the experience of the Soviet Union, the system of fisheries higher education was established in China. After decades of development, fisheries higher education made great achievements and contributed substantially to the development of the fisheries industry in China. In the late 20th century, to meet the needs of marine industry development, the self-development needs of fisheries higher education, and the needs of building a modern maritime society, the fisheries colleges and universities changed their names to marine universities and transformed from single-discipline institutions into multi-disciplinary marine universities. This transformation has promoted the development of marine higher education in China. For its effective development, it is suggested to adopt a higher starting point and take the road of international education, to develop basic and applied sciences in a coordinated way, to further optimize the structure of marine higher education, and to build a good three-dimensional linkage mechanism between the government, marine universities and society.

  13. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...

  14. Applied mechanics of solids

    CERN Document Server

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based. Develop Intuitive Ability to Identify and Avoid Physically Meaningless Predictions Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem solving methods, often providing...

  15. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  16. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  17. An applied investigation of a multimedia interactive teaching system for numerical control machine tool training

    Institute of Scientific and Technical Information of China (English)

    施立钦

    2014-01-01

    The multimedia interactive teaching system for numerical control machine tool training consists of a high-definition demonstration system, a video monitoring system, a video display system, a voice intercom system, a centralized control system, course recording and several other subsystems, which are merged into an organic whole by the background software. The system addresses many problems currently facing numerical control technology programmes in higher vocational colleges, such as outdated teaching methods, poor practical training results, persistent safety hazards and insufficient teacher resources, and a new teaching mode for machine tool training based on multimedia interactive video is explored.

  18. Ludic Educational Game Creation Tool

    DEFF Research Database (Denmark)

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos

    2015-01-01

    The game creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, the tool is applied to creating an educational game according to a reference scenario, namely teaching schoolchildren road safety. A ludic approach is used both in game creation and play.

  19. Tools and proposals for innovation based on augmented reality technology applied to children's and young adult literature

    Directory of Open Access Journals (Sweden)

    Noelia Margarita Moreno Martínez

    2017-01-01

    Full Text Available In the information society in which we are immersed, the proliferation and rise of devices such as smartphones, tablets and phablets is reflected in the adoption of new learning models, lifestyles, forms of communication, relationships and entertainment by the citizens of the new digital era. The development of strategies to introduce these new technological tools into the classroom is therefore an opportunity to rethink educational practice in line with the new characteristics, demands and needs of the diverse student body being served, taking advantage of the possibilities offered by emerging technologies such as augmented reality (AR) under a new learning modality based on Mobile Learning. In this paper we review and analyze mobile applications based on augmented reality technology for Android and iOS environments, and then present proposals for activities to implement this technology in the study of children's and young adult literature.

  20. Acupuncture applied as a sensory discrimination training tool decreases movement-related pain in patients with chronic low back pain more than acupuncture alone: a randomised cross-over experiment.

    Science.gov (United States)

    Wand, Benedict Martin; Abbaszadeh, Sam; Smith, Anne Julia; Catley, Mark Jon; Moseley, G Lorimer

    2013-11-01

    High-quality clinical evidence suggests that although acupuncture appears superior to usual care in the management of chronic low back pain, there is little meaningful difference between true and sham acupuncture. This suggests that the benefits of acupuncture are mediated by the placebo response. An alternative explanation is that sham acupuncture is an active treatment and shares a mechanism of action with traditionally applied acupuncture. One plausible candidate for this mechanism is improvement in self-perception mediated through the sensory discrimination-like qualities of acupuncture. We aimed to compare the effects of acupuncture with a sensory discrimination training component to acupuncture without one. 25 people with chronic low back pain were enrolled in a randomised cross-over experiment. We compared the effect of acupuncture delivered when sensory discrimination is optimised to acupuncture delivered when it is not, on movement-related back pain immediately after each intervention. We found that the average pain intensity after participants had received acupuncture with sensory discrimination training (2.8±2.5) was less than when they received acupuncture without sensory discrimination training (3.6±2.0). This difference was statistically significant (after adjustment; mean difference=-0.8, 95% CI -1.4 to -0.3; p=0.011). Our findings are consistent with the idea that acupuncture may offer specific benefit that is not dependent on precisely where the needles are inserted so much as that the patient attends to where they are inserted. If so, the location of the needles might be better focused on the painful area and the need for penetration of the skin may be mitigated.

  1. [100 years of studying poliomyelitis virus and nonpoliomyelitis enteroviruses].

    Science.gov (United States)

    Lashkevich, V A

    2008-01-01

    M. P. Chumakov Institute of Poliomyelitis and Viral Encephalitis, Russian Academy of Medical Sciences, Moscow. The paper deals with the history of the discovery of poliomyelitis virus by K. Landsteiner and E. Popper in 1908, the identification of three immunological types of the virus in 1949, the discovery by J. Enders in 1949 of viral multiplication with a cytopathogenic effect in cultures of non-nerve cells, the development of new diagnostic techniques, and the design of the inactivated poliovirus vaccine by J. Salk in 1953 and the live vaccine by A. Sabin in 1957. The advantages and disadvantages of these vaccines and the prospects for further poliomyelitis control are discussed. The characteristics and role of nonpoliomyelitis enteroviruses are considered, and the most important scientific discoveries made in the study of enteroviruses are noted.

  2. Celebrating 100 Years of Flight: Testing Wing Designs in Aircraft

    Science.gov (United States)

    Pugalee, David K.; Nusinov, Chuck; Giersch, Chris; Royster, David; Pinelli, Thomas E.

    2005-01-01

    This article describes an investigation involving several designs of airplane wings in trial flight simulations based on a NASA CONNECT program. Students' experiences with data collection and interpretation are highlighted. (Contains 5 figures.)

  3. Mário de Sá-Carneiro, 100 years later

    Directory of Open Access Journals (Sweden)

    Paula Cristina Costa

    2016-12-01

    Full Text Available http://dx.doi.org/10.5007/2175-7917.2016v21n2p112 Mário de Sá-Carneiro's work is still current even a hundred years after his death: the modernity of his style is not limited to the modernist fashions of his time but remains contemporary today. The work has a large thematic coherence. Throughout his poetry, prose and drama, the theme of the Self/Other recurs, together with the desire to achieve absolute perfection, like Icarus, and the failure of that fulfilment. In texts such as the well-known poems «Quasi» and «7», as well as in the novel «A Confissão de Lúcio», this wish for a merging of the Self and the Other, and the impossibility of achieving it, is clearly present. Mário de Sá-Carneiro was, together with Fernando Pessoa, one of the founders of Portuguese Modernism and one of the directors of the «Orpheu» magazine. Both poets created several isms important for Portuguese Modernism: paulismo, interseccionismo, sensacionismo. Nevertheless, they remained loyal to their own styles.

  4. I. P. PAVLOV: 100 YEARS OF RESEARCH ON ASSOCIATIVE LEARNING

    Directory of Open Access Journals (Sweden)

    GERMÁN GUTIÉRREZ

    2005-07-01

    Full Text Available A biographical summary of Ivan Pavlov is presented, emphasizing his academic formation and achievements and his contributions to general science and psychology. His main findings on associative learning are described, and three areas of current development in this area are discussed: the study of behavioral mechanisms, the study of neurobiological mechanisms, and the functional role of learning.

  5. 100 Years of Attempts to Transform Physics Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2016-12-01

    As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved instructional models today may be just as elusive as it has been in the past, and for similar reasons. Although excellent instructional models exist and have been available for decades, effective and scalable plans for transforming practice on a national basis have yet to be developed and implemented. Present-day teachers, education researchers, and policy makers can find much to learn from past efforts, both in their successes and their failures. To this end, we present a brief outline of some key ideas in U.S. physics education during the past 130 years. We address three core questions that are prominent in the literature: (a) Why and how should physics be taught? (b) What physics should be taught? (c) To whom should physics be taught? Related issues include the role of the laboratory and attempts to make physics relevant to everyday life. We provide here only a brief summary of the issues and debates found in primary-source literature; an extensive collection of historical resources on physics education is available at https://sites.google.com/site/physicseducationhistory/home.

  6. The death of Florence Nightingale: BJN 100 years ago.

    Science.gov (United States)

    Castledine, Sir George

    This August marks the centenary of the death of Florence Nightingale, who died at 2 o'clock on Saturday 13 August 1910 at her home, 10 South Street, Park Lane, London. The following are some snippets which appeared in the BJN of the 20 and 27 August 1910. It was not until the announcement of her death in the morning papers of Monday 15 August that the country heard about Nightingale's death. In her last hours she was attended by Sir Thomas Barlow and two nurses from the Nursing Sisters' Institution, Devonshire Square, founded by Mrs Elizabeth Fry in 1840.

  7. 100 years after the Marsica earthquake: contribute of outreach activities

    Science.gov (United States)

    D'Addezio, Giuliana; Giordani, Azzurra; Valle, Veronica; Riposati, Daniela

    2015-04-01

    Many outreach events have been proposed by the scientific community to celebrate the centenary of the January 13, 1915 earthquake that devastated the Marsica territory in the Central Apennines. The Laboratorio Divulgazione Scientifica e Attività Museali of the Istituto Nazionale di Geofisica e Vulcanologia (INGV's Laboratory for Outreach and Museum Activities) in Rome has created an interactive exhibition in the Castello Piccolomini, Celano (AQ), to retrace the many aspects of the earthquake disaster in a region such as Abruzzo, affected by several destructive earthquakes during its history. The initiatives represent an ideal opportunity to develop new programmes of communication and training on seismic risk and to spread a culture of prevention. The INGV is accredited with the Servizio Civile Nazionale (National Civic Service), and volunteers have been involved since 2014 in the project "Science and Outreach: a comprehensive approach to the divulgation of knowledge of Earth Sciences". In this context, volunteers had the opportunity to contribute fully to the exhibition, in particular by promoting and producing two panels on the social and environmental consequences of the Marsica earthquake. By describing the serious consequences of the earthquake, we may raise awareness of natural hazards and of the only effective action for earthquake defence: building with antiseismic criteria. After research conducted in libraries and on the web, two themes were developed: the serious problem of orphans and the difficult reconstruction. Heavy snowfalls and the presence of wolves coming down from the high, wild surrounding mountains complicated the scenario and slowed the rescue of the affected populations. It is important to underline that the earthquake was not the only devastating event in the country in 1915; another dramatic event was, in fact, the First World War.
Whole families died, and the surviving infants and children were sent to Rome, to hospitals and other suitable structures. Many stories of poor orphans are known, but we decided to highlight stories that, despite the drama, had a happy ending. To grasp the magnitude of the tragedy, consider that more than fifty towns and villages were completely destroyed by the earthquake. The reconstruction was very difficult and slow, also because of the war, and involved the relocation of settlements to different places. The first shelters to be reconstructed were those for survivors: very small shacks built with antiseismic criteria. They still stand on the territory, as a symbol of the reconstruction and a surviving testimony to the earthquake.

  8. Evolutionary Lightsailing Missions for the 100-Year Starship

    Science.gov (United States)

    Friedman, L.; Garber, D.; Heinsheimer, T.

    Incremental milestones towards interstellar flight can be achieved in this century by building on first steps with lightsailing, the only known technology that might someday take us to the stars. That this is now possible is enabled by the achievements of the first solar sail flights, the use of nanotechnology for the miniaturization of spacecraft, advances in information processing, and the decoding of our genomes into transportable form. This paper quantifies a series of robotic steps through and beyond the solar system that are practical and would stimulate the development of new technologies in guidance, navigation, materials, communication, sensors, information processing etc., while exploring ever-more distant, exciting space objectives at distances impractical for classical rocket-based technologies. These robotic steps may be considered precursors to human interstellar flight, but they may also be considered evolutionary steps that provide for a different future: one of virtual human interstellar flight that may bypass the ideas of the past (big rockets launching heavy people) in favour of those of the future: networking amongst the stars with information, and the physical transport of digital and biological genomes.

  9. SA stands on the shoulders of 100 years of innovation

    CSIR Research Space (South Africa)

    Wall, K

    2010-05-01

    Full Text Available Aircraft and the motor car, with improved roads, have revolutionised inter-city travel; the internet has revolutionised how learners prepare assignments; computers have revolutionised data sorting and analysis, and so on. This newspaper article...

  10. Martin Gardner: 100 Years of the Magic of Physics

    Science.gov (United States)

    Gillaspy, John D.

    2014-01-01

    2014 marks the 100th anniversary of the birth of Martin Gardner, a man whose writings helped inspire generations of young students to pursue a career in physics and mathematics. From his first to his last, and many in between, Gardner's publications often combined magic and science. A recurring theme was the clever use of physical principles…

  11. [100 years of surgery in the Kosevo Hospital in Sarajevo].

    Science.gov (United States)

    Durić, O

    1994-01-01

    The Surgery Department of the Regional Hospital in Sarajevo opened on 1 July 1894, marking the beginning of the European surgical school's influence there. The school was then in the second half of its activity, a period better known as the "century of surgery". The building, fittings, equipment and staff followed the achievements of the Viennese school. The department was headed by the prominent European surgeon, primarius Dr Josef Preindisberger, first assistant to the great Dr Billroth. In this way the institution became a referral centre for two other hospitals in Sarajevo, the Vakuf's and the Military Hospital, as well as for some 17 more in Bosnia and Herzegovina built over the following ten years. Owing to its therapeutic success in general surgery and in diseases of the eye, and according to the annual reports, the first 50 beds soon became insufficient for all those seeking treatment. The Department was therefore enlarged, and in 1905 a new regional hospital, intended to function as clinics, was planned. World War I stopped these plans. During the period of the Kingdom of Yugoslavia, the war-damaged Surgery Department continued its work with doctors trained to maintain the pre-war standard. Given the broad range of pathology treated and the need for space, the then chief surgeon, primarius Milivoje Kostić, worked out in detail the earlier plan for a new hospital building that could serve as a base for clinics. It was accepted as a ten-year project which, regrettably, was not realised before World War II. (ABSTRACT TRUNCATED AT 250 WORDS)

  12. Sunspot Cycle 24: Smallest Cycle in 100 Years?

    Science.gov (United States)

    2005-01-11

    Hathaway, D. H., R. M. Wilson, and E. J. Reichmann (1994), The shape of the sunspot cycle, Sol. Phys., 151, 177. Snodgrass, H. B. (1988), The extended solar activity cycle, Nature, 333, 748, doi:10.1038/333748a0. Hathaway, D. H., R. M. Wilson, and E. J. Reichmann (2002), Group sunspot numbers

  13. History of the Shaped Charge Effect: The First 100 Years

    Science.gov (United States)

    1990-03-22

    transferred, inasmuch as both originators of the effect were in proximity - southern Germany and Switzerland border each other. Dr. Mohaupt's... Mistel (Mistletoe) referred to the parasitic mounting of the top aircraft on the host aircraft. In the tactical version, the bomber's nose was replaced... in the patents (Ref. 32) issued in France in 1940 and in Australia in 1941, wherein the inventors (Mohaupt and his two associates) had claimed the

  14. Global change and water resources in the next 100 years

    Science.gov (United States)

    Larsen, Matthew C.; Hirsch, R.M.

    2010-01-01

    We are in the midst of a continental-scale, multi-year experiment in the United States, in which we have not defined our testable hypotheses or set the duration and scope of the experiment, which poses major water-resources challenges for the 21st century. What are we doing? We are expanding population at three times the national growth rate in our most water-scarce region, the southwestern United States, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing them in surface and groundwater, through sewage treatment plants and individual septic systems. These substances are now detectable at very low concentrations and we have documented significant effects on aquatic species, particularly on fish reproduction function. We don’t yet know what effects on human health may emerge, nor do we know if we need to make large investments in water treatment systems, which were not designed to remove these substances. These are a few examples of our national-scale experiment. In addition to these water resources challenges, over which we have some control, climate change models indicate that precipitation and streamflow patterns will change in coming decades, with western mid-latitude North America generally drier. We have already documented trends in more rain and less snow in western mountains. This has large implications for water supply and storage, and groundwater recharge. We have documented earlier snowmelt peak spring runoff in northeastern and northwestern States, and western montane regions. 
Peak runoff is now about two weeks earlier than it was in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiation over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can’t do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of ‘run-of-the-river’ water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewage overflows. 
Infrastructure investment needs have been put at 5 trillion nationally. Global change and water resources challenges that we face this century include a combination of local and national management problems that are already upon us, as well as emerging and future problems that are closely associated with rising temperature and changes in precipitation distribution in time and space.

  15. Engineering and malaria control: learning from the past 100 years

    DEFF Research Database (Denmark)

    Konradsen, Flemming; van der Hoek, Wim; Amerasinghe, Felix P

    2004-01-01

    Traditionally, engineering and environment-based interventions have contributed to the prevention of malaria in Asia. However, with the introduction of DDT and other potent insecticides, chemical control became the dominating strategy. The renewed interest in environmental-management-based approaches...

  16. Global change and water resources in the next 100 years

    Science.gov (United States)

    Larsen, M. C.; Hirsch, R. M.

    2010-03-01

    We are in the midst of a continental-scale, multi-year experiment in the United States, in which we have not defined our testable hypotheses or set the duration and scope of the experiment, which poses major water-resources challenges for the 21st century. What are we doing? We are expanding population at three times the national growth rate in our most water-scarce region, the southwestern United States, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing them in surface and groundwater, through sewage treatment plants and individual septic systems. These substances are now detectable at very low concentrations and we have documented significant effects on aquatic species, particularly on fish reproduction function. We don’t yet know what effects on human health may emerge, nor do we know if we need to make large investments in water treatment systems, which were not designed to remove these substances. These are a few examples of our national-scale experiment. In addition to these water resources challenges, over which we have some control, climate change models indicate that precipitation and streamflow patterns will change in coming decades, with western mid-latitude North America generally drier. We have already documented trends in more rain and less snow in western mountains. This has large implications for water supply and storage, and groundwater recharge. We have documented earlier snowmelt peak spring runoff in northeastern and northwestern States, and western montane regions. 
Peak runoff is now about two weeks earlier than it was in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiation over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can’t do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of ‘run-of-the-river’ water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewage overflows. 
Infrastructure investment needs have been put at 5 trillion nationally. Global change and water resources challenges that we face this century include a combination of local and national management problems that are already upon us, as well as emerging and future problems that are closely associated with rising temperature and changes in precipitation distribution in time and space.

  17. Huck Finn: 100 Years of Durn Fool Problems.

    Science.gov (United States)

    Stanek, Lou Willett

    1985-01-01

    Discusses the censorship of Mark Twain's "Huckleberry Finn" since it was first published in 1885. Highlights include Twain's public image, viewpoints of censors, the banning of the book and school censorship cases, and the celebration of the centennial of "Huckleberry Finn." Nine references are cited. (EJS)

  18. 100 Years of Curriculum History, Theory, and Research

    Science.gov (United States)

    Schoenfeld, Alan H.

    2016-01-01

    This article reviews a collection of papers written by the American Educational Research Association's first 50 presidents that deal specifically with curricular issues. It characterizes the ways in which curricula were conceptualized, implemented, and assessed, with an eye toward the epistemological and methodological framings that the authors…

  19. 100 years of public electricity supply in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Gersdorff, B. von

    1984-05-08

    On May 8th, 1884, Emil Rathenau founded Germany's first power utility company, the Municipal Electricity Works, Berlin, the predecessor of today's Berlin Power and Light Co. (Bewag). Rathenau had recognised the significance of Thomas Alva Edison's light bulb for the broad application of electricity. Block power plants with machine outputs of 150 PS (horsepower) were at the beginning of their development in 1884. Today, in nuclear power plants with machine outputs of around 1300 MW, the technology of power supply has reached the "large technology" prophesied by Rathenau at the start of the century. With today's challenge of environmental conservation, the power utilities are confronted with new tasks for the future.

  20. 100 Years of Attempts to Transform Physics Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2016-01-01

    As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved…

  1. Developing Resilient Children: After 100 Years of Montessori Education

    Science.gov (United States)

    Drake, Meg

    2008-01-01

    In this millennium, educators are faced with a number of issues that Dr. Maria Montessori could not have predicted. Today, students are different from the children Dr. Montessori observed in her "Casa dei Bambini." They are influenced by technology in all its forms. Some suffer from medical problems such as complex food allergies, which wreak…

  2. 100 YEARS OF AUDI%百年奥迪

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Since entering China in 1988, Audi has been ranked "China Auto Customer Service Satisfaction" champion and "China Auto Sales Satisfaction" champion ten times by J.D. Power, meanwhile becoming the first luxury car brand in the Chinese market to exceed one million units in cumulative sales. This success is due not only to the formidable technical background behind the idea of "Innovation through Technology," but also to the brand's planning and its positive interaction with local customers. On March 21, the "心动上海" ("Moved by Shanghai") Audi art exhibition opened at the 沪申画廊 (Shanghai Gallery of Art); its curator, Gu Zhenqing, was born in Shanghai,

  3. C-TOOL

    DEFF Research Database (Denmark)

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John

    2014-01-01

    Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long-term SOC storage in the profile of well-drained agricultural mineral soils. C-TOOL uses three SOC pools for both the topsoil (0–25 cm) and the subsoil (25–100 cm), and applies temperature-dependent first-order kinetics to regulate C turnover. C-TOOL also enables the simulation of 14C turnover. The simple model structure facilitates calibration and requires few inputs (mean monthly air temperature, soil clay content, soil C/N ratio and C in organic inputs). The model was parameterised using data from 19 treatments drawn from seven long-term field experiments in the United Kingdom, Sweden and Denmark...
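    The temperature-dependent first-order turnover described in this record can be sketched in a few lines. The Q10-style temperature response, the reference rate constant and the pool size below are illustrative assumptions for a single pool, not C-TOOL's published parameterisation:

    ```python
    import math

    def temperature_factor(t_air_c, q10=2.0, t_ref_c=10.0):
        """Illustrative Q10 temperature response (assumed form, not C-TOOL's exact function)."""
        return q10 ** ((t_air_c - t_ref_c) / 10.0)

    def decay_pool(c_stock, k_ref, t_air_c, dt_years):
        """First-order decay of one SOC pool: dC/dt = -k(T) * C, integrated over one step."""
        k = k_ref * temperature_factor(t_air_c)
        return c_stock * math.exp(-k * dt_years)

    # One year of monthly steps for a hypothetical topsoil pool (Mg C/ha).
    c = 40.0
    monthly_t = [0, 1, 3, 7, 12, 16, 18, 17, 13, 9, 4, 1]  # mean monthly air temperature, deg C
    for t in monthly_t:
        c = decay_pool(c, k_ref=0.02, t_air_c=t, dt_years=1 / 12)
    ```

    In the full model each of the three topsoil and three subsoil pools would carry its own rate constant, with transfers between pools and C inputs added at each step.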

  4. Organisational skills and tools.

    Science.gov (United States)

    Wicker, Paul

    2009-04-01

    While this article mainly applies to practitioners who have responsibilities for leading teams or supervising practitioners, many of the skills and tools described here may also apply to students or junior practitioners. The purpose of this article is to highlight some of the main points about organisation, some of the organisational skills and tools that are available, and some examples of how these skills and tools can be used to make practitioners more effective at organising their workload. It is important to realise that organising work and doing work are two completely different things and shouldn't be mixed up. For example, it would be very difficult to start organising work in the middle of a busy operating list: the organisation of the work must come before the work starts and therefore preparation is often an important first step in organising work. As such, some of the tools and skills described in this article may need to be used hours or even days prior to the actual work taking place.

  5. Research on the Acceptance of Web2.0 Tools Applied in Small and Micro Businesses’ Marketing Competitive Intelligence:A Case in Zhenjiang%Web2.0工具在小微企业市场营销竞争情报中的应用研究--以镇江地区为例

    Institute of Scientific and Technical Information of China (English)

    宋新平; 朱鹏云

    2016-01-01

    Nowadays, it is popular to use Web2.0 tools, which has been applied in marketing competitive intelligence. As a result, the competitive advantage and performance of salespeople have been improved. In this paper, we explored the acceptance and existing problems of Web2.0 tools applied in small and micro businesses’ marketing competitive intelligence through questionnaire survey and semi-structured interview. The results indicate that the cognition of applying Web2.0 tools is weak, most enterprises haven’t constructed a mature social media communication platform, and the value of Web2.0 tools has not been developed completely, thus, various barriers are existed. At the end, the paper put forward relevant suggestions to solve these problems.%目前,Web2.0工具的使用已相当普遍,并逐步渗透到了市场营销竞争情报工作中,对营销员的竞争优势和工作绩效产生了很大影响。为了解Web2.0工具在市场营销竞争情报中的应用现状及存在的问题,本文以镇江地区的小微企业为主要对象,进行了实证调研和统计分析。研究显示,目前Web2.0工具在小微企业市场营销竞争情报中的使用认知并不高,且多数企业缺乏成熟的Web2.0平台,而各类Web2.0工具的应用价值开发不完全,致使Web2.0工具的应用障碍较大。针对上述问题,本文提出了相应的建议。

  6. Mathematical tools

    Science.gov (United States)

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions.We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.
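    The conformal transformations mentioned in this abstract take the standard textbook form (a general definition, not a result specific to this chapter):

    ```latex
    \tilde{g}_{\mu\nu} = \Omega^2(x)\, g_{\mu\nu}, \qquad
    \tilde{g}^{\mu\nu} = \Omega^{-2}(x)\, g^{\mu\nu}, \qquad
    \sqrt{-\tilde{g}} = \Omega^4 \sqrt{-g} \quad (\text{in four dimensions}),
    ```

    where $\Omega(x) > 0$ is the conformal factor. Such a rescaling relates, for example, the Jordan and Einstein frames of scalar-tensor and $f(R)$ gravity.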

  7. A Chinese type 2 diabetic patient sports hindrance survey tool developed by applying the Rasch model%应用Rasch模型开发我国2型糖尿病患者运动阻碍的调查工具

    Institute of Scientific and Technical Information of China (English)

    李庆雯; 朱为模; 李梅

    2014-01-01

    In order to develop and standardize a diabetic patient sports hindrance survey tool, and to clarify the major exercise barriers faced by type 2 diabetic patients in China, the authors applied recent educational/psychological measurement techniques to simplify the survey tool so that it can be better applied to large populations. The authors carried out a questionnaire survey of 197 type 2 diabetic patients (104 males, 93 females, average age 53.6). The survey instrument was derived from a mature 43-item exercise barrier questionnaire established in the United States, used to rate the degree to which each barrier hinders participation in exercise. The authors analyzed the data using the Rasch model, ranked the barriers by their logit values, reduced the questionnaire contents while maintaining the original content structure, and screened out the best reduced version by means of correlation analysis. They found that lack of exercise knowledge and lack of professional guidance were the major exercise barriers for type 2 diabetic patients in China. Based on analysis with the item response theory (IRT) model, all four reduced versions of the questionnaire correlated with the original 43-item version, and the 16-item instrument had the highest efficiency; its use is recommended.
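    The Rasch analysis used in this study models the probability that a respondent endorses a barrier item as a logistic function of the difference between the person parameter and the item difficulty, both on the logit scale. A minimal sketch, with made-up item difficulties rather than the study's estimates:

    ```python
    import math

    def rasch_prob(theta, b):
        """Dichotomous Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # Hypothetical item difficulties in logits: a lower b means the barrier is endorsed more often.
    items = {
        "lack of exercise knowledge": -1.2,
        "lack of professional guidance": -0.8,
        "bad weather": 0.9,
    }
    theta = 0.0  # a respondent of average endorsement propensity
    probs = {name: rasch_prob(theta, b) for name, b in items.items()}
    ```

    Ordering items by their estimated difficulty is what lets the analysis rank barriers and drop redundant items when shortening the questionnaire.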

  8. The utilization of space as tool for applied research

    Science.gov (United States)

    Sprenger, H. J.

    1998-01-01

    The major reasons for industry's reluctance towards broader participation in space utilization have been identified through thorough analysis of discussions with industry representatives. They result mainly from the difficulty of estimating possible commercial benefits from research or processing carried out in the space environment, and from the incompatibility of present space-access conditions with the requirements of industry. The approach taken to motivate industry differs from the usual route of spin-offs and technology transfer, since it is strictly based on identifying industrial demands. An important finding was that the private sector cannot be expected to become a paying customer in a single step. For a certain transition period, joint industry/university research has to be promoted, in which the industrial partners set the objectives to be met by targeted research efforts of scientists from academia and contract research institutions.

  9. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Directory of Open Access Journals (Sweden)

    Jordi Vilaplana

    2014-01-01

    Full Text Available Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented in a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives acceptable performance for low- or medium-sized databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
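    The parameter tuning referred to in this record typically involves a handful of MySQL server variables. The fragment below is an illustrative sketch for a read-heavy workload on MySQL 5.x, not the configuration used by the authors:

    ```ini
    # my.cnf fragment (illustrative values, not the paper's configuration)
    [mysqld]
    innodb_buffer_pool_size = 4G     # cache hot table and index pages in memory
    innodb_log_file_size    = 512M   # a larger redo log smooths write bursts
    max_connections         = 200    # cap concurrent application connections
    query_cache_type        = 0      # the query cache often hurts concurrency; disable it
    ```

    The buffer pool size is usually the dominant knob: once the working set fits in memory, most reads avoid disk I/O entirely.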

  10. Applied epidemiology: another tool in dairy herd health programs?

    Science.gov (United States)

    Frankena, K; Noordhuizen, J P; Stassen, E N

    1994-01-01

    The databases of herd health programs mainly contain data from individual animals. Several parameters that determine herd performance can be calculated from these programs, and by comparing actual values with standard values, areas for further improvement of health (and production) can be identified. However, such advice is usually not backed up by proper statistical analyses. Moreover, data concerning the environment of the animals are not present, and hence advice concerning multifactorial diseases is based on common knowledge and experience. Veterinary epidemiology offers methods that might improve the value of herd health programs through the identification and quantification of factors and conditions contributing to multifactorial disease occurrence. Implementation of these methods within herd health programs will lead to more scientifically sound advice.

  11. Applying Web-Based Tools for Research, Engineering, and Operations

    Science.gov (United States)

    Ivancic, William D.

    2011-01-01

    Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network-centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies, often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access are addressed.

  12. Scanning Probe Microscopy as a Tool Applied to Agriculture

    Science.gov (United States)

    Leite, Fabio Lima; Manzoli, Alexandra; de Herrmann, Paulo Sérgio Paula; Oliveira, Osvaldo Novais; Mattoso, Luiz Henrique Capparelli

    The control of materials properties and processes at the molecular level inherent in nanotechnology has been exploited in many areas of science and technology, including agriculture where nanotech methods are used in release of herbicides and monitoring of food quality and environmental impact. Atomic force microscopy (AFM) and related techniques are among the most employed nanotech methods, particularly with the possibility of direct measurements of intermolecular interactions. This chapter presents a brief review of the applications of AFM in agriculture that may be categorized into four main topics, namely thin films, research on nanomaterials and nanostructures, biological systems and natural fibers, and soils science. Examples of recent applications will be provided to give the reader a sense of the power of the technique and potential contributions to agriculture.

  13. Spectroscopic Tools Applied to Element Z = 115 Decay Chains

    Directory of Open Access Journals (Sweden)

    Forsberg U.

    2014-03-01

    Full Text Available Nuclides that are considered to be isotopes of element Z = 115 were produced in the reaction 48Ca + 243Am at the GSI Helmholtzzentrum für Schwerionenforschung Darmstadt. The detector setup TASISpec was used. It was mounted behind the gas-filled separator TASCA. Thirty correlated α-decay chains were found, and the energies of the particles were determined with high precision. Two important spectroscopic aspects of the offline data analysis are discussed in detail: the handling of digitized preamplified signals from the silicon strip detectors, and the energy reconstruction of particles escaping to upstream detectors relying on pixel-by-pixel dead-layer thicknesses.

  14. 29 CFR 1915.133 - Hand tools.

    Science.gov (United States)

    2010-07-01

    29 CFR 1915.133 (Occupational Safety and Health Standards for Shipyard Employment, Tools and Related Equipment), Hand tools: The provisions of this section shall apply to ship repairing, shipbuilding and shipbreaking.

  15. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  16. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

    RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  18. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  19. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
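
    The claimed architecture can be sketched as a set of cooperating components. This is a hypothetical outline using the abstract's own vocabulary (component library, analysis engine, energy conservation measures, recommendation tool); every class, method, and number below is invented for illustration:

```python
# Hypothetical sketch of the described system: a component library feeds a
# building model, an analysis engine computes a baseline and applies energy
# conservation measures (ECMs), and a recommendation tool assesses the
# optimized model against the baseline. All names and values are invented.

class BuildingComponentLibrary:
    def __init__(self):
        self._components = {}          # component name -> annual energy use (kWh)

    def add(self, name, annual_kwh):
        self._components[name] = annual_kwh

    def get(self, name):
        return self._components[name]

class AnalysisEngine:
    @staticmethod
    def baseline(model):
        return sum(model.values())     # total annual energy use of the model

    @staticmethod
    def apply_ecm(model, component, reduction):
        optimized = dict(model)        # leave the baseline model untouched
        optimized[component] *= (1.0 - reduction)
        return optimized

class RecommendationTool:
    @staticmethod
    def assess(baseline_kwh, optimized_kwh, label):
        savings = baseline_kwh - optimized_kwh
        return f"{label}: saves {savings:.0f} kWh/yr"

library = BuildingComponentLibrary()
library.add("single-pane windows", 12000)
library.add("gas furnace", 30000)

model = {"single-pane windows": library.get("single-pane windows"),
         "gas furnace": library.get("gas furnace")}
base = AnalysisEngine.baseline(model)
upgraded = AnalysisEngine.apply_ecm(model, "single-pane windows", 0.40)
print(RecommendationTool.assess(base, AnalysisEngine.baseline(upgraded),
                                "low-e window retrofit"))
```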

  20. Applying SF-Based Genre Approaches to English Writing Class

    Science.gov (United States)

    Wu, Yan; Dong, Hailin

    2009-01-01

    By exploring genre approaches in systemic functional linguistics and examining the analytic tools that can be applied to the process of English learning and teaching, this paper seeks to find a way of applying genre approaches to English writing class.

  1. What Metadata Principles Apply to Scientific Data?

    Science.gov (United States)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  2. Cyber Security Evaluation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  3. Software Engineering applied to Manufacturing Problems

    Directory of Open Access Journals (Sweden)

    Jorge A. Ruiz-Vanoye

    2010-05-01

    Full Text Available Optimization approaches have traditionally been viewed as tools for solving manufacturing problems; however, optimization is not suitable for many problems arising in modern manufacturing systems because of their complexity and the involvement of qualitative factors. In this paper we apply a software engineering tool to manufacturing problems. We use the HeuristicLab software to determine and analyze the solutions obtained for manufacturing problems.

  4. Rib forming tool for tubing

    Science.gov (United States)

    Rowley, James P.; Lewandowski, Edward F.; Groh, Edward F.

    1976-01-01

    Three cylindrical rollers are rotatably mounted equidistant from the center of a hollow tool head on radii spaced 120.degree. apart. Each roller has a thin flange; the three flanges lie in a single plane to form an internal circumferential rib in a rotating tubular workpiece. The tool head has two complementary parts with two rollers in one part of the head and one roller in the other part; the two parts are joined by a hinge. A second hinge, located so the rollers are between the two hinges, connects one of the parts to a tool bar mounted in a lathe tool holder. The axes of rotation of both hinges and all three rollers are parallel. A hole exposing equal portions of the three roller flanges is located in the center of the tool head. The two hinges permit the tool head to be opened and rotated slightly downward, taking the roller flanges out of the path of the workpiece which is supported on both ends and rotated by the lathe. The parts of the tool head are then closed on the workpiece so that the flanges are applied to the workpiece and form the rib. The tool is then relocated for forming of the next rib.

  5. Journal of applied mathematics

    National Research Council Canada - National Science Library

    2001-01-01

    "[The] Journal of Applied Mathematics is a refereed journal devoted to the publication of original research papers and review articles in all areas of applied, computational, and industrial mathematics...

  6. Sheet Bending using Soft Tools

    Science.gov (United States)

    Sinke, J.

    2011-05-01

    Sheet bending is usually performed by air bending and V-die bending processes. Both processes apply rigid tools. These solid tools facilitate the generation of software for the numerical control of those processes. When the lower rigid die is replaced with a soft or rubber tool, the numerical control becomes much more difficult, since the soft tool deforms too. Compared to other bending processes, the rubber-backed bending process has some distinct advantages, like large radius-to-thickness ratios, applicability to materials with topcoats, well-defined radii, and the feasibility of forming details (ridges, beads). These advantages may give the process exclusive benefits over conventional bending processes, not only for industries related to mechanical engineering and sheet metal forming, but also for other disciplines like Architecture and Industrial Design. The largest disadvantage is that the soft (rubber) tool also deforms. Although the tool deformation is elastic and recovers after each process cycle, the applied force during bending is related to the deformation of the metal sheet and the deformation of the rubber. The deformation of the rubber interacts with the process but also with sheet parameters. This makes the numerical control of the process much more complicated. This paper presents a model for the bending of sheet materials using a rubber lower die. This model can be implemented in software in order to control the bending process numerically. The model itself is based on numerical and experimental research. In this research a number of variables related to the tooling and the material have been evaluated. The numerical part of the research was used to investigate the influence of the features of the soft lower tool, like the hardness and dimensions, and the influence of the sheet thickness, which also interacts with the soft tool deformation. The experimental research was focused on the relation between the machine control parameters and the most
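
    The interaction the abstract describes, where punch force depends on both sheet and rubber deformation, can be caricatured with a toy series-spring model. This is purely illustrative and is not the paper's model, which is derived from numerical and experimental work; both stiffness values are invented:

```python
# Toy illustration of why a soft lower die complicates numerical control:
# the punch travel splits between bending the sheet and compressing the
# rubber, modeled here as two linear springs in series. Stiffness values
# are invented placeholders.

def punch_force(displacement_mm, k_sheet=800.0, k_rubber=200.0):
    """Force (N) for a given punch displacement when sheet and rubber
    deform together; series springs: 1/k_eff = 1/k_sheet + 1/k_rubber."""
    k_eff = 1.0 / (1.0 / k_sheet + 1.0 / k_rubber)
    return k_eff * displacement_mm

def sheet_deflection(displacement_mm, k_sheet=800.0, k_rubber=200.0):
    """Share of the punch travel that actually bends the sheet."""
    f = punch_force(displacement_mm, k_sheet, k_rubber)
    return f / k_sheet

# With a rigid die all 5 mm of travel would bend the sheet; with the soft
# die most of it compresses the rubber instead:
print(sheet_deflection(5.0))   # only 1.0 mm of sheet bending out of 5 mm
```

    Because the split depends on the rubber's stiffness, which in turn varies with hardness, dimensions, and sheet thickness, the control software cannot treat punch travel as bend depth, which is the complication the paper's model addresses.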

  7. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  8. Advances in Applied Mechanics

    OpenAIRE

    2014-01-01

    Advances in Applied Mechanics draws together recent significant advances in various topics in applied mechanics. Published since 1948, Advances in Applied Mechanics aims to provide authoritative review articles on topics in the mechanical sciences, primarily of interest to scientists and engineers working in the various branches of mechanics, but also of interest to the many who use the results of investigations in mechanics in various application areas, such as aerospace, chemical, civil, en...

  9. Perspectives on Applied Ethics

    OpenAIRE

    2007-01-01

    Applied ethics is a growing, interdisciplinary field dealing with ethical problems in different areas of society. It includes for instance social and political ethics, computer ethics, medical ethics, bioethics, envi-ronmental ethics, business ethics, and it also relates to different forms of professional ethics. From the perspective of ethics, applied ethics is a specialisation in one area of ethics. From the perspective of social practice applying eth-ics is to focus on ethical aspects and ...

  10. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...... of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent....

  11. 29 CFR 1915.132 - Portable electric tools.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Portable electric tools. 1915.132 Section 1915.132 Labor... § 1915.132 Portable electric tools. The provisions of this section shall apply to ship repairing... frames of portable electric tools and appliances, except double insulated tools approved by...

  12. Applied Neuroscience Laboratory Complex

    Data.gov (United States)

    Federal Laboratory Consortium — Located at WPAFB, Ohio, the Applied Neuroscience lab researches and develops technologies to optimize Airmen individual and team performance across all AF domains....

  13. Vygotsky in applied neuropsychology

    National Research Council Canada - National Science Library

    Glozman, Janna M

    2016-01-01

    ...) to analyze the development of these theories in contemporary applied neuropsychology. An analysis of disturbances of mental functioning is impossible without a systemic approach to the evidence observed...

  14. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  15. Evolution of the lignite industry in the Rhineland across 100 years of 'Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation''; Entwicklung der Braunkohlenindustrie Im Rheinland im Spiegel von 100 Jahren ''Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation''

    Energy Technology Data Exchange (ETDEWEB)

    Hartung, M. [RWE Power AG, Ressort Braunkohlengewinnung, -stromerzeugung und -veredlung, Koeln (Germany); Kulik, L. [RWE Power AG, Bereich Tagebauplanung und -genehmigung, Koeln (Germany). PBT Bereich Tagebauplanung und -genehmigung; Gaertner, D. [RWE Power AG, Sparte Tagebaue, Bergheim (Germany)

    2008-09-15

    Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation was set up in the year 1908, and this event heralded an unprecedented streamlining of the lignite industry as it grew and developed in the Rhenish mining area. It paved the way for a bundling of forces that had become necessary in the sector's recent history if it was going to successfully face the upcoming challenges on the commodity and energy markets and in the mining sector. The amalgamation of the many Rhenish mining companies also enabled the industry to safeguard its interests more forcefully and effectively in its dealings with other market players and policy-makers. In the light of today's discussions about raw-material shortages, sustainable energy supply, climate protection and emissions trading, lignite is again facing huge challenges, just as it once did. Even if these are new and of a different nature, the course taken 100 years ago may provide pointers for the future alignment of the Rhenish lignite-mining industry. In view of the ongoing efforts to gain acceptance among citizens, policy-makers and business, backed by an internal promotion of knowledge transfer and further education, the systematic use of technical progress to improve the competitive situation, and a far-sighted gearing of the product portfolio to growth markets, Rhenish lignite is creating a sound basis that will enable it to successfully meet tomorrow's challenges in a time-tested manner. (orig.)

  16. Refrigerated cutting tools improve machining of superalloys

    Science.gov (United States)

    Dudley, G. M.

    1971-01-01

    Freon-12 applied to tool cutting edge evaporates quickly, leaves no residue, and permits higher cutting rate than with conventional coolants. This technique increases cutting rate on Rene-41 threefold and improves finish of machined surface.

  17. Applied Parallel Metadata Indexing

    Energy Technology Data Exchange (ETDEWEB)

    Jacobi, Michael R [Los Alamos National Laboratory

    2012-08-01

    The GPFS Archive is a parallel archive used by hundreds of users in the Turquoise collaboration network. It houses 4+ petabytes of data in more than 170 million files. Currently, users must navigate the file system to retrieve their data, requiring them to remember file paths and names. A better solution might allow users to tag data with meaningful labels and search the archive using standard and user-defined metadata, while maintaining security. Last summer, I developed the backend to a tool that adheres to these design goals. The backend works by importing GPFS metadata into a MongoDB cluster, which is then indexed on each attribute. This summer, the author implemented security and developed the user interface for the search tool. To meet security requirements, each database table is associated with a single user, stores only records that the user may read, and requires a set of credentials to access. The interface to the search tool is implemented using FUSE (Filesystem in USErspace). FUSE is an intermediate layer that intercepts file system calls and allows the developer to redefine how those calls behave. In the case of this tool, FUSE interfaces with MongoDB to issue queries and populate output. A FUSE implementation is desirable because it allows users to interact with the search tool using commands they are already familiar with. These security and interface additions are essential for a usable product.
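
    The per-user security model described above can be sketched with a plain in-memory index. This is a hypothetical stand-in, not the actual tool: the real backend uses MongoDB tables and a FUSE frontend, and every name, path, and attribute below is invented:

```python
# Hypothetical sketch of the described design: per-user metadata tables that
# store only records the user may read, searchable by standard and
# user-defined attributes. A dict stands in for the MongoDB cluster.

from collections import defaultdict

class MetadataIndex:
    def __init__(self):
        self._tables = defaultdict(list)   # user -> list of metadata records

    def import_record(self, user, path, **attrs):
        # Security model from the abstract: a record lands only in the
        # table of a user allowed to read it.
        self._tables[user].append({"path": path, **attrs})

    def search(self, user, **criteria):
        # Match user-supplied metadata criteria against the user's own table.
        return [r["path"] for r in self._tables[user]
                if all(r.get(k) == v for k, v in criteria.items())]

index = MetadataIndex()
index.import_record("alice", "/archive/run42/output.h5",
                    project="turquoise", label="final")
index.import_record("alice", "/archive/run43/output.h5",
                    project="turquoise", label="draft")
index.import_record("bob", "/archive/other.dat", project="other")

print(index.search("alice", label="final"))   # ['/archive/run42/output.h5']
print(index.search("bob", label="final"))     # [] -- bob sees only his table
```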

  18. What are applied ethics?

    Science.gov (United States)

    Allhoff, Fritz

    2011-03-01

    This paper explores the relationships that various applied ethics bear to each other, both in particular disciplines and more generally. The introductory section lays out the challenge of coming up with such an account and, drawing a parallel with the philosophy of science, offers that applied ethics may either be unified or disunified. The second section develops one simple account through which applied ethics are unified, vis-à-vis ethical theory. However, this is not taken to be a satisfying answer, for reasons explained. In the third section, specific applied ethics are explored: biomedical ethics; business ethics; environmental ethics; and neuroethics. These are chosen not to be comprehensive, but rather for their traditions or other illustrative purposes. The final section draws together the results of the preceding analysis and defends a disunity conception of applied ethics.

  19. Fire behavior modeling-a decision tool

    Science.gov (United States)

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  20. Industrial biotechnology: tools and applications.

    Science.gov (United States)

    Tang, Weng Lin; Zhao, Huimin

    2009-12-01

    Industrial biotechnology involves the use of enzymes and microorganisms to produce value-added chemicals from renewable sources. Because of its association with reduced energy consumption, greenhouse gas emissions, and waste generation, industrial biotechnology is a rapidly growing field. Here we highlight a variety of important tools for industrial biotechnology, including protein engineering, metabolic engineering, synthetic biology, systems biology, and downstream processing. In addition, we show how these tools have been successfully applied in several case studies, including the production of 1,3-propanediol, lactic acid, and biofuels. It is expected that industrial biotechnology will be increasingly adopted by chemical, pharmaceutical, food, and agricultural industries.

  1. Special Functions for Applied Scientists

    CERN Document Server

    Mathai, A M

    2008-01-01

    Special Functions for Applied Scientists provides the required mathematical tools for researchers active in the physical sciences. The book presents a full suite of elementary functions for scholars at the PhD level, covers a wide array of topics, and begins by introducing elementary classical special functions. From there, differential equations and some applications into statistical distribution theory are examined. The fractional calculus chapter covers fractional integrals and fractional derivatives as well as their applications to reaction-diffusion problems in physics, input-output analysis, Mittag-Leffler stochastic processes and related topics. The authors then cover q-hypergeometric functions, Ramanujan's work and Lie groups. The latter half of this volume presents applications into stochastic processes, random variables, Mittag-Leffler processes, density estimation, order statistics, and problems in astrophysics. Professor Dr. A.M. Mathai is Emeritus Professor of Mathematics and Statistics, McGill ...

  2. Biomimetics: process, tools and practice.

    Science.gov (United States)

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

    Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a stated process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present the attempt of a utility tree which, once finalized, could be used to guide users through the process by choosing appropriate tools according to their own expertise. The aim of this paper is to foster the dialogue and facilitate a closer collaboration within the field of biomimetics.

  3. Tool Changer For Robot

    Science.gov (United States)

    Voellmer, George M.

    1992-01-01

    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  4. Route Availabililty Planning Tool -

    Data.gov (United States)

    Department of Transportation — The Route Availability Planning Tool (RAPT) is a weather-assimilated decision support tool (DST) that supports the development and execution of departure management...

  5. Applied eye tracking research

    NARCIS (Netherlands)

    Jarodzka, Halszka

    2011-01-01

    Jarodzka, H. (2010, 12 November). Applied eye tracking research. Presentation and Labtour for Vereniging Gewone Leden in oprichting (VGL i.o.), Heerlen, The Netherlands: Open University of the Netherlands.

  6. Mesothelioma Applied Research Foundation

    Science.gov (United States)


  8. Computer and Applied Ethics

    OpenAIRE

    越智, 貢

    2014-01-01

    With this essay I treat some problems raised by the new developments in science and technology, that is, those of Computer Ethics, to show how and how far Applied Ethics differs from traditional ethics. I take up the backgrounds on which Computer Ethics rests, particularly the historical conditions of morality. Differences of conditions in time and space explain how Computer Ethics and Applied Ethics depart from traditional ethics in concrete cases. But I also investigate the normative rea...

  9. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process...... knowledge is central in PAT projects. This presentation therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: - On-line sensors, where for example spectroscopic measurements are increasingly applied - Mechanistic models, which can be used...

  10. Navigating Towards Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2006-01-01

    like opposites, the term tectonics deals with creating a meaningful relationship between the two. The aim of this paper is to investigate what a digital tectonic tool could be and what relationship with technology it should represent. An understanding of this relationship can help us not only...... to understand the conflicts in architecture and the building industry but also bring us further into a discussion of how architecture can use digital tools. The investigation is carried out firstly by approaching the subject theoretically through the term tectonics and by setting up a model of the values...... a tectonic tool should encompass. Secondly the ability and validity of the model are shown by applying it to a case study of Jørn Utzon’s work on Minor Hall in Sydney Opera House - for the sake of exemplification the technical field focused on in this paper is room acoustics. Thirdly the relationship between...

  11. Evaluating tidal marsh sustainability in the face of sea-level rise: a hybrid modeling approach applied to San Francisco Bay.

    Directory of Open Access Journals (Sweden)

    Diana Stralberg

    Full Text Available BACKGROUND: Tidal marshes will be threatened by increasing rates of sea-level rise (SLR over the next century. Managers seek guidance on whether existing and restored marshes will be resilient under a range of potential future conditions, and on prioritizing marsh restoration and conservation activities. METHODOLOGY: Building upon established models, we developed a hybrid approach that involves a mechanistic treatment of marsh accretion dynamics and incorporates spatial variation at a scale relevant for conservation and restoration decision-making. We applied this model to San Francisco Bay, using best-available elevation data and estimates of sediment supply and organic matter accumulation developed for 15 Bay subregions. Accretion models were run over 100 years for 70 combinations of starting elevation, mineral sediment, organic matter, and SLR assumptions. Results were applied spatially to evaluate eight Bay-wide climate change scenarios. PRINCIPAL FINDINGS: Model results indicated that under a high rate of SLR (1.65 m/century, short-term restoration of diked subtidal baylands to mid marsh elevations (-0.2 m MHHW could be achieved over the next century with sediment concentrations greater than 200 mg/L. However, suspended sediment concentrations greater than 300 mg/L would be required for 100-year mid marsh sustainability (i.e., no elevation loss. Organic matter accumulation had minimal impacts on this threshold. Bay-wide projections of marsh habitat area varied substantially, depending primarily on SLR and sediment assumptions. Across all scenarios, however, the model projected a shift in the mix of intertidal habitats, with a loss of high marsh and gains in low marsh and mudflats. CONCLUSIONS/SIGNIFICANCE: Results suggest a bleak prognosis for long-term natural tidal marsh sustainability under a high-SLR scenario. 
To minimize marsh loss, we recommend conserving adjacent uplands for marsh migration, redistributing dredged sediment to raise
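
    The accretion-versus-SLR balance described above can be sketched as a toy elevation model: each year the marsh gains elevation from mineral deposition (proportional to suspended sediment concentration and inundation depth) plus organic accumulation, and loses relative elevation to sea-level rise. The coefficients below are illustrative placeholders, not values from the study.

```python
# Toy marsh-elevation model in the spirit of the hybrid accretion approach
# described above. k_mineral and organic are illustrative placeholders,
# not coefficients from the study.

def simulate_marsh(elev0_m, ssc_mg_l, slr_m_per_century, years=100):
    """Track marsh elevation (m relative to MHHW) over `years` years."""
    elev = elev0_m
    slr_per_year = slr_m_per_century / 100.0
    k_mineral = 2.6e-4   # m/yr of mineral accretion per (mg/L x m inundation depth)
    organic = 0.001      # m/yr of organic accretion
    for _ in range(years):
        depth = max(0.0, -elev)  # inundation depth below MHHW drives deposition
        elev += k_mineral * ssc_mg_l * depth + organic - slr_per_year
    return elev

# Restored mid marsh (-0.2 m MHHW) under 1.65 m/century SLR:
low = simulate_marsh(-0.2, ssc_mg_l=100, slr_m_per_century=1.65)
high = simulate_marsh(-0.2, ssc_mg_l=300, slr_m_per_century=1.65)
```

    With these made-up coefficients, a marsh starting at -0.2 m MHHW roughly holds its elevation at 300 mg/L but drowns progressively at 100 mg/L, qualitatively echoing the sediment thresholds reported above.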

  12. PSYCHOANALYSIS AS APPLIED AESTHETICS.

    Science.gov (United States)

    Richmond, Stephen H

    2016-07-01

    The question of how to place psychoanalysis in relation to science has been debated since the beginning of psychoanalysis and continues to this day. The author argues that psychoanalysis is best viewed as a form of applied art (also termed applied aesthetics) in parallel to medicine as applied science. This postulate draws on a functional definition of modernity as involving the differentiation of the value spheres of science, art, and religion. The validity criteria for each of the value spheres are discussed. Freud is examined, drawing on Habermas, and seen to have erred by claiming that the psychoanalytic method is a form of science. Implications for clinical and metapsychological issues in psychoanalysis are discussed.

  13. Applied chemical engineering thermodynamics

    CERN Document Server

    Tassios, Dimitrios P

    1993-01-01

    Applied Chemical Engineering Thermodynamics provides the undergraduate and graduate student of chemical engineering with the basic knowledge, the methodology and the references needed to apply it in industrial practice. Thus, in addition to the classical topics of the laws of thermodynamics, pure component and mixture thermodynamic properties as well as phase and chemical equilibria, the reader will find: - history of thermodynamics - energy conservation - intermolecular forces and molecular thermodynamics - cubic equations of state - statistical mechanics. A great number of calculated problems with solutions and an appendix with numerous tables of numbers of practical importance are extremely helpful for applied calculations. The computer programs on the included disk help the student to become familiar with the typical methods used in industry for volumetric and vapor-liquid equilibria calculations.

  14. A novel tool-use mode in animals: New Caledonian crows insert tools to transport objects.

    Science.gov (United States)

    Jacobs, Ivo F; von Bayern, Auguste; Osvath, Mathias

    2016-11-01

    New Caledonian crows (Corvus moneduloides) rely heavily on a range of tools to extract prey. They manufacture novel tools, save tools for later use, and have morphological features that facilitate tool use. We report six observations, in two individuals, of a novel tool-use mode not previously reported in non-human animals. Insert-and-transport tool use involves inserting a stick into an object and then moving away, thereby transporting both object and tool. All transported objects were non-food objects. One subject used a stick to transport an object that was too large to be handled by beak, which suggests the tool facilitated object control. The function in the other cases is unclear but seems to be an expression of play or exploration. Further studies should investigate whether it is adaptive in the wild and to what extent crows can flexibly apply the behaviour in experimental settings when purposive transportation of objects is advantageous.

  15. Applied Astronomy: Asteroid Prospecting

    Science.gov (United States)

    Elvis, M.

    2013-09-01

    In the age of asteroid mining the ability to find promising ore-bearing bodies will be valuable. This will give rise to a new discipline: "Applied Astronomy". Just as most geologists work in industry, not in academia, the same will be true of astronomers. Just how rare or common ore-rich asteroids are likely to be, and the skills needed to assay their value, are discussed here, with an emphasis on remote - telescopic - methods. Also considered are the resources needed to conduct extensive surveys of asteroids for prospecting purposes, and the cost and timescale involved. The longer-term need for applied astronomers is also covered.

  16. Applied mathematics made simple

    CERN Document Server

    Murphy, Patrick

    1982-01-01

    Applied Mathematics: Made Simple provides an elementary study of the three main branches of classical applied mathematics: statics, hydrostatics, and dynamics. The book begins with discussion of the concepts of mechanics, parallel forces and rigid bodies, kinematics, motion with uniform acceleration in a straight line, and Newton's law of motion. Separate chapters cover vector algebra and coplanar motion, relative motion, projectiles, friction, and rigid bodies in equilibrium under the action of coplanar forces. The final chapters deal with machines and hydrostatics. The standard and conte

  17. Retransmission Steganography Applied

    CERN Document Server

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    2010-01-01

    This paper presents experimental results of the implementation of network steganography method called RSTEG (Retransmission Steganography). The main idea of RSTEG is to not acknowledge a successfully received packet to intentionally invoke retransmission. The retransmitted packet carries a steganogram instead of user data in the payload field. RSTEG can be applied to many network protocols that utilize retransmissions. We present experimental results for RSTEG applied to TCP (Transmission Control Protocol) as TCP is the most popular network protocol which ensures reliable data transfer. The main aim of the performed experiments was to estimate RSTEG steganographic bandwidth and detectability by observing its influence on the network retransmission level.
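
    The mechanism can be illustrated with a minimal, protocol-free simulation: the receiver deliberately withholds ACKs for chosen segments, and the resulting "retransmissions" carry steganogram chunks instead of user data. All names and the in-memory message passing below are illustrative; real RSTEG operates on actual TCP segments.

```python
# Minimal, protocol-free simulation of the RSTEG idea: the receiver skips
# the ACK for chosen segments, and the sender's retransmission of each
# unacknowledged segment carries a steganogram chunk in the payload field
# instead of user data. TCP headers, timers and checksums are omitted.

def rsteg_exchange(segments, steganogram, trigger_indices):
    """Return (user_data, hidden_data) recovered by the receiver."""
    data_received, hidden_received = [], []
    hidden_iter = iter(steganogram)
    for i, payload in enumerate(segments):
        data_received.append(payload)    # first transmission delivers user data
        if i in trigger_indices:
            # ACK deliberately withheld: sender "retransmits", but the
            # retransmitted payload is a steganogram chunk
            hidden_received.append(next(hidden_iter))
        # otherwise the segment is acknowledged and never retransmitted
    return "".join(data_received), "".join(hidden_received)

data, hidden = rsteg_exchange(
    segments=["Hel", "lo ", "wor", "ld!"],
    steganogram=["se", "cret"],
    trigger_indices={1, 3},
)
# data == "Hello world!", hidden == "secret"
```

    The sketch also makes the detectability trade-off visible: every hidden chunk costs one extra retransmission, which is exactly the network-level signal the paper's experiments measure.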

  18. Applied Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Applied Electromagnetism and Materials picks up where the author's Basic Electromagnetism and Materials left off by presenting practical and relevant technological information about electromagnetic material properties and their applications. This book is aimed at senior undergraduate and graduate students as well as researchers in materials science and is the product of many years of teaching basic and applied electromagnetism. Topics range from the spectroscopy and characterization of dielectrics and semiconductors, to non-linear effects and electromagnetic cavities, to ion-beam applications in materials science.

  19. Introduction to applied thermodynamics

    CERN Document Server

    Helsdon, R M; Walker, G E

    1965-01-01

    Introduction to Applied Thermodynamics is an introductory text on applied thermodynamics and covers topics ranging from energy and temperature to reversibility and entropy, the first and second laws of thermodynamics, and the properties of ideal gases. Standard air cycles and the thermodynamic properties of pure substances are also discussed, together with gas compressors, combustion, and psychrometry. This volume is comprised of 16 chapters and begins with an overview of the concept of energy as well as the macroscopic and molecular approaches to thermodynamics. The following chapters focus o

  20. On applying cognitive psychology.

    Science.gov (United States)

    Baddeley, Alan

    2013-11-01

    Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study, to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation make the attempt to combine basic and applied research very rewarding.

  1. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  2. Advances in applied mechanics

    CERN Document Server

    Wu, Theodore Y

    2000-01-01

    This highly acclaimed series provides survey articles on the present state and future direction of research in important branches of applied solid and fluid mechanics. Mechanics is defined as a branch of physics that focuses on motion and on the reaction of physical systems to internal and external forces.

  3. Essays on Applied Microeconomics

    Science.gov (United States)

    Mejia Mantilla, Carolina

    2013-01-01

    Each chapter of this dissertation studies a different question within the field of Applied Microeconomics. The first chapter examines the mid- and long-term effects of the 1998 Asian Crisis on the educational attainment of Indonesian children ages 6 to 18, at the time of the crisis. The effects are identified as deviations from a linear trend for…

  5. Applying Literature to ELT

    Institute of Scientific and Technical Information of China (English)

    翟悦

    2007-01-01

    Literature is no longer a frightening word to English language learners. Interactive teaching methods and attractive activities can help motivate Chinese university English learners. This essay first elaborates on the reasons to use literature in ELT (English Language Teaching) classes and then shows how to apply literature to the ELT class.

  6. Applied Music (Individual Study).

    Science.gov (United States)

    Texas Education Agency, Austin.

    Background information and resources to help students in grades 9-12 in Texas pursue an individual study contract in applied music are presented. To fulfill a contract students must publicly perform from memory, with accompaniment as specified, three selections from a list of approved music for their chosen field (instrument or voice). Material…

  7. Africa and Applied Linguistics.

    Science.gov (United States)

    Makoni, Sinfree, Ed.; Meinhof, Ulrike H., Ed.

    2003-01-01

    This collection of articles includes: "Introducing Applied Linguistics in Africa" (Sinfree Makoni and Ulrike H. Meinhof); "Language Ideology and Politics: A Critical Appraisal of French as Second Official Language in Nigeria" (Tope Omoniyi); "The Democratisation of Indigenous Languages: The Case of Malawi" (Themba…

  8. Management Applied to Research

    Directory of Open Access Journals (Sweden)

    Gregoria J. Castañeda M.

    2007-04-01

    Full Text Available This article discusses the role that research and management play in the construction of new knowledge in response to social demands in a dynamic and complex environment. It begins with a bibliographic review of management, research, and the structure of the research process in order to build a referential theoretical framework for research management, with its phases and principles, associating methodological and managerial aspects with each stage of the process. A series of approaches, tendencies and managerial tools that contribute to the efficacy, efficiency and effectiveness of research are identified.

  9. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...

  10. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  11. Applied data mining for business and industry

    CERN Document Server

    Giudici, Paolo

    2009-01-01

    The increasing availability of data in our current, information overloaded society has led to the need for valid tools for its modelling and analysis. Data mining and applied statistical methods are the appropriate tools to extract knowledge from such data. This book provides an accessible introduction to data mining methods in a consistent and application oriented statistical framework, using case studies drawn from real industry projects and highlighting the use of data mining methods in a variety of business applications. Introduces data mining methods and applications. Covers classical and Bayesian multivariate statistical methodology as well as machine learning and computational data mining methods. Includes many recent developments such as association and sequence rules, graphical Markov models, lifetime value modelling, credit risk, operational risk and web mining. Features detailed case studies based on applied projects within industry. Incorporates discussion of data mining software, with case studies a...

  12. Applying the WEAP Model to Water Resource

    DEFF Research Database (Denmark)

    Gao, Jingjing; Christensen, Per; Li, Wei

    Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA in assessing the effects on water resources, using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case, the WEAP model (Water Evaluation And Planning System) was used to simulate various scenarios using a diversity of technological instruments like irrigation efficiency, treatment and reuse of water. The WEAP model was applied to the Ordos catchment, where it was used for the first time in China. The changes in water resource utilization in the Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource assessment...
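
    The kind of scenario comparison described above reduces to water-balance arithmetic: adjust each sectoral demand for the efficiency or reuse assumption of the scenario, then compare the total against supply. The sketch below uses invented figures, not data from the Ordos case.

```python
# Hedged sketch of WEAP-style scenario arithmetic: adjust sectoral demands
# for efficiency and reuse assumptions, then compare totals against supply.
# All figures (in million m^3 per year) are invented for illustration.

def unmet_demand(supply_mcm, demands_mcm, irrigation_efficiency, reuse_fraction):
    """Unmet demand (MCM/yr) after technology adjustments; 0 if supply suffices."""
    irrigation, industry, domestic = demands_mcm
    irrigation_net = irrigation / irrigation_efficiency  # losses raise withdrawals
    industry_net = industry * (1.0 - reuse_fraction)     # reused water offsets intake
    total = irrigation_net + industry_net + domestic
    return max(0.0, total - supply_mcm)

baseline = unmet_demand(900.0, (500.0, 300.0, 100.0),
                        irrigation_efficiency=0.55, reuse_fraction=0.0)
improved = unmet_demand(900.0, (500.0, 300.0, 100.0),
                        irrigation_efficiency=0.75, reuse_fraction=0.3)
# improved < baseline: efficiency and reuse shrink the deficit
```

    A WEAP run does this per node and per time step over a river network; the point here is only the direction of the comparison between scenarios.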

  13. Applied Control Systems Design

    CERN Document Server

    Mahmoud, Magdi S

    2012-01-01

    Applied Control System Design examines several methods for building up systems models based on real experimental data from typical industrial processes and incorporating system identification techniques. The text takes a comparative approach to the models derived in this way judging their suitability for use in different systems and under different operational circumstances. A broad spectrum of control methods including various forms of filtering, feedback and feedforward control is applied to the models and the guidelines derived from the closed-loop responses are then composed into a concrete self-tested recipe to serve as a check-list for industrial engineers or control designers. System identification and control design are given equal weight in model derivation and testing to reflect their equality of importance in the proper design and optimization of high-performance control systems. Readers’ assimilation of the material discussed is assisted by the provision of problems and examples. Most of these e...

  14. Applied evaluative informetrics

    CERN Document Server

    Moed, Henk F

    2017-01-01

    This book focuses on applied evaluative informetric artifacts or topics. It explains the base notions and assumptions of evaluative informetrics by discussing a series of important applications. The structure of the book is therefore not organized by methodological characteristics, but is centered around popular, often discussed or used informetric artifacts - indicators, methodologies, products, databases - or so called hot topics in which informetric indicators play an important role. Most of the artifacts and topics emerged during the past decade. The principal aim of the book is to present a state of the art in applied evaluative informetrics, and to inform the readers about the pros and cons, potentialities and limitations of the use of informetric/bibliometric indicators in research assessment. The book is a continuation of the book Citation Analysis in Research Evaluation (Springer, 2005). It is of interest to non-specialists, especially research students at advanced master level and higher, all thos...

  15. Methods of applied mathematics

    CERN Document Server

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  16. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  17. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  18. Applied Economics in Teaching

    Institute of Scientific and Technical Information of China (English)

    朱红萍

    2009-01-01

    This paper explains some common phenomena in teaching and class management from an economic point of view. Some basic economic principles mentioned therein are: everything has its opportunity cost; the marginal utility of consumption of any kind is diminishing; game theory is everywhere. By applying economic theories to teaching, teachers can better understand students' behavior and thus improve teaching effectiveness and efficiency.

  19. LensTools: Weak Lensing computing tools

    Science.gov (United States)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
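
    One of the measurements listed above, the power spectrum of a convergence map, can be sketched with plain numpy: FFT the map, square the amplitudes, and average in rings of constant |k|. This is not the LensTools API, and the integer-|k| binning is a deliberately simplified convention.

```python
# Ring-averaged power spectrum of a convergence map, sketched with plain
# numpy rather than the LensTools API; the integer-|k| binning is a
# deliberately simplified convention.
import numpy as np

def ring_averaged_power(kappa):
    """Azimuthally averaged 2D power spectrum of a square map."""
    n = kappa.shape[0]
    power2d = np.abs(np.fft.fft2(kappa)) ** 2 / n**2
    kx = np.fft.fftfreq(n) * n                       # integer wavenumbers
    kmag = np.hypot(*np.meshgrid(kx, kx, indexing="ij")).round().astype(int)
    bins = np.arange(1, n // 2)                      # skip the k=0 mean mode
    return np.array([power2d[kmag == k].mean() for k in bins])

rng = np.random.default_rng(0)
spectrum = ring_averaged_power(rng.standard_normal((64, 64)))
# white noise: spectrum is roughly flat at the map variance (about 1 here)
```

    LensTools wraps this kind of measurement with proper multipole binning, survey masks and noise handling, which is why the package is preferable to a hand-rolled estimator in practice.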

  20. Applying WCET Analysis at Architectural Level

    OpenAIRE

    Gilles, Olivier; Hugues, Jérôme

    2008-01-01

    Real-Time embedded systems must enforce strict timing constraints. In this context, computing a precise Worst Case Execution Time (WCET) is a prerequisite to applying scheduling analysis and verifying system viability. WCET analysis is usually a complex and time-consuming activity. It becomes increasingly complex when one also considers code generation strategies from high-level models. In this paper, we present an experiment made on the coupling of the WCET analysis tool Bound-T and our AADL to code ...

  1. Corn nitrogen fertilization rate tools compared over eight Midwest states

    Science.gov (United States)

    Publicly-available nitrogen (N) rate recommendation tools are utilized to help maximize yield in corn production. These tools often fail when N is over-applied, resulting in excess N being lost to the environment, or when N is under-applied, resulting in decreased yield and economic returns. Perfo...

  2. Critical review of prostate cancer predictive tools.

    Science.gov (United States)

    Shariat, Shahrokh F; Kattan, Michael W; Vickers, Andrew J; Karakiewicz, Pierre I; Scardino, Peter T

    2009-12-01

    Prostate cancer is a very complex disease, and the decision-making process requires the clinician to balance clinical benefits, life expectancy, comorbidities and potential treatment-related side effects. Accurate prediction of clinical outcomes may help in the difficult process of making decisions related to prostate cancer. In this review, we discuss attributes of predictive tools and systematically review those available for prostate cancer. Types of tools include probability formulas, look-up and propensity scoring tables, risk-class stratification prediction tools, classification and regression tree analysis, nomograms and artificial neural networks. Criteria to evaluate tools include discrimination, calibration, generalizability, level of complexity, decision analysis and ability to account for competing risks and conditional probabilities. The available predictive tools and their features, with a focus on nomograms, are described. While some tools are well-calibrated, few have been externally validated or directly compared with other tools. In addition, the clinical consequences of applying predictive tools need thorough assessment. Nevertheless, predictive tools can facilitate medical decision-making by showing patients tailored predictions of their outcomes with various alternatives. Additionally, accurate tools may improve clinical trial design.
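
    Two of the evaluation criteria named above can be computed directly. Discrimination is the probability that the tool ranks a randomly chosen event case above a non-event case (the AUC), and calibration-in-the-large compares mean predicted risk with the observed event rate. The predictions and outcomes below are invented for illustration.

```python
# Discrimination (concordance / AUC) and calibration-in-the-large computed
# directly, with invented predictions and outcomes for illustration.

def concordance(probs, outcomes):
    """Probability that an event case is ranked above a non-event case (AUC)."""
    events = [p for p, y in zip(probs, outcomes) if y == 1]
    nonevents = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum((e > n) + 0.5 * (e == n) for e in events for n in nonevents)
    return wins / (len(events) * len(nonevents))

def calibration_gap(probs, outcomes):
    """Mean predicted risk minus observed event rate (0 = perfectly calibrated)."""
    return sum(probs) / len(probs) - sum(outcomes) / len(outcomes)

probs = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 1, 0, 1, 0, 0]
auc = concordance(probs, outcomes)       # 8/9: good but imperfect ranking
gap = calibration_gap(probs, outcomes)   # about 0.05: slight overprediction
```

    A tool can score well on one criterion and poorly on the other, which is why the review treats discrimination and calibration as separate attributes.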

  3. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list is one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  5. Plaster core washout tool

    Science.gov (United States)

    Heisman, R. M.; Keir, A. R.; Teramura, K.

    1977-01-01

    Tool powered by pressurized water or air removes water soluble plaster lining from Kevlar/epoxy duct. Rotating plastic cutterhead with sealed end fitting connects flexible shaft that allows tool to be used with curved ducts.

  6. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  7. Ammonia synthesis industry: Past, present and future--Retrospect, enlightenment and challenge from 100 years of ammonia synthesis industry

    Institute of Scientific and Technical Information of China (English)

    刘化章

    2013-01-01

    The catalytic ammonia synthesis technology invented by Haber and Bosch has reached the 100th anniversary of its founding. The huge success of the ammonia synthesis industry altered the history of world food production, meeting the demand for food created by population growth, and laid the foundations of heterogeneous catalysis and chemical engineering science. Catalytic ammonia synthesis technology played a central role in the development of the chemical industry during the 20th century. This paper reviews the foundation and development of the ammonia synthesis industry and the enlightenment drawn from it, and looks ahead to its future and the new challenges it faces. The classical ammonia synthesis industry is closely related to emerging industries; it can be regarded as their basis, since it embodies a series of high and new technologies. Understanding the process flows and equipment of ammonia synthesis, as well as the mature technologies and practical experience of the ammonia synthesis process, offers strong enlightenment and reference for grasping a series of common key technologies in modern chemical engineering, energy, materials and environmental protection, especially the modern new coal chemical industry.

  8. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  9. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  10. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  11. Applied complex variables

    CERN Document Server

    Dettman, John W

    1965-01-01

    Analytic function theory is a traditional subject going back to Cauchy and Riemann in the 19th century. Once the exclusive province of advanced mathematics students, its applications have proven vital to today's physicists and engineers. In this highly regarded work, Professor John W. Dettman offers a clear, well-organized overview of the subject and various applications - making the often-perplexing study of analytic functions of complex variables more accessible to a wider audience. The first half of Applied Complex Variables, designed for sequential study, is a step-by-step treatment of fun

  12. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
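
    The dichotomous-outcome model the abstract describes can be illustrated with a minimal sketch. The data, learning rate, and plain gradient-descent fit below are hypothetical illustrations, not the book's own examples (the book works with dedicated statistical software).

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Fit a one-predictor logistic regression
    p(y=1|x) = 1 / (1 + exp(-(b0 + b1*x)))
    by plain gradient descent on the negative log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y           # gradient w.r.t. the intercept
            g1 += (p - y) * x     # gradient w.r.t. the slope
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

def predict_prob(b0, b1, x):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Toy dichotomous outcome: larger x makes the outcome more likely.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```

    The point is only the shape of the model highlighted in the abstract: a linear predictor for a dichotomous outcome, squeezed through the logistic function; in practice one would fit it with a statistics package.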

  13. Applied impulsive mathematical models

    CERN Document Server

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

  14. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  15. Applied Chaos Control

    Science.gov (United States)

    Spano, Mark

    1997-04-01

    The publication by Ott, Grebogi and Yorke (E. Ott, C. Grebogi and J. A. Yorke, Phys. Rev. Lett. 64, 1196 (1990)) of their theory of chaos control in 1990 led to an explosion of experimental work applying their theory to mechanical systems and electronic circuits, lasers and chemical reactors, and heart and brain tissue, to name only a few. In this talk the basics of chaos control as implemented in a simple mechanical system will be described, as well as extensions of the method to biological applications. Finally, current advances in the field, including the maintenance of chaos and the control of high dimensional chaos, will be discussed.
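
    The basic OGY idea, applying small parameter perturbations only when the orbit wanders near an unstable fixed point, can be sketched on the logistic map. This is a stand-in system: the talk's mechanical system, and the window and gain values below, are illustrative assumptions, not taken from the record.

```python
def logistic(x, r):
    return r * x * (1.0 - x)

def ogy_step(x, r0, x_star, window=0.01, max_dr=0.15):
    """One iteration of the logistic map with an OGY-style perturbation:
    when the orbit is within `window` of the unstable fixed point,
    nudge the parameter r so the next iterate lands on that point."""
    dr = 0.0
    if abs(x - x_star) < window:
        # The map is linear in r, so this perturbation is exact:
        # f(x, r0 + dr) = f(x, r0) + dr * x * (1 - x) = x_star.
        dr = (x_star - logistic(x, r0)) / (x * (1.0 - x))
        dr = max(-max_dr, min(max_dr, dr))  # keep the perturbation small
    return logistic(x, r0 + dr)

r0 = 3.9                    # chaotic regime of the logistic map
x_star = 1.0 - 1.0 / r0     # unstable fixed point of the map
x = x_star + 0.005          # start inside the control window
for _ in range(100):
    x = ogy_step(x, r0, x_star)
```

    Without the perturbation the fixed point is unstable (|f'(x_star)| = 1.9 > 1) and the same initial condition would wander chaotically; with it, the orbit is pinned to x_star using only tiny parameter changes.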

  16. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
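
    The model-building core described here, fitting y = b0 + b1*x by ordinary least squares and assessing fit with R-squared, reduces to a few closed-form sums. A self-contained sketch on toy data (the data are hypothetical, not from the book):

```python
def ols(xs, ys):
    """Ordinary least squares for y = b0 + b1*x (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx            # slope
    b0 = my - b1 * mx         # intercept
    return b0, b1

def r_squared(xs, ys, b0, b1):
    """Proportion of variance in y explained by the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 1 + 2x
b0, b1 = ols(xs, ys)
```

    On exact data like this the fit recovers the line perfectly and R-squared is 1; real data of course leave residual scatter.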

  17. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  18. Applied energy an introduction

    CERN Document Server

    Abdullah, Mohammad Omar

    2012-01-01

    Introduction to Applied Energy: General Introduction; Energy and Power Basics; Energy Equation; Energy Generation Systems; Energy Storage and Methods; Energy Efficiencies and Losses. Energy Industry and Energy Applications in Small-Medium Enterprise (SME) Industries: Energy Industry; Energy-Intensive Industry; Energy Applications in SME Energy Industries. Energy Sources and Supply: Energy Sources; Energy Supply and Energy Demand; Energy Flow Visualization and Sankey Diagram. Energy Management and Analysis: Energy Audits; Energy Use and Fuel Consumption Study; Energy Life-Cycle Analysis. Energy and Environment: Energy Pollutants, S

  19. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2005-01-01

    Master linear regression techniques with a new edition of a classic text. Reviews of the Second Edition: "I found it enjoyable reading and so full of interesting material that even the well-informed reader will probably find something new . . . a necessity for all of those who do linear regression." -Technometrics, February 1987. "Overall, I feel that the book is a valuable addition to the now considerable list of texts on applied linear regression. It should be a strong contender as the leading text for a first serious course in regression analysis." -American Scientist, May-June 1987

  20. SIFT applied to CBIR

    Directory of Open Access Journals (Sweden)

    ALMEIDA, J.

    2009-12-01

    Content-Based Image Retrieval (CBIR) is a challenging task. Common approaches use only low-level features. Notwithstanding, such CBIR solutions fail to capture some local features representing the details and nuances of scenes. Many techniques in image processing and computer vision can capture these scene semantics. Among them, the Scale Invariant Feature Transform (SIFT) has been widely used in many applications. This approach relies on the choice of several parameters which directly impact its effectiveness when applied to retrieve images. In this paper, we discuss the results obtained in several experiments proposed to evaluate the application of SIFT in CBIR tasks.
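
    The retrieval step such experiments evaluate ultimately reduces to matching local descriptors between a query image and database images. A common filter is Lowe's ratio test, sketched below on tiny synthetic 4-D descriptors; real SIFT descriptors are 128-D and come from a feature-extraction library, and the vectors and 0.8 threshold here are illustrative assumptions only.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def ratio_test_matches(query_desc, db_desc, ratio=0.8):
    """Match each query descriptor to its nearest database descriptor,
    keeping the match only if the nearest neighbour is clearly better
    than the second nearest (Lowe's ratio test)."""
    matches = []
    for qi, q in enumerate(query_desc):
        dists = sorted((euclidean(q, d), di) for di, d in enumerate(db_desc))
        (d1, best), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:
            matches.append((qi, best))
    return matches

db = [[0, 0, 0, 0], [10, 0, 0, 0], [0, 10, 0, 0]]
query = [[0.1, 0, 0, 0],   # clearly closest to db[0]
         [5, 5, 0, 0]]     # ambiguous: rejected by the ratio test
matches = ratio_test_matches(query, db)
```

    The ratio test is exactly the kind of tunable parameter the paper's experiments are concerned with: a lower ratio keeps fewer but more reliable matches.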

  1. MOD Tool (Microwave Optics Design Tool)

    Science.gov (United States)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. 
The MOD Tool client is being developed using Tcl

  3. Applying lean thinking in construction

    Directory of Open Access Journals (Sweden)

    Remon Fayek Aziz

    2013-12-01

    The productivity of the construction industry worldwide has been declining over the past 40 years. One approach to improving the situation is lean construction, which results from applying a new form of production management to construction. Essential features of lean construction include a clear set of objectives for the delivery process, aimed at maximizing performance for the customer at the project level; concurrent design and construction; and the application of project control throughout the life cycle of the project, from design to delivery. An increasing number of construction academics and professionals have been storming the ramparts of conventional construction management in an effort to deliver better value to owners while making real profits. As a result, lean-based tools have emerged and have been successfully applied to simple and complex construction projects. In general, lean construction projects are easier to manage, safer, completed sooner, cost less, and are of better quality. Significant research remains to complete the translation of lean thinking to construction in Egypt. This research discusses the principles, methods, and implementation phases of lean construction, showing the sources of waste in construction and how they can be minimized. The Last Planner System, an important and widely used application of lean construction concepts and methodologies, has proved that it can enhance construction management practices in various respects. The research also aims to develop a methodology for process evaluation and to define areas for improvement based on lean principles.
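
    The Last Planner System's central reliability metric, Percent Plan Complete (PPC), is simple to compute: the share of weekly-plan assignments actually finished. The task names below are hypothetical placeholders, not from the paper.

```python
def percent_plan_complete(planned, completed):
    """Percent Plan Complete (PPC): the percentage of assignments on the
    weekly work plan that were actually finished, the core reliability
    metric of the Last Planner System."""
    done = sum(1 for task in planned if task in completed)
    return 100.0 * done / len(planned)

planned = ["pour slab", "erect frame", "install ducts", "rough-in wiring"]
completed = {"pour slab", "erect frame", "rough-in wiring"}
ppc = percent_plan_complete(planned, completed)
```

    Tracking PPC week over week, together with the reasons tasks were not completed, is what drives the continuous-improvement loop the abstract alludes to.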

  4. Applied Impact Physics Research

    Science.gov (United States)

    Wickert, Matthias

    2013-06-01

    Applied impact physics research rests on the capability to examine impact processes for a wide range of impact conditions with respect to velocity as well as mass and shape of the projectile. For this reason, Fraunhofer EMI operates a large variety of launchers, from single-stage powder guns covering ordnance velocities to two-stage light-gas guns reaching the regime of low-Earth-orbit velocity. For projectile masses of up to 100 g, hypervelocity impact phenomena up to 7.8 km/s can thereby be addressed. Advanced optical diagnostic techniques such as microsecond video are used as commercial systems, but, since impact phenomena are mostly accompanied by debris or dust, specialized diagnostics such as x-ray cinematography and x-ray tomography are developed in-house. Selected topics from the field of applied impact physics will be presented, such as the interesting behavior of long rods penetrating low-density materials, experimental findings at hypervelocity for this class of materials, and new x-ray diagnostic techniques.

  5. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  6. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which, in a systematic way, makes an XML language available...

  7. Environmental management tools: international practices for Russia

    OpenAIRE

    Smetanina, T.; Pintassilgo, P.; Matias, A.

    2014-01-01

    This article deals with the basic tools of environmental management applied by developed countries and discusses its application to Russia. The focus is on environmental management instruments such as environmental taxes, subsidies, standards, permits and also on the important role of voluntary tools. Russian practice is analyzed in terms of the current environmental management situation and the prospects of necessary legislative actions. The article refers to the formation of the basic parts...

  8. Microfluidic tools toward industrial biotechnology.

    Science.gov (United States)

    Oliveira, Aline F; Pessoa, Amanda C S N; Bastos, Reinaldo G; de la Torre, Lucimara G

    2016-11-01

    Microfluidics is a technology that operates with small amounts of fluids and makes possible the investigation of cells, enzymes, and biomolecules, and the encapsulation of biocatalysts, in a greater variety of conditions than conventional methods permit. This review discusses technological possibilities that can be applied in the field of industrial biotechnology, presenting the principal definitions and fundamental aspects of microfluidic parameters needed to understand advanced approaches. Specifically, concentration gradient generators, droplet-based microfluidics, and microbioreactors are explored as useful tools that can contribute to industrial biotechnology. These tools have potential applications, including as commercial platforms, in bioprocess development: screening cells, encapsulating biocatalysts, and determining critical kinetic parameters. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1372-1389, 2016.
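
    One of the tools named, the concentration gradient generator, can be approximated by an idealized equal-flow "tree" mixer model: two inlets (one with solute, one without), where at each stage every interior channel mixes equal flows from its two upstream neighbours. This is a textbook simplification under an equal-flow-splitting assumption, not a model from the review; real devices require careful hydraulic design.

```python
def tree_gradient(stages):
    """Outlet concentrations of an idealized 'tree' gradient generator:
    inlets at c = 1 and c = 0; at each stage the outer channels pass
    through undiluted while each interior channel mixes equal flows
    from its two upstream neighbours."""
    c = [1.0, 0.0]
    for _ in range(stages):
        c = [c[0]] + [(c[i] + c[i + 1]) / 2.0
                      for i in range(len(c) - 1)] + [c[-1]]
    return c

outlets = tree_gradient(3)   # 3 mixing stages -> 5 outlet channels
```

    The outlet profile follows a normalized Pascal's-triangle pattern, a stepwise approximation to a smooth gradient that improves with more stages.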

  9. Formal Assurance Certifiable Tooling Strategy Final Report

    Science.gov (United States)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  10. Applied partial differential equations

    CERN Document Server

    Logan, J David

    2015-01-01

    This text presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs. Emphasis is placed on motivation, concepts, methods, and interpretation, rather than on formal theory. The concise treatment of the subject is maintained in this third edition, covering all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. In this third edition, the text remains intimately tied to applications in heat transfer, wave motion, biological systems, and a variety of other topics in pure and applied science. The text offers flexibility to instructors who, for example, may wish to insert topics from biology or numerical methods at any time in the course. The exposition is presented in a friendly, easy-to-read style, with mathematical ideas motivated from physical problems. Many exercises and worked e...
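
    The diffusion equation mentioned above lends itself to a compact worked example: an explicit finite-difference scheme for u_t = u_xx on [0, 1] with zero boundary values, checked against the exact decaying sine mode. The grid and time-step choices are illustrative assumptions; the book itself emphasizes analytical methods such as eigenfunction expansions.

```python
import math

def heat_explicit(n=20, t_end=0.1, r=0.25):
    """Explicit finite differences for u_t = u_xx on [0, 1] with
    u(0, t) = u(1, t) = 0 and u(x, 0) = sin(pi * x).
    Stable because r = dt/dx**2 <= 1/2."""
    dx = 1.0 / n
    dt = r * dx * dx
    steps = round(t_end / dt)
    u = [math.sin(math.pi * i * dx) for i in range(n + 1)]
    for _ in range(steps):
        # Interior update u_i += r*(u_{i-1} - 2*u_i + u_{i+1}); ends stay 0.
        u = [0.0] + [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
                     for i in range(1, n)] + [0.0]
    return u

u = heat_explicit()
# The sin(pi x) mode decays exactly like exp(-pi^2 t).
exact_mid = math.exp(-math.pi ** 2 * 0.1)
```

    Comparing the midpoint value u[n // 2] against the exact amplitude shows the scheme tracking the analytical eigenfunction solution to within a fraction of a percent.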

  11. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.

  12. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
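
    The "supermarket bar code" application is concrete enough to sketch: EAN-13 codes carry a check digit chosen so that the digit sum, with weights alternating 1 and 3, is a multiple of 10. The sample number below is a commonly cited valid EAN-13, used here purely as an illustration.

```python
def ean13_check_digit(digits12):
    """Check digit for a 12-digit EAN prefix: weight the digits
    1, 3, 1, 3, ... and pick the digit that makes the total
    a multiple of 10."""
    s = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits12))
    return (10 - s % 10) % 10

def is_valid_ean13(code):
    """True if `code` is a 13-digit string whose last digit is the
    correct check digit for its first 12 digits."""
    d = [int(ch) for ch in code]
    return len(d) == 13 and ean13_check_digit(d[:12]) == d[12]
```

    The weighting guarantees that any single-digit scanning error (and most adjacent transpositions) changes the sum modulo 10, so the scanner can reject the read.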

  13. Applied systems theory

    CERN Document Server

    Dekkers, Rob

    2017-01-01

    Offering an up-to-date account of systems theories and its applications, this book provides a different way of resolving problems and addressing challenges in a swift and practical way, without losing overview and grip on the details. From this perspective, it offers a different way of thinking in order to incorporate different perspectives and to consider multiple aspects of any given problem. Drawing examples from a wide range of disciplines, it also presents worked cases to illustrate the principles. The multidisciplinary perspective and the formal approach to modelling of systems and processes of ‘Applied Systems Theory’ makes it suitable for managers, engineers, students, researchers, academics and professionals from a wide range of disciplines; they can use this ‘toolbox’ for describing, analysing and designing biological, engineering and organisational systems as well as getting a better understanding of societal problems. This revised, updated and expanded second edition includes coverage of a...

  14. Brain oxygenation patterns during the execution of tool use demonstration, tool use pantomime, and body-part-as-object tool use.

    Science.gov (United States)

    Helmich, Ingo; Holle, Henning; Rein, Robert; Lausberg, Hedda

    2015-04-01

    Divergent findings exist as to whether left and right hemispheric pre- and postcentral cortices contribute to the production of tool-use-related hand movements. In order to clarify the neural substrates of tool use demonstrations with the tool in hand, tool use pantomimes without the tool in hand, and body-part-as-object presentations of tool use (BPO) in a naturalistic mode of execution, we applied functional Near InfraRed Spectroscopy (fNIRS) in twenty-three right-handed participants. Functional NIRS techniques allow for the investigation of brain oxygenation during the execution of complex hand movements with an unlimited movement range. Brain oxygenation patterns were retrieved from 16 channels of measurement above the pre- and postcentral cortices of each hemisphere. The results showed that tool use demonstration with the tool in hand leads to increased oxygenation, as compared to tool use pantomime, in the left hemispheric somatosensory gyrus. Left hand executions of the demonstration of tool use, pantomime of tool use, and BPO of tool use led to increased oxygenation in the premotor and somatosensory cortices of the left hemisphere as compared to right hand executions of either condition. The results indicate that the premotor and somatosensory cortices of the left hemisphere constitute relevant brain structures for tool-related hand movement production when using the left hand, whereas the somatosensory cortex of the left hemisphere seems to provide specific mental representations when performing tool use demonstrations with the tool in hand.
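
    Oxygenation measures of the kind fNIRS reports are commonly derived via the modified Beer-Lambert law: optical-density changes at two wavelengths form a 2x2 linear system in the oxy- and deoxyhemoglobin concentration changes. The extinction coefficients, path length, and values below are hypothetical placeholders for illustration, not the study's calibration.

```python
def mbll_concentrations(dod, eps, path_length):
    """Modified Beer-Lambert law: recover (dHbO, dHbR) from optical
    density changes at two wavelengths by solving the 2x2 system
        dOD_w = (eps_w_HbO * dHbO + eps_w_HbR * dHbR) * L
    for each wavelength w."""
    (a, b), (c, d) = eps              # rows: wavelengths; cols: HbO, HbR
    y1, y2 = (v / path_length for v in dod)
    det = a * d - b * c               # Cramer's rule on the 2x2 system
    return ((d * y1 - b * y2) / det, (a * y2 - c * y1) / det)

eps = ((1.0, 3.0), (2.5, 0.8))        # hypothetical extinction coefficients
dod = (0.5, 4.6)                      # hypothetical dOD at the two wavelengths
dhbo, dhbr = mbll_concentrations(dod, eps, 5.0)
```

    With these placeholder numbers the solve recovers dHbO = 0.4 and dHbR = -0.1, the typical signature (oxyhemoglobin up, deoxyhemoglobin down) reported as "increased oxygenation" above.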

  15. Miniaturised Spotter-Compatible Multicapillary Stamping Tool for Microarray Printing

    CERN Document Server

    Drobyshev, A L; Zasedatelev, A S; Drobyshev, Alexei L; Verkhodanov, Nikolai N; Zasedatelev, Alexander S

    2007-01-01

    A novel microstamping tool for microarray printing is proposed. The tool can spot up to 127 droplets of different solutions in a single touch. It is easily compatible with commercially available microarray spotters. The tool is based on a multichannel funnel with polypropylene capillaries inserted into its channels. Superior flexibility is achieved by the ability to replace any printing capillary of the tool. As a practical implementation, hydrogel-based microarrays were stamped and successfully applied to identify Mycobacterium tuberculosis drug resistance.

  16. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  17. QOS-Enabled Networks Tools and Foundations

    CERN Document Server

    Barreiros, Miguel

    2011-01-01

    QOS-Enabled Networks: Tools and Foundations discusses the theory of Quality of Service (QoS) mechanisms in packet networks and shows how to apply them in practice. It explains the QoS management features found in modern routers used by Internet Service Providers (ISPs) and large companies. The goal of this book is to provide the background and tools to help network managers and engineers configure production networks to provide services with QoS guarantees.

  18. Capacitive tool standoff sensor for dismantlement tasks

    Energy Technology Data Exchange (ETDEWEB)

    Schmitt, D.J.; Weber, T.M. [Sandia National Labs., Albuquerque, NM (United States); Liu, J.C. [Univ. of Illinois, Urbana, IL (United States)

    1996-12-31

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters.
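
    The physics behind such a sensor can be illustrated with the ideal parallel-plate model C = eps0 * A / d: measured capacitance rises as standoff shrinks, so capacitance can be inverted to distance. This is a textbook simplification for illustration only; the actual system combines four sensor elements and calibration, which this sketch does not attempt.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance_from_standoff(d, area):
    """Ideal parallel-plate capacitance (F) at standoff d (m),
    plate area `area` (m^2)."""
    return EPS0 * area / d

def standoff_from_capacitance(cap, area):
    """Invert the parallel-plate model: larger capacitance means the
    tool is closer to the (grounded) surface."""
    return EPS0 * area / cap

# Hypothetical sensor pad of 10 cm^2 held 2 cm from a surface.
c_far = capacitance_from_standoff(0.02, 1e-3)
c_near = capacitance_from_standoff(0.01, 1e-3)
```

    Halving the standoff doubles the capacitance in this model, which is why capacitive sensing gives usable non-contact distance feedback at the few-inch ranges quoted above.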

  19. Applied physiology of cycling.

    Science.gov (United States)

    Faria, I E

    1984-01-01

    Historically, the bicycle has evolved through the stages of a machine for efficient human transportation, a toy for children, a finely-tuned racing machine, and a tool for physical fitness development, maintenance and testing. Recently, major strides have been made in the aerodynamic design of the bicycle. These innovations have resulted in new land speed records for human powered machines. Performance in cycling is affected by a variety of factors, including aerobic and anaerobic capacity, muscular strength and endurance, and body composition. Bicycle races range from a 200m sprint to approximately 5000km. This vast range of competitive racing requires special attention to the principle of specificity of training. The physiological demands of cycling have been examined through the use of bicycle ergometers, rollers, cycling trainers, treadmill cycling, high speed photography, computer graphics, strain gauges, electromyography, wind tunnels, muscle biopsy, and body composition analysis. These techniques have been useful in providing definitive data for the development of a work/performance profile of the cyclist. Research evidence strongly suggests that when measuring the cyclist's aerobic or anaerobic capacity, a cycling protocol employing a high pedalling rpm should be used. The research bicycle should be modified to resemble a racing bicycle and the cyclist should wear cycling shoes. Prolonged cycling requires special nutritional considerations. Ingestion of carbohydrates, in solid form and carefully timed, influences performance. Caffeine appears to enhance lipid metabolism. Injuries, particularly knee problems which are prevalent among cyclists, may be avoided through the use of proper gearing and orthotics. Air pollution has been shown to impair physical performance. When pollution levels are high, training should be altered or curtailed. Effective training programmes simulate competitive conditions. Short and long interval training, blended with long

  20. Enterprise integration: A tool's perspective

    Energy Technology Data Exchange (ETDEWEB)

    Polito, J. [Sandia National Labs., Albuquerque, NM (United States); Jones, A. [National Inst. of Standards and Technology, Gaithersburg, MD (United States); Grant, H. [National Science Foundation, Washington, DC (United States)

    1993-06-01

    The advent of sophisticated automation equipment and computer hardware and software is changing the way manufacturing is carried out. To compete in the global marketplace, manufacturing companies must integrate these new technologies into their factories. In addition, they must integrate the planning, control, and data management methodologies needed to make effective use of these technologies. This paper provides an overview of recent approaches to achieving this enterprise integration. It then describes, using simulation as a particular example, a new tool's perspective of enterprise integration.

  1. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Bolic, Andrijana; Svanholm, Bent

    2012-01-01

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This manuscript therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, mechanistic models and small-scale equipment for high-throughput experimentation. The manuscript ends with a short...

  3. Applying Automatic Generation Technology for Segment Tool Electrode Blank Geometry and Cutting Dimension Drawings

    Institute of Scientific and Technical Information of China (English)

    胡海明; 张浩

    2013-01-01

    To overcome the inefficiency of traditionally drawing the tool electrode blank geometry and the cutting dimension drawing separately by hand, a program that generates both automatically and simultaneously was written in the GRIP language. The user only needs to select the three-dimensional model of the tool electrode and enter the unilateral margin value; the blank geometry and cutting dimension drawing of that tool electrode are then generated automatically, greatly improving work efficiency.

  4. Applied Linguistics and the "Annual Review of Applied Linguistics."

    Science.gov (United States)

    Kaplan, Robert B.; Grabe, William

    2000-01-01

    Examines the complexities and differences involved in granting disciplinary status to the role of applied linguistics, discusses the role of the "Annual Review of Applied Linguistics" as a contributor to the development of applied linguistics, and highlights a set of publications for the future of applied linguistics. (Author/VWL)

  5. The NPP spatiotemporal variation of global grassland ecosystems in response to climate change over the past 100 years

    Institute of Scientific and Technical Information of China (English)

    刚成诚; 王钊齐; 杨悦; 陈奕兆; 张艳珍; 李建龙; 程积民

    2016-01-01

    Classification System (CSCS) and a segmentation model. Correlation analysis was also conducted to reveal the responses of grassland types to different climate variables. The results showed that the total global area of grassland ecosystems declined from 5175.73×10⁴ km² in the 1920s to 5102.16×10⁴ km² in the 1990s. The largest decrease, 192.35×10⁴ km², occurred in tundra & alpine steppe ecosystems. The areas of desert grassland, typical grassland and temperate humid grassland decreased by 14.31, 34.15 and 70.81×10⁴ km² respectively, while tropical savanna expanded by 238.06×10⁴ km². Climate warming forced most grasslands to shift northwards, particularly in the northern hemisphere. Global grassland NPP increased from 25.93 Pg DW/yr in the 1920s to 26.67 Pg DW/yr in the 1990s. In terms of each grassland type, the NPP of the tundra and alpine steppe, desert grassland, typical grassland and temperate humid grassland decreased by 709.57, 24.98, 115.74 and 291.56 Tg DW/yr respectively. The NPP of tropical savanna increased by 1887.37 Tg DW/yr. At the global scale, precipitation was the dominant factor affecting grassland NPP. In general, grassland ecosystems have been substantially affected by climate change over the past 100 years. Although the global grassland NPP showed an overall increasing trend, the structure and distribution of particular grassland ecosystems had been adversely affected by the warmer and wetter climate.
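    As a quick cross-check (a sketch added here, using only the figures quoted in the abstract above), the per-type changes should sum to the reported global changes, within the rounding of the global totals:

    ```python
    # Figures taken from the abstract above; the check itself is illustrative.
    area_change = {                      # 1920s -> 1990s, units: 10^4 km^2
        "tundra & alpine steppe": -192.35,
        "desert grassland": -14.31,
        "typical grassland": -34.15,
        "temperate humid grassland": -70.81,
        "tropical savanna": +238.06,
    }
    npp_change = {                       # units: Tg DW/yr
        "tundra & alpine steppe": -709.57,
        "desert grassland": -24.98,
        "typical grassland": -115.74,
        "temperate humid grassland": -291.56,
        "tropical savanna": +1887.37,
    }

    reported_area_change = 5102.16 - 5175.73   # -73.57 (10^4 km^2)
    reported_npp_change_pg = 26.67 - 25.93     # 0.74 Pg DW/yr

    print(round(sum(area_change.values()), 2))        # close to -73.57
    print(round(sum(npp_change.values()) / 1000, 3))  # close to 0.74 (in Pg)
    ```

    Both sums agree with the global totals to within the rounding of the Pg-scale figures, which supports the internal consistency of the record.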

  6. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  7. Applied hydraulic transients

    CERN Document Server

    Chaudhry, M Hanif

    2014-01-01

    This book covers hydraulic transients in a comprehensive and systematic manner from introduction to advanced level and presents various methods of analysis for computer solution. The field of application of the book is very broad and diverse and covers areas such as hydroelectric projects, pumped storage schemes, water-supply systems, cooling-water systems, oil pipelines and industrial piping systems. Strong emphasis is given to practical applications, including several case studies, problems of applied nature, and design criteria. This will help design engineers and introduce students to real-life projects. This book also: ·         Presents modern methods of analysis suitable for computer analysis, such as the method of characteristics, explicit and implicit finite-difference methods and matrix methods ·         Includes case studies of actual projects ·         Provides extensive and complete treatment of governed hydraulic turbines ·         Presents design charts, desi...

  8. Academic training: Applied superconductivity

    CERN Multimedia

    2007-01-01

    LECTURE SERIES 17, 18, 19 January from 11.00 to 12.00 hrs Council Room, Bldg 503 Applied Superconductivity: Theory, Superconducting Materials and Applications E. PALMIERI/INFN, Padova, Italy When hearing about persistent currents recirculating for several years in a superconducting loop without any appreciable decay, one realizes that we are dealing with the phenomenon in nature closest to perpetual motion. Zero resistivity and perfect diamagnetism in mercury at 4.2 K, the discovery over 75 years of several hundred superconducting materials, the revolution of "liquid nitrogen superconductivity", the discovery of even a simple binary compound becoming superconducting at 40 K and the subsequent re-exploration of the already known superconducting materials: Nature discloses its intimate secrets drop by drop, and nobody can exclude that the final surprise is still to come. After an overview of the phenomenology and basic theory of superconductivity, the lectures for this a...

  9. Applying Triz for Production Quality Improvement

    Directory of Open Access Journals (Sweden)

    Swee Nikalus Shu Luing

    2017-01-01

    Full Text Available This paper provides a thorough analysis of the application of TRIZ to improving the quality of canned food production. TRIZ tools such as engineering system analysis, function analysis, cause-and-effect chain analysis, the By-separation model and the 40 Inventive Principles are applied in order to discover feasible and elegant solutions that alleviate the problem. Findings revealed that although the vision system is able to detect faulty printing on a canned product, the rejected can is isolated or picked up together with neighbouring cans in good condition, because all cans are lined up on the conveyor belt very close to each other, with no gaps between them. Conversely, the cans on the conveyor belt are required to be very close to each other to avoid collisions that may damage them. The root cause is resolved by applying function analysis, the By-separation tool and the Inventive Principles. It can therefore be concluded that TRIZ is a powerful tool for inventive problem solving.

  10. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  11. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  12. Authoring tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  13. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, report NAWCADPAX/TR-2012/194 (Maryland), 26 June 2012. This record preserves only cover-page and documentation-page fragments; no abstract is available.

  14. CMS offline web tools

    CERN Document Server

    Metson, S; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Evans, D; Fanfani, A; Feichtinger, D; Kavka, C; Kuznetsov, V; Van Lingen, F; Newbold, D; Tuura, L; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web based tools, using state of the art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described along side current planned developments.

  15. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  16. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  17. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  18. Double diameter boring tool

    Science.gov (United States)

    Ashbaugh, Fred N.; Murry, Kenneth R.

    1988-12-27

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
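    The dimensional relationships spelled out in the abstract reduce to simple arithmetic; the sketch below restates them in code (the radii are illustrative values, not taken from the patent):

    ```python
    def boring_tool_geometry(r1: float, r2: float):
        """Offset-boring geometry as described in the abstract.

        r1: distance from the first cutting edge tip to the axis of rotation
        r2: distance from the second cutting edge tip to the axis of rotation (r2 > r1)
        """
        body_diameter = r1 + r2            # tool body diameter equals r1 + r2
        axis_offset = r2 / 2.0 - r1 / 2.0  # axis of rotation offset from the tool centerline
        small_bore = 2.0 * r1              # diameter bored by the first cutting edge
        large_bore = 2.0 * r2              # diameter bored by the second cutting edge
        return body_diameter, axis_offset, small_bore, large_bore

    # Hypothetical radii of 10 and 14 (e.g. mm):
    print(boring_tool_geometry(10.0, 14.0))  # (24.0, 2.0, 20.0, 28.0)
    ```

    With these illustrative radii the tool would cut a 20 mm and a 28 mm concentric hole in a single operation, which is the point of the offset between the tool centerline and the axis of rotation.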

  19. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  20. Applied large eddy simulation.

    Science.gov (United States)

    Tucker, Paul G; Lardeau, Sylvain

    2009-07-28

    Large eddy simulation (LES) is now seen more and more as a viable alternative to current industrial practice, usually based on problem-specific Reynolds-averaged Navier-Stokes (RANS) methods. Access to detailed flow physics is attractive to industry, especially in an environment in which computer modelling is bound to play an ever increasing role. However, the improvement in accuracy and flow detail has substantial cost. This has so far prevented wider industrial use of LES. The purpose of the applied LES discussion meeting was to address questions regarding what is achievable and what is not, given the current technology and knowledge, for an industrial practitioner who is interested in using LES. The use of LES was explored in an application-centred context between diverse fields. The general flow-governing equation form was explored along with various LES models. The errors occurring in LES were analysed. Also, the hybridization of RANS and LES was considered. The importance of modelling relative to boundary conditions, problem definition and other more mundane aspects were examined. It was to an extent concluded that for LES to make most rapid industrial impact, pragmatic hybrid use of LES, implicit LES and RANS elements will probably be needed. Added to this further, highly industrial sector model parametrizations will be required with clear thought on the key target design parameter(s). The combination of good numerical modelling expertise, a sound understanding of turbulence, along with artistry, pragmatism and the use of recent developments in computer science should dramatically add impetus to the industrial uptake of LES. In the light of the numerous technical challenges that remain it appears that for some time to come LES will have echoes of the high levels of technical knowledge required for safe use of RANS but with much greater fidelity.

  1. Essays in applied microeconomics

    Science.gov (United States)

    Wang, Xiaoting

    In this dissertation I use Microeconomic theory to study firms' behavior. Chapter One introduces the motivations and main findings of this dissertation. Chapter Two studies the issue of information provision through advertisement when markets are segmented and consumers' price information is incomplete. Firms compete in prices and advertising strategies for consumers with transportation costs. High advertising costs contribute to market segmentation. Low advertising costs promote price competition among firms and improves consumer welfare. Chapter Three also investigates market power as a result of consumers' switching costs. A potential entrant can offer a new product bundled with an existing product to compensate consumers for their switching cost. If the primary market is competitive, bundling simply plays the role of price discrimination, and it does not dominate unbundled sales in the process of entry. If the entrant has market power in the primary market, then bundling also plays the role of leveraging market power and it dominates unbundled sales. The market for electric power generation has been opened to competition in recent years. Chapter Four looks at issues involved in the deregulated electricity market. By comparing the performance of the competitive market with the social optimum, we identify the conditions under which market equilibrium generates socially efficient levels of electric power. Chapter Two to Four investigate the strategic behavior among firms. Chapter Five studies the interaction between firms and unemployed workers in a frictional labor market. We set up an asymmetric job auction model, where two types of workers apply for two types of job openings by bidding in auctions and firms hire the applicant offering them the most profits. The job auction model internalizes the determination of the share of surplus from a match, therefore endogenously generates incentives for an efficient division of the matching surplus. Microeconomic

  2. Vygotsky in applied neuropsychology

    Directory of Open Access Journals (Sweden)

    Glozman J. M.

    2016-12-01

    Full Text Available The aims of this paper are: (1) to show the role of clinical experience for the theoretical contributions of L.S. Vygotsky, and (2) to analyze the development of these theories in contemporary applied neuropsychology. An analysis of disturbances of mental functioning is impossible without a systemic approach to the evidence observed. Therefore, medical psychology is fundamental for forming a systemic approach to psychology. The assessment of neurological patients at the neurological hospital of Moscow University permitted L.S. Vygotsky to create, in collaboration with A.R. Luria, the theory of systemic dynamic localization of higher mental functions and their relationship to cultural conditions. In his studies of patients with Parkinson’s disease, Vygotsky also set out 3 steps of systemic development: interpsychological, then extrapsychological, then intrapsychological. L.S. Vygotsky and A.R. Luria in the late 1920s created a program to compensate for the motor subcortical disturbances in Parkinson’s disease (PD) through a cortical (visual) mediation of movements. We propose to distinguish the objective mediating factors — like teaching techniques and modalities — from subjective mediating factors, like the individual’s internal representation of his/her own disease. The cultural-historical approach in contemporary neuropsychology forces neuropsychologists to re-analyze and re-interpret the classic neuropsychological syndromes; to develop new assessment procedures more in accordance with the patient’s conditions of life; and to reconsider the concept of the social brain as a social and cultural determinant and regulator of brain functioning. L.S. Vygotsky and A.R. Luria proved that a defect interferes with a child’s appropriation of his/her culture, but cultural means can help the child overcome the defect. In this way, the cultural-historical approach became, and still is, a methodological basis for remedial education.

  3. Applied Historical Astronomy

    Science.gov (United States)

    Stephenson, F. Richard

    2014-01-01

    F. Richard Stephenson has spent most of his research career -- spanning more than 45 years -- studying various aspects of Applied Historical Astronomy. The aim of this interdisciplinary subject is the application of historical astronomical records to the investigation of problems in modern astronomy and geophysics. Stephenson has almost exclusively concentrated on pre-telescopic records, especially those preserved from ancient and medieval times -- the earliest reliable observations dating from around 700 BC. The records which have mainly interested him are of eclipses (both solar and lunar), supernovae, sunspots and aurorae, and Halley's Comet. The main sources of early astronomical data are fourfold: records from ancient and medieval East Asia (China, together with Korea and Japan); ancient Babylon; ancient and medieval Europe; and the medieval Arab world. A feature of Stephenson's research is the direct consultation of early astronomical texts in their original language -- either working unaided or with the help of colleagues. He has also developed a variety of techniques to help interpret the various observations. Most pre-telescopic observations are very crude by present-day standards. In addition, early motives for skywatching were more often astrological rather than scientific. Despite these drawbacks, ancient and medieval astronomical records have two remarkable advantages over modern data. Firstly, they can enable the investigation of long-term trends (e.g. in the terrestrial rate of rotation), which in the relatively short period covered by telescopic observations are obscured by short-term fluctuations. Secondly, over the lengthy time-scale which they cover, significant numbers of very rare events (such as Galactic supernovae) were reported, which have few -- if any -- counterparts in the telescopic record. In his various researches, Stephenson has mainly focused his attention on two specific topics. These are: (i) long-term changes in the Earth's rate of

  4. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  5. Applied physiology of swimming.

    Science.gov (United States)

    Lavoie, J M; Montpetit, R R

    1986-01-01

    Scientific research in swimming over the past 10 to 15 years has been oriented toward multiple aspects that relate to applied and basic physiology, metabolism, biochemistry, and endocrinology. This review considers recent findings on: 1) specific physical characteristics of swimmers; 2) the energetics of swimming; 3) the evaluation of aerobic fitness in swimming; and 4) some metabolic and hormonal aspects related to swimmers. Firstly, the age of finalists in Olympic swimming is not much different from that of the participants from other sports. They are taller and heavier than a reference population of the same age. The height bias in swimming may be the reason for lack of success from some Asian and African countries. Experimental data point toward greater leanness, particularly in female swimmers, than was seen 10 years ago. Overall, female swimmers present a range of 14 to 19% body fat whereas males are much lower (5 to 10%). Secondly, the relationship between O2 uptake and crawl swimming velocity (at training and competitive speeds) is thought to be linear. The energy cost varies between strokes with a dichotomy between the 2 symmetrical and the 2 asymmetrical strokes. Energy expenditure in swimming is represented by the sum of the cost of translational motion (drag) and maintenance of horizontal motion (gravity). The cost of the latter decreases as speed increases. Examination of the question of size-associated effects on the cost of swimming using Huxley's allometric equation (Y = ax^b) shows an almost direct relationship with passive drag. Expressing energy cost in litres of O2/m/kg is proposed as a better index of technical swimming ability than the traditional expression of VO2/distance in L/km. Thirdly, maximal direct conventional techniques used to evaluate maximal oxygen consumption (VO2 max) in swimming include free swimming, tethered swimming, and flume swimming. Despite the individual peculiarities of each method, with similar experimental conditions
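    Huxley's allometric equation mentioned in the abstract, Y = a·x^b, is a straight line on log-log axes, so its exponent can be recovered from any two observations. A minimal sketch (the coefficient, exponent and data points are made up for illustration; the abstract only names the functional form):

    ```python
    import math

    def allometric(a: float, b: float, x: float) -> float:
        """Huxley's allometric relation Y = a * x**b."""
        return a * x ** b

    def exponent_from_two_points(x1, y1, x2, y2):
        # log Y = log a + b * log x, so the slope b follows
        # from any two points on the log-log line.
        return math.log(y2 / y1) / math.log(x2 / x1)

    # Illustrative values: a = 2.0, b = 0.75
    y1 = allometric(2.0, 0.75, 1.0)
    y2 = allometric(2.0, 0.75, 16.0)
    print(exponent_from_two_points(1.0, y1, 16.0, y2))  # recovers b ≈ 0.75
    ```

    In practice the exponent would be fitted by log-log regression over many observations rather than from two points, but the two-point form shows why the relation is called "almost direct" when b is close to 1.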

  6. Essays in Applied Microeconomics

    Science.gov (United States)

    Ge, Qi

    This dissertation consists of three self-contained applied microeconomics essays on topics related to behavioral economics and industrial organization. Chapter 1 studies how sentiment as a result of sports event outcomes affects consumers' tipping behavior in the presence of social norms. I formulate a model of tipping behavior that captures consumer sentiment following a reference-dependent preference framework and empirically test its relevance using the game outcomes of the NBA and the trip and tipping data on New York City taxicabs. While I find that consumers' tipping behavior responds to unexpected wins and losses of their home team, particularly in close game outcomes, I do not find evidence for loss aversion. Coupled with the findings on default tipping, my empirical results on the asymmetric tipping responses suggest that while social norms may dominate loss aversion, affect and surprises can result in freedom on the upside of tipping. Chapter 2 utilizes a novel data source of airline entry and exit announcements and examines how the incumbent airlines adjust quality provisions as a response to their competitors' announcements and the role of timing in such responses. I find no evidence that the incumbents engage in preemptive actions when facing probable entry and exit threats as signaled by the competitors' announcements in either short term or long term. There is, however, evidence supporting their responses to the competitors' realized entry or exit. My empirical findings underscore the role of timing in determining preemptive actions and suggest that previous studies may have overestimated how the incumbent airlines respond to entry threats. Chapter 3, written in collaboration with Benjamin Ho, investigates the habit formation of consumers' thermostat setting behavior, an often implicitly made decision and yet a key determinant of home energy consumption and expenditures. We utilize a high frequency dataset on household thermostat usage and find that

  7. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  8. Pneumatically actuated hand tool

    NARCIS (Netherlands)

    Cool, J.C.; Rijnsaardt, K.A.

    1996-01-01

    Abstract of NL 9401195 (A) Pneumatically actuated hand tool for carrying out a mechanical operation, provided with an exchangeable gas cartridge in which the gas which is required for pneumatic actuation is stored. More particularly, the hand tool is provided with at least one pneumatic motor, at

  9. Coring Sample Acquisition Tool

    Science.gov (United States)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock core samples. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample during the transfer process.

  10. WATERS Expert Query Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA’s WATERS database. There are just three steps to using Expert Query: 1. View Selection – Choose what...

  11. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relation

  12. Maailma suurim tool [The world's largest chair]

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu näitused, the Tartu Art School (Tartu Kunstikool) and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in Pavilion I of the Tartu exhibition centre from 9 to 11 March. 2000 chairs will be exhibited, from which a TOP 12 will be chosen. The world's largest chair is planned to be erected on the grounds of the exhibition centre. At the same time, the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000') will run in Pavilion II.

  13. PREFACE: Celebrating 100 years of superconductivity: special issue on the iron-based superconductors Celebrating 100 years of superconductivity: special issue on the iron-based superconductors

    Science.gov (United States)

    Crabtree, George; Greene, Laura; Johnson, Peter

    2011-12-01

    In honor of this year's 100th anniversary of the discovery of superconductivity, this special issue of Reports on Progress in Physics is dedicated to the 'iron-based superconductors', a new class of high-temperature superconductors discovered in 2008. This is the first time the journal has produced a theme issue, and we offer it to the community as a snapshot of the present status of the field, both for researchers working in this fast-paced area and for the general physics community. Reports on Progress in Physics publishes three classes of articles: comprehensive full Review Articles, Key Issues Reviews and, most recently, Reports on Progress articles that recount the current status of a rapidly evolving field, befitting the articles in this special issue. It has been an exciting year for superconductivity. There have been numerous celebrations of this centenary recounting the fascinating history of the field, from seven Nobel prizes to life-saving discoveries that brought us medically useful magnetic resonance imaging. The discovery of a completely new class of high-temperature superconductors, whose mechanism remains as elusive as that of the cuprates discovered in 1986, has injected new vitality into the field, and this year those new to it were given the opportunity to interact with those who have enjoyed a long history in superconductivity. Furthermore, as high-density current carriers with little or no power loss, high-temperature superconductors offer unique solutions to fundamental grid challenges of the 21st century and hold great promise for addressing our global energy challenges. The complexity and promise of these materials have caused our community to share ideas and results more freely than ever before, and it is gratifying to see how we have grown into an enthusiastic global network to advance the field.
This invited collection is true to this agenda and we are delighted to have received contributions from many of the world leaders for an initiative that is designed to benefit both newcomers and established researchers in superconductivity.

  14. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  15. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.
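    To make the static/dynamic distinction of Section 3 concrete, here is a toy static-analysis pass (a hypothetical illustration only, unrelated to the chapter's own tools): it inspects a program's syntax tree without executing the code and flags bare `except:` handlers.

```python
import ast

# Toy static-analysis check: walk the abstract syntax tree of a Python source
# string (no execution involved) and report bare `except:` clauses.
# Real test tools, as the chapter stresses, rest on far firmer foundations.
def find_bare_excepts(source):
    """Return the line numbers of bare `except:` handlers in `source`."""
    tree = ast.parse(source)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

sample = """\
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(sample))  # → [3]
```

    A dynamic-analysis tool, by contrast, would have to run `risky()` and observe its behaviour; the static pass above reaches its verdict from the program text alone.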

  16. Tools for top physics at D0

    Energy Technology Data Exchange (ETDEWEB)

    Harel, Amnon

    2008-07-01

    Top quark measurements rely on the jet energy calibration and often on b-quark identification. We discuss these and other tools and how they apply to top quark analyses at D0. In particular, some of the nuances that result from D0's data-driven approach to these issues are presented.

  17. A cross-species alignment tool (CAT)

    DEFF Research Database (Denmark)

    Li, Heng; Guan, Liang; Liu, Tao;

    2007-01-01

    sensitive methods which are usually applied in aligning inter-species sequences. RESULTS: Here we present a new algorithm called CAT (for Cross-species Alignment Tool). It is designed to align mRNA sequences to mammalian-sized genomes. CAT is implemented using C scripts and is freely available on the web...
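    CAT's own algorithm is not described in this truncated record, but at the core of any such aligner is a local-alignment score. As a generic, hypothetical illustration (not CAT's actual method, which also handles spliced mRNA-to-genome mapping), a minimal Smith-Waterman scorer looks like this:

```python
# Generic Smith-Waterman local alignment, sketched to illustrate the kind of
# dynamic-programming scoring that sequence aligners build on.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # Local alignment floors every cell at zero, so a poor prefix
            # never penalises a good alignment starting later.
            score[i][j] = max(0, diag, score[i-1][j] + gap, score[i][j-1] + gap)
            best = max(best, score[i][j])
    return best

print(smith_waterman("ACGTT", "ACGCT"))  # → 7
```

    Aligning an mRNA against a mammalian-sized genome with this quadratic recurrence directly would be far too slow, which is why tools in this class combine fast seeding heuristics with such scoring.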

  18. Computational social networks tools, perspectives and applications

    CERN Document Server

    Abraham, Ajith

    2012-01-01

    Provides the latest advances in computational social networks and illustrates how organizations can gain a competitive advantage by applying these ideas in real-world scenarios; presents a specific focus on practical tools and applications; provides experience reports, survey articles, and intelligence techniques and theories relating to specific problems in network technology.

  19. Micro and nano fabrication tools and processes

    CERN Document Server

    Gatzen, Hans H; Leuthold, Jürg

    2015-01-01

    For Microelectromechanical Systems (MEMS) and Nanoelectromechanical Systems (NEMS) production, each product requires a unique process technology. This book provides comprehensive insight into the tools necessary for fabricating MEMS/NEMS and the process technologies applied. In addition, it describes the enabling technologies necessary for successful production, i.e., wafer planarization and bonding, as well as contamination control.

  20. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom the user advisory groups overseeing the respective tool development activities should come. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depend on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport, and force availability. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.