WorldWideScience

Sample records for twenty-first space simulation

  1. The twenty-first century in space

    CERN Document Server

    Evans, Ben

    2015-01-01

    This final entry in the History of Human Space Exploration mini-series by Ben Evans continues with an in-depth look at the latter part of the 20th century and the start of the new millennium. Picking up where Partnership in Space left off, the story commemorating the evolution of manned space exploration unfolds in further detail. More than fifty years after Yuri Gagarin’s pioneering journey into space, Evans extends his overview of how that momentous voyage continued through the decades which followed. The Twenty-first Century in Space, the sixth book in the series, explores how the fledgling partnership between the United States and Russia in the 1990s gradually bore fruit and laid the groundwork for today’s International Space Station. The narrative follows the convergence of the Shuttle and Mir programs, together with standalone missions, including servicing the Hubble Space Telescope, many of whose technical and human lessons enabled the first efforts to build the ISS in orbit. The book also looks to...

  2. Automation and robotics for Space Station in the twenty-first century

    Science.gov (United States)

    Willshire, K. F.; Pivirotto, D. L.

    1986-01-01

    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.

  3. Space power technology for the twenty-first century (SPT21)

    International Nuclear Information System (INIS)

    Borger, W.U.; Massie, L.D.

    1988-01-01

    During the spring and summer months of 1987, the Aero Propulsion Laboratory of the Air Force Wright Aeronautical Laboratories, Wright-Patterson AFB, Ohio in cooperation with the Air Force Space Technology Center at Kirtland AFB, New Mexico, undertook an initiative to develop a Strategic Plan for Space Power Technology Development. The initiative was called SPT21, Space Power Technology for the Twenty-First Century. The planning process involved the participation of other Government organizations (U.S. Army, Navy, DOE and NASA) along with major aerospace companies and universities. Following an SPT21 kickoff meeting on 28 May 1987, detailed strategic planning was accomplished through seven (7) Space Power Technology Discipline Workshops commencing in June 1987 and concluding in August 1987. Technology Discipline Workshops were conducted in the following areas: (1) Solar Thermal Dynamic Power Systems (2) Solar Photovoltaic Cells and Arrays (3) Thermal Management Technology (4) Energy Storage Technology (5) Nuclear Power Systems Technology (6) Power Conditioning, Distribution and Control and (7) Systems Technology/Advanced Concepts. This technical paper summarizes the planning process and describes the salient findings and conclusions of the workshops

  4. Achievements in the past twenty years and perspective outlook of crop space breeding in China

    International Nuclear Information System (INIS)

    Liu Luxiang; Guo Huijun; Zhao Linshu; Gu Jiayu; Zhao Shirong

    2007-01-01

    Space breeding is a novel and effective approach to crop mutation improvement, first developed by Chinese scientists in 1987. A national collaborative research network has been established, and significant achievements have been made during the past twenty years. More than forty new mutant varieties derived from space mutagenesis in rice, wheat, cotton, sweet pepper, tomato, sesame and alfalfa have been developed, officially released and put into production. A series of rare and useful mutant germplasms, which may enable major breakthroughs in crop grain yield and/or quality improvement, has been obtained. Good progress has been made in technique innovations in space breeding and in ground simulation of space environmental factors. Intellectual property protection and industrialization of space mutation techniques and mutant varieties, as well as exploration of the mechanism of space mutation induction, have also advanced steadily. In this paper, the main achievements of crop space breeding over the past twenty years are reviewed, and development strategies for space breeding are discussed. (authors)

  5. Twenty-five years of simulator training

    International Nuclear Information System (INIS)

    Anon.

    2002-01-01

    The first training simulator for nuclear power plant personnel in Germany was commissioned twenty-five years ago. The strategy of training by simulators was developed and pursued consistently and continuously in order to ensure sound training of nuclear power plant personnel. The present thirteen simulators cover a broad range of plants. A systematic training concept also helps to ensure a high level of competence and permanent qualification of plant personnel. The anniversary was marked by a festive event at which Erich K. Steiner read a paper on 'The Importance of Simulator Training', and Professor Dr. Adolf Birkhofer spoke about 'Nuclear Technology Education and Training'. (orig.)

  6. Early twenty-first-century droughts during the warmest climate

    Directory of Open Access Journals (Sweden)

    Felix Kogan

    2016-01-01

    The first 13 years of the twenty-first century began with a series of widespread, long and intensive droughts around the world. Extreme and severe-to-extreme droughts covered 2%–6% and 7%–16% of the world's land, respectively, affecting the environment, economies and humans. These droughts reduced agricultural production, leading to food shortages, deterioration of human health, poverty, regional disturbances, population migration and death. This feature article is a travelogue of the twenty-first-century global and regional droughts during the warmest years of the past 100 years. These droughts were identified and monitored with the National Oceanic and Atmospheric Administration's operational space technology, called vegetation health (VH), which has the longest period of observation and provides good data quality. The VH method was used to assess vegetation condition or health, including drought early detection and monitoring. It is based on operational satellite data estimating both land surface greenness (NDVI) and thermal conditions. The twenty-first-century droughts in the USA, Russia, Australia and the Horn of Africa were intensive and long, covered large areas and caused huge losses in agricultural production, which affected food security and led to food riots in some countries. This research also investigates drought dynamics, though it presents no definite conclusion about drought intensification or expansion during the time of the warmest globe.
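
A minimal sketch of the VH-style computation described above, combining a greenness index (VCI, from NDVI) with a thermal index (TCI, from brightness temperature). The function name, the equal weighting `alpha`, and the input values are illustrative assumptions, not the operational NOAA product:

```python
# Hedged sketch: Kogan-style vegetation health (VH) from NDVI and brightness
# temperature (BT). The weighting factor and the toy inputs are assumptions.
def vegetation_health(ndvi, bt, ndvi_min, ndvi_max, bt_min, bt_max, alpha=0.5):
    # Vegetation Condition Index: current greenness relative to historical range
    vci = 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)
    # Temperature Condition Index: inverted so hotter-than-usual lowers the score
    tci = 100.0 * (bt_max - bt) / (bt_max - bt_min)
    return alpha * vci + (1.0 - alpha) * tci  # 0 = extreme stress, 100 = optimal

vh = vegetation_health(ndvi=0.3, bt=305.0,
                       ndvi_min=0.2, ndvi_max=0.6,
                       bt_min=290.0, bt_max=310.0)
print(round(vh, 1))  # prints 25.0; low VH values flag drought-stressed vegetation
```

Low VH values over several weeks are what drought early-detection schemes of this kind look for.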

  7. Capital in the Twenty-First Century

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2014-01-01

    Review essay on: Capital in the Twenty-First Century. By Thomas Piketty. Translated by Arthur Goldhammer. Cambridge, Mass.: The Belknap Press of Harvard University Press, 2014. viii + 685 pp.

  8. The twenty-first century commercial space imperative

    CERN Document Server

    Young, Anthony

    2015-01-01

    Young addresses the impressive expansion across existing and developing commercial space business markets, with multiple private companies competing in the payload launch services sector. The author pinpoints the new markets, technologies, and players in the industry, as well as highlighting the overall reasons why it is important for us to develop space. NASA now relies on commercial partners to supply cargo and crew spacecraft and services to and from the International Space Station. The sizes of satellites are diminishing and their capabilities expanding, while costs to orbit are decreasing. Suborbital space tourism holds the potential of new industries and jobs. Commercial space exploration of the Moon and the planets also holds promise. All this activity is a catalyst for anyone interested in joining the developing space industry, from students and researchers to engineers and entrepreneurs. As more and more satellites and rockets are launched and the business of space is expanding at a signifi...

  9. Teachers' Critical Reflective Practice in the Context of Twenty-First Century Learning

    Science.gov (United States)

    Benade, Leon

    2015-01-01

    In the twenty-first century, learning and teaching at school must prepare young people to engage in a complex and dynamic world deeply influenced by globalisation and the revolution in digital technology. Alongside the use of digital technologies is the development of flexible learning spaces. It is claimed that these developments demand,…

  10. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections

    Science.gov (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan

    2015-01-01

    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
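
The three-way partition described above can be sketched for a toy ensemble: scenario uncertainty is the variance across scenario means, model uncertainty the variance across model means, and internal variability the residual spread among realizations. The array shape and random data are assumptions, not CMIP5 output:

```python
# Hedged sketch of partitioning ensemble variance into scenario, model, and
# internal components. The toy data stand in for sea level projections.
import numpy as np

rng = np.random.default_rng(0)
# projections[model, scenario, realization] -> e.g. sea level rise in 2100 (m)
projections = rng.normal(loc=0.5, scale=0.1, size=(16, 4, 3))

scenario_var = projections.mean(axis=(0, 2)).var()  # across 4 RCP-like scenarios
model_var = projections.mean(axis=(1, 2)).var()     # across 16 models
internal_var = projections.var(axis=2).mean()       # spread among realizations

total = scenario_var + model_var + internal_var
for name, v in [("scenario", scenario_var), ("model", model_var),
                ("internal", internal_var)]:
    print(f"{name}: {100 * v / total:.0f}% of partitioned variance")
```

With real CMIP5 output the interesting result is how these fractions shift with lead time and location, e.g. model uncertainty dominating local sea level through 2100.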

  11. Simulating care: technology-mediated learning in twenty-first century nursing education.

    Science.gov (United States)

    Diener, Elizabeth; Hobbs, Nelda

    2012-01-01

    The increased reliance on simulation classrooms has proven successful for skills learning. Questions persist concerning the ability of technology-driven robotic devices to form and cultivate caring behaviors, or to sufficiently develop the interactive nurse-client communication necessary in the context of nursing. This article examines the disconnects created by the use of simulation technology in nursing education, raising the question: "Can learning of caring-as-being be facilitated in simulation classrooms?" We propose that unless time is spent with human beings in the earliest stages of nursing education, transpersonal caring relationships do not have space to develop. Learning, crafting, and maturation of caring behaviors threatens to become a serendipitous event or is no longer perceived as an essential characteristic of nursing. Technology does not negate caring; the isolation it fosters makes transpersonal caring all the more important. We are called to create a new paradigm for nursing education that merges Nightingale's vision with technology's promise. © 2012 Wiley Periodicals, Inc.

  12. Analysis of the projected regional sea-ice changes in the Southern Ocean during the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Lefebvre, W.; Goosse, H. [Universite Catholique de Louvain, Institut d' Astronomie et de Geophysique Georges Lemaitre, Louvain-la-Neuve (Belgium)

    2008-01-15

    Using the set of simulations performed with atmosphere-ocean general circulation models (AOGCMs) for the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4), the projected regional distribution of sea ice for the twenty-first century has been investigated. Averaged over all those model simulations, the current climate is reasonably well reproduced. However, this averaging procedure hides the errors from individual models. Over the twentieth century, the multimodel average simulates a larger sea-ice concentration decrease around the Antarctic Peninsula compared to other regions, which is in qualitative agreement with observations. This is likely related to the positive trend in the Southern Annular Mode (SAM) index over the twentieth century, in both observations and in the multimodel average. Despite the simulated positive future trend in SAM, such a regional feature around the Antarctic Peninsula is absent in the projected sea-ice change for the end of the twenty-first century. The maximum decrease is indeed located over the central Weddell Sea and the Amundsen-Bellingshausen Seas. In most models, changes in the oceanic currents could play a role in the regional distribution of the sea ice, especially in the Ross Sea, where stronger southward currents could be responsible for a smaller sea-ice decrease during the twenty-first century. Finally, changes in the mixed layer depth can be found in some models, inducing locally strong changes in the sea-ice concentration. (orig.)

  13. Why the American public supports twenty-first century learning.

    Science.gov (United States)

    Sacconaghi, Michele

    2006-01-01

    Aware that constituent support is essential to any educational endeavor, the AOL Time Warner Foundation (now the Time Warner Foundation), in conjunction with two respected national research firms, measured Americans' attitudes toward the implementation of twenty-first century skills. The foundation's national research survey was intended to explore public perceptions of the need for changes in the educational system, in school and after school, with respect to the teaching of twenty-first century skills. The author summarizes the findings of the survey, which were released by the foundation in June 2003. One thousand adults were surveyed by telephone, including African Americans, Latinos, teachers, and business executives. In general, the survey found that Americans believe today's students need a "basics-plus" education, meaning communication, technology, and critical thinking skills in addition to the traditional basics of reading, writing, and math. In fact, 92 percent of respondents stated that students today need different skills from those of ten to twenty years ago. Also, after-school programs were found to be an appropriate vehicle to teach these skills. Furthermore, the survey explored how well the public perceives schools to be preparing youth for the workforce and postsecondary education, which twenty-first century skills are seen as being taught effectively, and the level of need for after-school and summer programs. The survey results provide conclusive evidence of national support for basics-plus education. Thus, a clear opportunity exists to build momentum for a new model of education for the twenty-first century.

  14. Afterword: Victorian Sculpture for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    David J. Getsy

    2016-06-01

    Commenting on the directions proposed by this issue of '19', the afterword discusses the broad trends in twenty-first century studies of Victorian sculpture and the opportunity for debate arising from the first attempt at a comprehensive exhibition.

  15. Greenland Surface Mass Balance as Simulated by the Community Earth System Model. Part II: Twenty-First-Century Changes

    NARCIS (Netherlands)

    Vizcaino, M.; Lipscomb, W.H.; Sacks, W.J.; van den Broeke, M.R.

    2014-01-01

    This study presents the first twenty-first-century projections of surface mass balance (SMB) changes for the Greenland Ice Sheet (GIS) with the Community Earth System Model (CESM), which includes a new ice sheet component. For glaciated surfaces, CESM includes a sophisticated calculation of energy

  16. Increasing precipitation volatility in twenty-first-century California

    Science.gov (United States)

    Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex

    2018-05-01

    Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood—of which, California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's `Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.

  17. Establishing the R&D Agenda for Twenty-First Century Learning

    Science.gov (United States)

    Kay, Ken; Honey, Margaret

    2006-01-01

    Much ink has flowed over the past few years describing the need to incorporate twenty-first century skills into K-12 education. Preparing students to succeed as citizens, thinkers, and workers--the bedrock of any educational system--in this environment means arming them with more than a list of facts and important dates. Infusing twenty-first…

  18. Twenty-First Century Literacy: A Matter of Scale from Micro to Mega

    Science.gov (United States)

    Brown, Abbie; Slagter van Tryon, Patricia J.

    2010-01-01

    Twenty-first century technologies require educators to look for new ways to teach literacy skills. Current communication methods are combinations of traditional and newer, network-driven forms. This article describes the changes twenty-first century technologies cause in the perception of time, size, distance, audience, and available data, and…

  19. Twenty-first century vaccines

    Science.gov (United States)

    Rappuoli, Rino

    2011-01-01

    In the twentieth century, vaccination has been possibly the greatest revolution in health. Together with hygiene and antibiotics, vaccination led to the elimination of many childhood infectious diseases and contributed to the increase in disability-free life expectancy that in Western societies rose from 50 to 78–85 years (Crimmins, E. M. & Finch, C. E. 2006 Proc. Natl Acad. Sci. USA 103, 498–503; Kirkwood, T. B. 2008 Nat. Med 10, 1177–1185). In the twenty-first century, vaccination will be expected to eliminate the remaining childhood infectious diseases, such as meningococcal meningitis, respiratory syncytial virus, group A streptococcus, and will address the health challenges of this century such as those associated with ageing, antibiotic resistance, emerging infectious diseases and poverty. However, for this to happen, we need to increase the public trust in vaccination so that vaccines can be perceived as the best insurance against most diseases across all ages. PMID:21893537

  20. Digital earth applications in the twenty-first century

    NARCIS (Netherlands)

    de By, R.A.; Georgiadou, P.Y.

    2014-01-01

    In these early years of the twenty-first century, we must look at how the truly cross-cutting information technology supports other innovations, and how it will fundamentally change the information positions of government, private sector and the scientific domain as well as the citizen. In those

  1. Nuclear power in the twenty-first century - An assessment (Part 1)

    OpenAIRE

    von Hirschhausen, Christian

    2017-01-01

    Nuclear power was one of the most important discoveries of the twentieth century, and it continues to play an important role in twenty-first century discussions about the future energy mix, climate change, innovation, proliferation, geopolitics, and many other crucial policy topics. This paper addresses some key issues around the emergence of nuclear power in the twentieth century and perspectives going forward in the twenty-first, including questions of economics and competitiveness, the str...

  2. SEAPOWER: A GUIDE FOR THE TWENTY- FIRST CENTURY

    African Journals Online (AJOL)

    Abel

    $154.37 (amazon.com hardback). With the publication of Seapower: A Guide for the Twenty-First Century, Geoffrey Till has set the standard for publications on all things maritime. The updated and expanded new edition of the book is an essential guide for students of naval history and maritime strategy and provides ...

  3. Vesico-vaginal fistula repair: experience with first twenty-three ...

    African Journals Online (AJOL)

    Vesico-vaginal fistula repair: experience with first twenty-three patients seen at a tertiary hospital in north-central Nigeria. Stephen D. Ngwan, Bassey E. Edem, Ajen S. Anzaku, Barnabas A. Eke, Mohammed A. Shittu, Solomon A. Obekpa ...

  4. Space robot simulator vehicle

    Science.gov (United States)

    Cannon, R. H., Jr.; Alexander, H.

    1985-01-01

    A Space Robot Simulator Vehicle (SRSV) was constructed to model a free-flying robot capable of doing construction, manipulation and repair work in space. The SRSV is intended as a test bed for development of dynamic and static control methods for space robots. The vehicle is built around a two-foot-diameter air-cushion vehicle that carries batteries, power supplies, gas tanks, computer, reaction jets and radio equipment. It is fitted with one or two two-link manipulators, which may be of many possible designs, including flexible-link versions. Both the vehicle body and its first arm are nearly complete. Inverse dynamic control of the robot's manipulator has been successfully simulated using equations generated by the dynamic simulation package SDEXACT. In this mode, the position of the manipulator tip is controlled not by fixing the vehicle base through thruster operation, but by controlling the manipulator joint torques to achieve the desired tip motion, while allowing for the free motion of the vehicle base. One of the primary goals is to minimize use of the thrusters in favor of intelligent control of the manipulator. Ways to reduce the computational burden of control are described.
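
The inverse-dynamic (computed-torque) control mode described above can be sketched, heavily simplified, for a single rigid joint: desired acceleration plus PD feedback on tracking error, multiplied through the inertia to give a torque command. The gains, inertia value, and function name are assumptions; the real SRSV uses SDEXACT-generated multibody equations that account for the free-floating base, not this scalar model:

```python
# Hedged sketch of computed-torque control for one rigid joint.
# Gains and inertia are illustrative assumptions.
import math

def computed_torque(q, qd, q_des, qd_des, qdd_des, inertia, kp=25.0, kd=10.0):
    # Commanded acceleration: feedforward plus PD correction of tracking error
    qdd_cmd = qdd_des + kp * (q_des - q) + kd * (qd_des - qd)
    # Scalar "inverse dynamics": torque that produces the commanded acceleration
    return inertia * qdd_cmd

tau = computed_torque(q=0.0, qd=0.0, q_des=math.pi / 4, qd_des=0.0,
                      qdd_des=0.0, inertia=0.8)
print(round(tau, 3))  # torque command driving the joint toward 45 degrees
```

The SRSV's design goal maps onto this structure: compute joint torques for the desired tip motion while letting the base float, instead of firing thrusters to hold the base fixed.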

  5. United States Military Space: Into the Twenty-First Century

    Science.gov (United States)

    2002-01-01

    Carl Sagan was one of the most famous and articulate spokesmen for planetary science; his Pale Blue Dot: A Vision of the Human Future in Space (New York: Random House, 1994) is one expression of that vision. Sagan, who cofounded the Planetary Society in 1980, is a primary spokesman for those who view spaceflight in scientific and ecological terms and see it as a defining human characteristic. See also Spacefaring Civilization (New York: Jeremy P. Tarcher/Putnam, 1999).

  6. Twenty-first nuclear accident dosimetry intercomparison study, August 6-10, 1984

    International Nuclear Information System (INIS)

    Swaja, R.E.; Ragan, G.E.; Sims, C.S.

    1985-05-01

    The twenty-first in a series of nuclear accident dosimetry (NAD) intercomparison studies was conducted at the Oak Ridge National Laboratory's Dosimetry Applications Research Facility during August 6-10, 1984. The Health Physics Research Reactor, operated in the pulse mode, was used to simulate three criticality accidents with different radiation fields. Participants from five organizations measured neutron doses between 0.53 and 4.36 Gy and gamma doses between 0.19 and 1.01 Gy at area monitoring stations and on phantoms. About 75% of all neutron dose estimates based on foil activation, hair activation, simulated blood sodium activation, and thermoluminescent methods were within ±25% of reference values. Approximately 86% of all gamma results measured using thermoluminescent (TLD-700 or CaSO4) systems were within ±20% of reference doses, which represents a significant improvement over previous studies. Improvements in the ability of intercomparison participants to estimate neutron and gamma doses under criticality accident conditions can be partly attributed to experience in previous NAD studies, which have provided practical tests of dosimetry systems, enabled participants to improve evaluation methods, and standardized dose reporting conventions. 16 refs., 15 tabs
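
The acceptance criterion quoted above (estimates within a percentage band of the reference dose) is simple to express; a minimal sketch follows. The sample readings are made-up illustrations, not the 1984 intercomparison data:

```python
# Hedged sketch: fraction of dose estimates within +/-tol of a reference value.
def fraction_within(estimates, reference, tol=0.25):
    hits = [e for e in estimates if abs(e - reference) / reference <= tol]
    return len(hits) / len(estimates)

neutron_estimates = [0.45, 0.55, 0.60, 0.70]  # Gy, hypothetical readings
print(fraction_within(neutron_estimates, reference=0.53))  # prints 0.75
```

Scoring each method's estimates this way against the reference doses yields summary figures like the 75% (neutron, ±25%) and 86% (gamma, ±20%) reported in the study.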

  7. Technological sciences society of the twenty-first century

    International Nuclear Information System (INIS)

    1999-04-01

    This book introduces the information-oriented society of the twenty-first century, connected by computer networks. Topics include F-RAM, a "memory of dreams"; new media in the information-oriented society; ISDN, the communications network of the next generation, and what ISDN is; the development of the information service industry; the path from office automation to the intelligent building of the future; home shopping and home banking; and the obstacles that hinder the information-oriented society.

  8. Designing Vaccines for the Twenty-First Century Society

    OpenAIRE

    Finco, Oretta; Rappuoli, Rino

    2014-01-01

    The history of vaccination clearly demonstrates that vaccines have been highly successful in preventing infectious diseases, reducing significantly the incidence of childhood diseases and mortality. However, many infections are still not preventable with the currently available vaccines and they represent a major cause of mortality worldwide. In the twenty-first century, the innovation brought by novel technologies in antigen discovery and formulation together with a deeper knowledge of the h...

  9. Book Review: Africa and Europe in the Twenty-First Century ...

    African Journals Online (AJOL)

    Abstract. Title: Africa and Europe in the Twenty-First Century. Author: Osita C. Eze and Amadu Sesay. Publisher: Nigerian Institute of International Affairs, 2010, xvi + 397pp, Tables, Index. ISBN: 978-002-102-7 ...

  10. China's iGeneration - Cinema and Moving Image Culture for the Twenty-First Century

    OpenAIRE

    Johnson, Matthew D.; Wagner, Keith B.; Yu, Tianqui; Vulpiani, Luke

    2014-01-01

    This innovative collection of essays on twenty-first century Chinese cinema and moving image culture features contributions from an international community of scholars, critics, and practitioners. Taken together, their perspectives make a compelling case that the past decade has witnessed a radical transformation of conventional notions of cinema. Following China's accession to the WTO in 2001, personal ...

  11. Critical Remarks on Piketty's 'Capital in the Twenty-first Century'

    OpenAIRE

    Homburg, Stefan

    2014-01-01

    This paper discusses the central macroeconomic claims that are made in Thomas Piketty's book 'Capital in the Twenty-first Century'. The paper aims to show that Piketty's contentions are not only logically flawed but also contradicted by his own data.

  12. Changes of climate regimes during the last millennium and the twenty-first century simulated by the Community Earth System Model

    Science.gov (United States)

    Huang, Wei; Feng, Song; Liu, Chang; Chen, Jie; Chen, Jianhui; Chen, Fahu

    2018-01-01

    This study examines shifts in terrestrial climate regimes using the Köppen-Trewartha (K-T) climate classification, by analyzing the Community Earth System Model Last Millennium Ensemble (CESM-LME) simulations for the period 850-2005 and the CESM Medium Ensemble (CESM-ME), CESM Large Ensemble (CESM-LE) and CESM with fixed aerosols Medium Ensemble (CESM-LE_FixA) simulations for the period 1920-2080. We compare K-T climate types from the Medieval Climate Anomaly (MCA) (950-1250) with the Little Ice Age (LIA) (1550-1850), from the present day (PD) (1971-2000) with the last millennium (LM) (850-1850), and from the future (2050-2080) with the LM, in order to place anthropogenic changes in the context of changes due to natural forcings during the last millennium. For CESM-LME, we focused on the simulations with all forcings, though the impacts of individual forcings (e.g., solar activity, volcanic eruptions, greenhouse gases, aerosols and land use changes) were also analyzed. We found that climate types changed only slightly between the MCA and the LIA, owing to weak changes in temperature and precipitation. The climate type changes in the PD relative to the last millennium have been driven largely by greenhouse gas-induced warming, but anthropogenic aerosols have also played an important role on regional scales. At the end of the twenty-first century, anthropogenic forcing has a much greater effect on climate types than in the PD, and as aerosol emissions are reduced, greenhouse gases will further promote global warming. Changes in climate types are dominated by greenhouse gas-induced warming rather than by changes in precipitation. The large shift in climate types by the end of this century suggests a possible widespread redistribution of surface vegetation and a significant change in species distributions.
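
A classification of the K-T kind maps monthly temperature and precipitation normals to a discrete climate type, so regime shifts can be detected as changes in the assigned type between periods. The sketch below shows only a few types with deliberately simplified thresholds (the published scheme uses a temperature-dependent dryness criterion, among other refinements), so it is illustrative only:

```python
# Hedged sketch of a Koppen-Trewartha-style lookup; thresholds simplified.
def kt_type(monthly_temp_c, annual_precip_mm):
    warm_months = sum(1 for t in monthly_temp_c if t >= 10.0)
    coldest = min(monthly_temp_c)
    if coldest >= 18.0:
        return "A (tropical)"
    if annual_precip_mm < 250:  # crude dryness cut-off; the real scheme uses a
        return "B (dry)"        # temperature-dependent precipitation threshold
    if warm_months >= 8:
        return "C (subtropical)"
    if warm_months >= 4:
        return "D (temperate)"
    if warm_months >= 1:
        return "E (boreal)"
    return "F (polar)"

temps = [-5, -3, 2, 8, 14, 19, 22, 21, 16, 9, 3, -2]  # mid-latitude normals
print(kt_type(temps, annual_precip_mm=800))  # prints "D (temperate)"
```

Running such a classifier on two periods' climatologies and differencing the resulting type maps is the basic operation behind the MCA-vs-LIA and future-vs-LM comparisons described above.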

  13. Guidelines to design engineering education in the twenty-first century for supporting innovative product development

    Science.gov (United States)

    Violante, Maria Grazia; Vezzetti, Enrico

    2017-11-01

    In the twenty-first century, meeting our technological challenges demands educational excellence: a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills necessary to succeed in life, career, and citizenship. The purpose of this paper is to understand how twenty-first century knowledge and skills can be appropriately embedded in engineering education aimed at innovative product development using additive manufacturing (AM). The study designs a learning model by which to achieve effective AM education, to address the requirements of the twenty-first century and to offer students the opportunity to experiment with STEM (science, technology, engineering, and mathematics) concepts. The study is conducted using the quality function deployment (QFD) methodology.

  14. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    Energy Technology Data Exchange (ETDEWEB)

    Emily Snyder; John Drake; Ryan James

    2012-02-01

    A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') type of contamination. There are many different scenarios for how RDD contamination may be spread, but the most commonly used one at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects and has continued through the NHSRC series of decontamination trials and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, cation-exchange capacity of the material and the amount of dirt and debris on the surface are very important factors

  15. Energy content of stormtime ring current from phase space mapping simulations

    International Nuclear Information System (INIS)

    Chen, M.W.; Schulz, M.; Lyons, L.R.

    1993-01-01

    The authors perform a model study to account for the increase in energy content of the trapped-particle population that occurs during the main phase of major geomagnetic storms. They consider stormtime particle transport in the equatorial region of the magnetosphere. They start with a phase space distribution of the ring current before the storm, created by a steady-state transport model. They then use a previously developed guiding-center particle simulation to map the stormtime ring current phase space, following Liouville's theorem. This model is able to account for the ten- to twentyfold increase in energy content of magnetospheric ions during the storm
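    The mapping rests on Liouville's theorem: phase-space density is conserved along particle trajectories, so the stormtime distribution follows by tracing each phase-space point back along its trajectory to the pre-storm state. A minimal sketch of that bookkeeping (the `backtrace` helper is hypothetical; in the paper the trajectories come from the guiding-center simulation):

    ```python
    def map_distribution(f_initial, backtrace, final_points):
        """Liouville mapping: phase-space density f is conserved along
        trajectories, so the stormtime value at a point z equals the
        pre-storm distribution evaluated at the trajectory's origin."""
        return [f_initial(backtrace(z)) for z in final_points]
    ```

    Energy content then follows by weighting the mapped distribution with particle energy and integrating over phase space.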

  16. Guidelines to Design Engineering Education in the Twenty-First Century for Supporting Innovative Product Development

    Science.gov (United States)

    Violante, Maria Grazia; Vezzetti, Enrico

    2017-01-01

    In the twenty-first century, meeting our technological challenges demands educational excellence, a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills…

  17. The conundrum of religious schools in twenty-first-century Europe

    NARCIS (Netherlands)

    Merry, M.S.

    2015-01-01

    In this paper Merry examines in detail the continued - and curious - popularity of religious schools in an otherwise ‘secular’ twenty-first century Europe. To do this he considers a number of motivations underwriting the decision to place one's child in a religious school and delineates what are

  18. Projected Changes on the Global Surface Wave Drift Climate towards the END of the Twenty-First Century

    Science.gov (United States)

    Carrasco, Ana; Semedo, Alvaro; Behrens, Arno; Weisse, Ralf; Breivik, Øyvind; Saetra, Øyvind; Håkon Christensen, Kai

    2016-04-01

    The global wave-induced current (the Stokes drift, SD) is an important feature of the ocean surface, with mean values close to 10 cm/s along the extra-tropical storm tracks in both hemispheres. Besides the horizontal displacement of large volumes of water, the SD also plays an important role in the turbulence structure of the ocean mixed layer, particularly in stormy or high wind speed areas. The role of wave-induced currents in the ocean mixed layer and in the sea surface temperature (SST) is currently a hot topic of air-sea interaction research, from forecast to climate ranges. The SD is mostly driven by wind-sea waves and is highly sensitive to changes in the overlying wind speed and direction. The impact of climate change on the global wave-induced current climate will be presented. The wave model WAM has been forced by the global climate model (GCM) ECHAM5 wind speed (at 10 m height) and ice, for present-day and potential future climate conditions towards the end of the twenty-first century, represented by the Intergovernmental Panel on Climate Change (IPCC) CMIP3 (Coupled Model Inter-comparison Project phase 3) A1B greenhouse gas emission scenario (usually referred to as a "medium-high emissions" scenario). Several wave parameters were stored as output in the WAM model simulations, including the wave spectra. The wave spectra, at 6-hourly temporal and 0.5°×0.5° spatial resolution, were used to compute the SD global climate of two 32-yr periods, representative of the end of the twentieth (1959-1990) and twenty-first (2069-2100) centuries. Comparisons of the present-climate run with the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-40 reanalysis are used to assess the capability of the WAM-ECHAM5 runs to produce realistic SD results. This study is part of the WRCP-JCOMM COWCLIP (Coordinated Ocean Wave Climate Project) effort.
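    For a single monochromatic deep-water wave component, the textbook surface Stokes drift is U_s = ω k a² with the dispersion relation k = ω²/g; the full SD climate is obtained by integrating such contributions over the 2-D wave spectrum. A minimal sketch of the one-component case (illustrative only, not the WAM diagnostic):

    ```python
    import math

    def stokes_drift_surface(amplitude_m, period_s, g=9.81):
        """Surface Stokes drift (m/s) of one monochromatic deep-water wave:
        U_s = omega * k * a**2, with deep-water dispersion k = omega**2 / g."""
        omega = 2.0 * math.pi / period_s   # angular frequency (rad/s)
        k = omega ** 2 / g                 # deep-water wavenumber (1/m)
        return omega * k * amplitude_m ** 2
    ```

    A 1 m amplitude, 10 s period wave gives roughly 2.5 cm/s, the same order as the storm-track mean values quoted above.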

  19. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    Science.gov (United States)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial and necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing three simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will enormously advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal

  20. Accelerators for the twenty-first century a review

    CERN Document Server

    Wilson, Edmund J N

    1990-01-01

    The development of the synchrotron, and later the storage ring, was based upon the electrical technology at the turn of this century, aided by the microwave radar techniques of World War II. This method of acceleration seems to have reached its limit. Even superconductivity is not likely to lead to devices that will satisfy physics needs into the twenty-first century. Unless a new principle for accelerating elementary particles is discovered soon, it is difficult to imagine that high-energy physics will continue to reach out to higher energies and luminosities.

  1. Proceedings of the twenty-first LAMPF users group meeting

    International Nuclear Information System (INIS)

    1988-04-01

    The Twenty-First Annual LAMPF Users Group Meeting was held November 9-10, 1987, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities

  2. NATO’s Relevance in the Twenty-First Century

    Science.gov (United States)

    2012-03-22

    Christopher Coker, Globalisation and Insecurity in the Twenty-first Century: NATO and the Management of Risk (The International Institute for Strategic Studies)

  3. Twenty-first workshop on geothermal reservoir engineering: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-01-26

    PREFACE The Twenty-First Workshop on Geothermal Reservoir Engineering was held at the Holiday Inn, Palo Alto on January 22-24, 1996. There were one hundred fifty-five registered participants. Participants came from twenty foreign countries: Argentina, Austria, Canada, Costa Rica, El Salvador, France, Iceland, Indonesia, Italy, Japan, Mexico, The Netherlands, New Zealand, Nicaragua, the Philippines, Romania, Russia, Switzerland, Turkey and the UK. The performance of many geothermal reservoirs outside the United States was described in several of the papers. Professor Roland N. Horne opened the meeting and welcomed visitors. The keynote speaker was Marshall Reed, who gave a brief overview of the Department of Energy's current plan. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions concerning: reservoir assessment, modeling, geology/geochemistry, fracture modeling, hot dry rock, geoscience, low enthalpy, injection, well testing, drilling, adsorption and stimulation. Session chairmen were major contributors to the workshop, and we thank: Ben Barker, Bobbie Bishop-Gollan, Tom Box, Jim Combs, John Counsil, Sabodh Garg, Malcolm Grant, Marcelo Lippmann, Jim Lovekin, John Pritchett, Marshall Reed, Joel Renner, Subir Sanyal, Mike Shook, Alfred Truesdell and Ken Williamson. Jim Lovekin gave the post-dinner speech at the banquet and highlighted the exciting developments in the geothermal field which are taking place worldwide. The Workshop was organized by the Stanford Geothermal Program faculty, staff, and graduate students. We wish to thank our students who operated the audiovisual equipment. Shaun D. Fitzgerald, Program Manager.

  4. Twenty-first century learning for teachers: helping educators bring new skills into the classroom.

    Science.gov (United States)

    Wilson, John I

    2006-01-01

    The motivation behind every educator's dedication and hard work in the classroom is the knowledge that his or her teaching will result in students' success in life. Educators are committed to implementing twenty-first century skills; they have no question that students need such skills to be equipped for life beyond school. Members of the National Education Association are enthusiastic about the Partnership for 21st Century Skills framework, yet express frustration that many schools do not have adequate resources to make the necessary changes. Teaching these skills poses significant new responsibilities for schools and educators. To make it possible for teachers to build twenty-first century skills into the curriculum, physical and policy infrastructures must exist, professional development and curriculum materials must be offered, and meaningful assessments must be available. With an established understanding of what skills need to be infused into the classroom (problem solving, analysis, and communications) and educators' commitment to the new skill set, this chapter explores how to make such a dramatic reform happen. The author discusses existing strategies that will guide educators in infusing twenty-first century skills into traditional content areas such as math, English, geography, and science. Ultimately, public policy regarding educational standards, professional development, assessments, and physical school structures must exist to enable educators to employ twenty-first century skills, leading to student success in contemporary life. Any concern about the cost of bringing this nation's educational system up to par internationally should be offset by the price that not making twenty-first century skills a priority in the classroom will have on future economic well-being.

  5. A Critical Feminist and Race Critique of Thomas Piketty's "Capital in the Twenty-First Century"

    Science.gov (United States)

    Moeller, Kathryn

    2016-01-01

    Thomas Piketty's "Capital in the Twenty-first Century" documents the foreboding nature of rising wealth inequality in the twenty-first century. In an effort to promote a more just and democratic global society and rein in the unfettered accumulation of wealth by the few, Piketty calls for a global progressive annual tax on corporate…

  6. Humanities: The Unexpected Success Story of the Twenty-First Century

    Science.gov (United States)

    Davis, Virginia

    2012-01-01

    Humanities within universities faced challenges in the latter half of the twentieth century as their value in the modern world was questioned. This paper argues that there is strong potential for the humanities to thrive in the twenty-first century university sector. It outlines some of the managerial implications necessary to ensure that this…

  7. Simulation of space charge effects in a synchrotron

    International Nuclear Information System (INIS)

    Machida, Shinji; Ikegami, Masanori

    1998-01-01

    We have studied space charge effects in a synchrotron with multi-particle tracking in 2-D and 3-D configuration space (4-D and 6-D phase space, respectively). First, we describe the modelling of space charge fields in the simulation and the tracking procedure; several ways of presenting tracking results are also mentioned. Secondly, as a demonstration of the simulation study, we discuss how the coherent modes of a beam play a major role in beam stability and the intensity limit; the incoherent tune in a resonance condition should be replaced by the coherent tune. Finally, we consider the coherent motion of a beam core as a driving force of halo formation. The mechanism is familiar in linacs, and we apply it to a synchrotron
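    As a toy illustration of the thin-lens style of multi-particle tracking described here (a sketch with invented parameters, not the authors' 4-D/6-D model), the loop below alternates a linear one-turn betatron rotation in one transverse plane with a pairwise repulsive kick standing in for the space-charge force; `ksc` is a hypothetical kick strength:

    ```python
    import math

    def track_with_space_charge(parts, tune, ksc, nturns):
        """Track (x, x') pairs: linear betatron rotation once per turn,
        then a thin-lens pairwise repulsive 'space-charge' kick on x'."""
        mu = 2.0 * math.pi * tune
        c, s = math.cos(mu), math.sin(mu)
        for _ in range(nturns):
            # one-turn linear map in normalized coordinates
            parts = [(c * x + s * xp, -s * x + c * xp) for x, xp in parts]
            # pairwise kick: 1/r field of a line charge, pushing particles apart
            kicked = []
            for i, (x, xp) in enumerate(parts):
                dxp = sum(ksc / (x - xj) for j, (xj, _) in enumerate(parts)
                          if j != i and x != xj)
                kicked.append((x, xp + dxp))
            parts = kicked
        return parts
    ```

    With `ksc = 0` the map reduces to pure betatron motion; increasing `ksc` shifts the coherent oscillation frequency of the ensemble, which is the kind of coherent-versus-incoherent tune effect the abstract discusses.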

  8. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.

    1994-04-01

    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25--27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Individual papers have been cataloged separately. This document, Volume 2, presents papers on severe accident research

  9. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.

    1994-04-01

    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database

  10. Twenty-first century learning in schools: A case study of New Technology High School in Napa, California.

    Science.gov (United States)

    Pearlman, Bob

    2006-01-01

    The most pertinent question concerning teaching and learning in the twenty-first century is not what knowledge and skills students need--that laundry list was identified over a decade ago--but rather how to foster twenty-first century learning. What curricula, experiences, assessments, environments, and technology best support twenty-first century learning? New Technology High School (NTHS) in Napa, California, is one example of a successful twenty-first century school. In this chapter, the author describes the components of this exemplary high school, illustrating an environment that will cultivate twenty-first century student learning. New Technology High School began by defining eight learning outcomes, aligned with the standards of the Partnership for 21st Century Skills; to graduate, students demonstrate mastery of these outcomes through an online portfolio. To help students achieve the outcomes, NTHS employs project- and problem-based learning. Whereas in traditional classrooms students work alone on short-term assignments that do not lend themselves to deep understanding, the project-based learning approach has students working in teams on long-term, in-depth, rigorous projects. Students' work is supported by the school's workplace-like environment and effective use of technology. Meaningful assessment is essential to project-based learning; students receive continuous feedback, helping them become self-directed learners. In fact, NTHS uses outcome-based grading through which students constantly know how they are performing on the twenty-first century outcomes. Research has shown that NTHS graduates are better prepared for postsecondary education, careers, and citizenship than their peers from other schools. To facilitate twenty-first century learning, all schools need to rethink their approach to teaching and learning. New Technology High School is one way to do so.

  11. Space changes after premature loss of the mandibular primary first molar: a longitudinal study.

    Science.gov (United States)

    Lin, Y T; Chang, L C

    1998-01-01

    The purpose of this study was to evaluate the space changes after premature loss of the primary mandibular first molar. Twenty-one children (12 boys and 9 girls) with premature loss of the primary mandibular first molar were selected from the children's dental clinic for this study. Ages ranged from 5.1 to 7.2 years, with an average of 6 years and 11 months. Mandibular study casts were made from alginate impressions at the initial examination and at a follow-up examination eight months later. Four measurements, including the D+E (first and second primary molar) space, arch width, arch length and arch perimeter, were compared between the initial examination and the follow-up examination eight months later. The D+E space of intact primary molars served as a control. The results showed that the D+E space on the extraction side at the eight-month follow-up examination was significantly shorter than on the control side (p = 0.025) and less than the initial D+E space (p 0.05). It is concluded that the space change after the eruption of the first permanent molar in the mandible is mostly distal movement of the primary cuspid during the early stage of premature loss of the primary first molar.

  12. Theoretical Contexts and Conceptual Frames for the Study of Twenty-First Century Capitalism

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Morgan, Glenn

    2012-01-01

    This chapter argues that the comparative institutionalist approach requires rethinking in the light of developments in the twenty-first century. The chapter emphasizes the following features of the new environment: first, the rise of the BRIC and the emerging economies; secondly, the changed...

  13. TPACK Updated to Measure Pre-Service Teachers' Twenty-First Century Skills

    Science.gov (United States)

    Valtonen, Teemu; Sointu, Erkko; Kukkonen, Jari; Kontkanen, Sini; Lambert, Matthew C.; Mäkitalo-Siegl, Kati

    2017-01-01

    Twenty-first century skills have attracted significant attention in recent years. Students of today and the future are expected to have the skills necessary for collaborating, problem solving, creative and innovative thinking, and the ability to take advantage of information and communication technology (ICT) applications. Teachers must be…

  14. Why American business demands twenty-first century skills: an industry perspective.

    Science.gov (United States)

    Bruett, Karen

    2006-01-01

    Public education is the key to individual and business prosperity. With a vested stake in education, educators, employers, parents, policymakers, and the public should question how this nation's public education system is faring. Knowing that recent international assessments have shown little or no gains in American students' achievement, the author asserts the clear need for change. As both a large American corporate employer and a provider of technology for schools, Dell is concerned with ensuring that youth will thrive in their adult lives. Changing workplace expectations lead to a new list of skills students will need to acquire before completing their schooling. Through technology, Dell supports schools in meeting educational goals, striving to supply students with the necessary skills, referred to as twenty-first century skills. The Partnership for 21st Century Skills, of which Dell is a member, has led an initiative to define what twenty-first century learning should entail. Through extensive research, the partnership has built a framework outlining twenty-first century skills: analytical thinking, communication, collaboration, global awareness, and technological and economic literacy. Dell and the partnership are working state by state to promote the integration of these skills into curricula, professional development for teachers, and classroom environments. The author describes two current initiatives, one in Virginia, the other in Texas, which both use technology to help student learning. All stakeholders can take part in preparing young people to compete in the global economy. Educators and administrators, legislators, parents, and employers must play their role in helping students be ready for what the workforce and the world have in store for them.

  15. The Dialectics of Discrimination in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    John Stone

    2007-12-01

    Full Text Available This article explores some of the latest developments in the scholarship on race relations and nationalism that seek to address the impact of globalization and the changed geo-political relations of the first decade of the twenty-first century. New patterns of identification, some of which challenge existing group boundaries and others that reinforce them, can be seen to flow from the effects of global market changes and the political counter-movements against them. The impact of the “war on terrorism”, the limits of the utility of hard power, and the need for new mechanisms of inter-racial and inter-ethnic conflict resolution are evaluated to emphasize the complexity of these group relations in the new world disorder.

  16. Twenty first century climate change as simulated by European climate models

    International Nuclear Information System (INIS)

    Cubasch, Ulrich

    2007-01-01

    Full text: Climate change simulation results for seven European state-of-the-art climate models, participating in the European research project ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts), will be presented. Models from Norway, France, Germany, Denmark, and Great Britain, representing a sub-ensemble of the models contributing to the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4), are included. Climate simulations are conducted with all the models for present-day climate and for future climate under the SRES A1B, A2, and B1 scenarios. The design of the simulations follows the guidelines of the IPCC AR4. The 21st century projections are compared to the corresponding present-day simulations. The ensemble mean global mean near surface temperature rise for the year 2099 compared to the 1961-1990 period amounts to 3.2 K for the A1B scenario, to 4.1 K for the A2 scenario, and to 2.1 K for the B1 scenario. The spatial patterns of temperature change are robust among the contributing models, with the largest temperature increase over the Arctic in boreal winter, stronger warming over land than over ocean, and little warming over the southern oceans. The ensemble mean globally averaged precipitation increases for the three scenarios (5.6%, 5.7%, and 3.8% for scenarios A1B, A2, and B1, respectively). The precipitation signals of the different models display a larger spread than the temperature signals. In general, precipitation increases in the Intertropical Convergence Zone and the mid- to high latitudes (most pronounced during the hemispheric winter) and decreases in the subtropics. Sea-level pressure decreases over the polar regions in all models and all scenarios, which is mainly compensated by a pressure increase in the subtropical highs. These changes imply an intensification of the Southern and Northern Annular Modes

  17. Thomas Piketty – The Adam Smith of the Twenty-First Century?

    Directory of Open Access Journals (Sweden)

    Jacob Dahl Rendtorff

    2014-11-01

    Full Text Available Piketty's book, Capital in the Twenty-First Century (2014), has become a worldwide bestseller. Two months after its publication, it had sold more than 200,000 copies, and this success will surely continue for a long time. Piketty has established a new platform to discuss political economy.

  18. Space plasma simulation chamber

    International Nuclear Information System (INIS)

    1986-01-01

    Scientific results of experiments and tests of instruments performed with the Space Plasma Simulation Chamber and its facility are reviewed in the following six categories. 1. Tests of instruments on board rockets, satellites and balloons. 2. Plasma wave experiments. 3. Measurements of plasma particles. 4. Optical measurements. 5. Plasma production. 6. Space plasma simulations. This facility has been managed by the Laboratory Space Plasma Committee since 1969 and used by scientists in cooperative programs with universities and institutes all over the country. A list of publications is attached. (author)

  19. The premature loss of primary first molars: space loss to molar occlusal relationships and facial patterns.

    Science.gov (United States)

    Alexander, Stanley A; Askari, Marjan; Lewis, Patricia

    2015-03-01

    To investigate space changes with the premature loss of primary first molars and their relationship to permanent molar occlusion and facial forms. Two hundred twenty-six participants (ranging in age from 7 years 8 months to 8 years 2 months; 135 female, 91 male) met all inclusion criteria designed to study space loss as a result of the premature loss of the primary first molar. After 9 months, space loss was evaluated in relationship to molar occlusion and facial form. Statistical evaluation was performed with the paired t-test and with a two-way analysis of variance for independent groups. Patients with leptoprosopic facial form and end-on molar occlusions all exhibited a statistically significant difference when compared to controls in terms of space loss (P molar occlusion displayed space loss as well (P molar occlusion displayed space loss in the maxilla (P molar occlusions showed no significant difference in space loss. The relationship between the first permanent molar occlusion and facial form of the child has an influence on the loss of space at the primary first molar site.

  20. Twenty First Century Education: Transformative Education for Sustainability and Responsible Citizenship

    Science.gov (United States)

    Bell, David V. J.

    2016-01-01

    Many ministries of education focus on twenty-first century education but unless they are looking at this topic through a sustainability lens, they will be missing some of its most important elements. The usual emphasis on developing skills for employability in the current global economy begs the question whether the global economy is itself…

  1. How Do Students Value the Importance of Twenty-First Century Skills?

    Science.gov (United States)

    Ahonen, Arto Kalevi; Kinnunen, Päivi

    2015-01-01

    Frameworks of twenty-first century skills have attained a central role in school development and curriculum changes all over the world. There is a common understanding of the need for meta-skills such as problem solving, reasoning, collaboration, and self-regulation. This article presents results from a Finnish study, in which 718 school pupils…

  2. Assessing twenty-first century skills through a teacher created video game for high school biology students

    Science.gov (United States)

    Annetta, Leonard A.; Cheng, Meng-Tzu; Holmes, Shawn

    2010-07-01

    As twenty-first century skills become a greater focus in K-12 education, an infusion of technology that meets the needs of today's students is paramount. This study looks at the design and creation of a Multiplayer Educational Gaming Application (MEGA) for high school biology students. The quasi-experimental, qualitative design assessed the twenty-first century skills of digital age literacy, inventive thinking, high productivity, and effective communication techniques of the students exposed to a MEGA. Three factors, as they pertained to these skills, emerged from classroom observations. Interaction with the teacher, discussion with peers, and engagement/time-on-task while playing the MEGA suggested that students playing an educational video game exhibited all of the projected twenty-first century skills while being engrossed in the embedded science content.

  3. Twenty-first-century medical microbiology services in the UK.

    Science.gov (United States)

    Duerden, Brian

    2005-12-01

    With infection once again a high priority for the UK National Health Service (NHS), the medical microbiology and infection-control services require increased technology resources and more multidisciplinary staff. Clinical care and health protection need a coordinated network of microbiology services working to consistent standards, provided locally by NHS Trusts and supported by the regional expertise and national reference laboratories of the new Health Protection Agency. Here, I outline my thoughts on the need for these new resources and the ways in which clinical microbiology services in the UK can best meet the demands of the twenty-first century.

  4. Science Teacher Education in the Twenty-First Century: a Pedagogical Framework for Technology-Integrated Social Constructivism

    Science.gov (United States)

    Barak, Miri

    2017-04-01

    Changes in our global world have shifted the skill demands from acquisition of structured knowledge to mastery of skills, often referred to as twenty-first century competencies. Given these changes, a sequential explanatory mixed methods study was undertaken to (a) examine predominant instructional methods and technologies used by teacher educators, (b) identify attributes for learning and teaching in the twenty-first century, and (c) develop a pedagogical framework for promoting meaningful usage of advanced technologies. Quantitative and qualitative data were collected via an online survey, personal interviews, and written reflections with science teacher educators and student teachers. Findings indicated that teacher educators do not provide sufficient models for the promotion of reform-based practice via web 2.0 environments, such as Wikis, blogs, social networks, or other cloud technologies. Findings also indicated four attributes for teaching and learning in the twenty-first century: (a) adapting to frequent changes and uncertain situations, (b) collaborating and communicating in decentralized environments, (c) generating data and managing information, and (d) releasing control by encouraging exploration. Guided by social constructivist paradigms and twenty-first century teaching attributes, this study suggests a pedagogical framework for fostering meaningful usage of advanced technologies in science teacher education courses.

  5. CLARREO shortwave observing system simulation experiments of the twenty-first century: Simulator design and implementation

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, D.R.; Algieri, C.A.; Ong, J.R.; Collins, W.D.

    2011-04-01

Projected changes in the Earth system will likely be manifested in changes in reflected solar radiation. This paper introduces an operational Observing System Simulation Experiment (OSSE) to calculate the signals of future climate forcings and feedbacks in top-of-atmosphere reflectance spectra. The OSSE combines simulations from the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report for the NCAR Community Climate System Model (CCSM) with the MODTRAN radiative transfer code to calculate reflectance spectra for simulations of current and future climatic conditions over the 21st century. The OSSE produces narrowband reflectances and broadband fluxes, the latter of which have been extensively validated against archived CCSM results. The shortwave reflectance spectra contain atmospheric features including signals from water vapor, liquid and ice clouds, and aerosols. The spectra are also strongly influenced by the surface bidirectional reflectance properties of predicted snow and sea ice and the climatological seasonal cycles of vegetation. By comparing and contrasting simulated reflectance spectra based on emissions scenarios with increasing projected and fixed present-day greenhouse gas and aerosol concentrations, we find that prescribed forcings from increases in anthropogenic sulfate and carbonaceous aerosols are detectable and are spatially confined to lower latitudes. Also, changes in the intertropical convergence zone and poleward shifts in the subsidence zones and the storm tracks are all detectable along with large changes in snow cover and sea ice fraction. These findings suggest that the proposed NASA Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission to measure shortwave reflectance spectra may help elucidate climate forcings, responses, and feedbacks.
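The narrowband-to-broadband step this abstract mentions can be pictured as a spectral integration of reflectance against a solar weighting. The sketch below is a toy illustration only: the reflectance curve and the blackbody (5778 K) solar weighting are invented stand-ins, not the MODTRAN/CCSM configuration the OSSE actually uses.

```python
import numpy as np

# Toy narrowband-to-broadband conversion: integrate a spectral reflectance
# against a solar-like weighting to obtain one broadband albedo number.
# Both the reflectance curve and the 5778 K Planck weighting are
# illustrative assumptions, not the OSSE's radiative transfer setup.
wavelength_um = np.linspace(0.3, 2.5, 221)            # shortwave band, microns
reflectance = 0.25 + 0.05 * np.exp(-((wavelength_um - 0.55) / 0.2) ** 2)

h, c, kB, T = 6.626e-34, 2.998e8, 1.381e-23, 5778.0
lam = wavelength_um * 1e-6                            # wavelength in meters
solar_weight = (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

# Uniform wavelength grid, so a weighted mean stands in for the integral
broadband_albedo = np.sum(reflectance * solar_weight) / np.sum(solar_weight)
print(round(broadband_albedo, 3))
```

Because the solar weighting peaks near the reflectance bump at 0.55 microns, the broadband value comes out slightly above the 0.25 baseline.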

  6. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills

    Science.gov (United States)

    Morgan, Clara

    2016-01-01

    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  7. EXOGENOUS CHALLENGES FOR THE TOURISM INDUSTRY IN THE BEGINNING OF THE TWENTY FIRST CENTURY

    Directory of Open Access Journals (Sweden)

    Akosz Ozan

    2009-05-01

    Full Text Available Tourism is one of the fastest growing industries in the world. Besides its sustained growth the tourism industry has shown in the first years of the twenty first century that it can deal with political, military and natural disasters. The present paper ac

  8. Twenty-first Semiannual Report of the Commission to the Congress, January 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    The document represents the twenty-first semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period July - December 1956. A special part two of this semiannual report addresses specifically Radiation Safety in Atomic Energy Activities.

  9. Managing the twenty-first century reference department challenges and prospects

    CERN Document Server

    Katz, Linda S

    2014-01-01

    Learn the skills needed to update and manage a reference department that efficiently meets the needs of clients today?and tomorrow! Managing the Twenty-First Century Reference Department: Challenges and Prospects provides librarians with the knowledge and skills they need to manage an effective reference service. Full of useful and practical ideas, this book presents successful methods for recruiting and retaining capable reference department staff and management, training new employees and adapting current services to an evolving field. Expert practitioners address the changing role of the r

  10. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo

    2016-06-01

    Full Text Available Abstract: This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  11. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    Science.gov (United States)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC completely fulfills, or improves or augments, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated

  12. The Turn to Precarity in Twenty-First Century Fiction

    Directory of Open Access Journals (Sweden)

    Morrison Jago

    2014-01-01

    Full Text Available Recent years have seen several attempts by writers and critics to understand the changed sensibility in post-9/11 fiction through a variety of new -isms. This essay explores this cultural shift in a different way, finding a ‘turn to precarity’ in twenty-first century fiction characterised by a renewal of interest in the flow and foreclosure of affect, the resurgence of questions about vulnerability and our relationships to the other, and a heightened awareness of the social dynamics of seeing. The essay draws these tendencies together via the work of Judith Butler in Frames of War, in an analysis of Trezza Azzopardi’s quasi-biographical study of precarious life, Remember Me.

  13. Consideration of land-use and land-cover changes in the projection of climate extremes over North America by the end of the twenty-first century

    Science.gov (United States)

    Alexandru, Adelina

    2018-03-01

Changes in the essential climate extremes indices and surface variables for the end of the twenty-first century are assessed in this study based on two transient climate change simulations, with and without land-use and land-cover changes (LULCC) but identical atmospheric forcing. The two simulations are performed with the 5th generation of the Canadian Regional Climate Model (CRCM5) driven by the Canadian Earth System Model under the Representative Concentration Pathway 4.5 (RCP4.5) scenario for 2006-2100. For the simulation with LULCC, land-cover data sets are taken from the global change assessment model (GCAM) representing the RCP4.5 scenario for the period 2006-2100. LULCC in the RCP4.5 scenario suggest a significant reduction in cultivated land (e.g. the Canadian Prairies and the Mississippi basin) due to afforestation. CRCM5 climate projections imply a general warming by the end of the twenty-first century, especially over the northern regions in winter. CRCM5 projects more warm-spell days per year over most areas of the continent, and implicitly more summer days and tropical nights at the expense of cold-spell, frost and ice days, whose number is projected to decrease by up to 40% by the end of the twenty-first century with respect to the baseline period 1971-2000. Most land areas north of 45°N, in all seasons, as well as the southeastern United States in summer, exhibit increases in mean precipitation under the RCP4.5 scenario. In contrast, central parts of the continent in summer and much of Mexico in all seasons show reduced precipitation. In addition, large areas of North America exhibit changes of 10 to 40% (depending on the season and geographical location) in the number of heavy precipitation days. Results also suggest that the biogeophysical effects of LULCC on climate, assessed through differences between the two simulations, lead to warmer regional climates, especially in winter. The investigation of processes leading to this response shows high sensitivity of the
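The extremes indices named above (summer days, tropical nights, frost days, ice days) are simple threshold counts over daily temperature series. A sketch with synthetic data, where the thresholds follow the standard ETCCDI definitions and the temperature series itself is invented:

```python
import numpy as np

# Threshold-count extremes indices of the kind the abstract lists, computed
# from one synthetic year of daily data for a single grid point. The
# temperature series is invented; the thresholds are the ETCCDI values.
rng = np.random.default_rng(0)
day = np.arange(365)
tmax = 12.0 + 18.0 * np.sin(2 * np.pi * (day - 80) / 365) + rng.normal(0, 3, 365)
tmin = tmax - 8.0 - rng.normal(2, 1, 365)

summer_days     = int(np.sum(tmax > 25.0))   # days with Tmax above 25 C
tropical_nights = int(np.sum(tmin > 20.0))   # days with Tmin above 20 C
frost_days      = int(np.sum(tmin < 0.0))    # days with Tmin below 0 C
ice_days        = int(np.sum(tmax < 0.0))    # days with Tmax below 0 C
print(summer_days, tropical_nights, frost_days, ice_days)
```

In a model projection the same counts are taken per grid cell and per year, and changes are reported relative to a baseline period such as 1971-2000.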

  14. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century?

    Science.gov (United States)

    Bintz, Carol

    2016-01-01

    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  15. Building On Builder: The Persistent Icarus Syndrome at Twenty Years

    Science.gov (United States)

    2013-06-01

mission of the United States Air Force is to "fly, fight, and win…in air, space and cyberspace"--as an integral member of the Joint team that...Scenarios: A Military Futurist Explores War in the Twenty-First Century (New York: Bantam Books Trade Paperbacks, 2009), 17. 33 Carl H. Builder

  16. Interplanetary Transit Simulations Using the International Space Station

    Science.gov (United States)

    Charles, J. B.; Arya, Maneesh

    2010-01-01

    It has been suggested that the International Space Station (ISS) be utilized to simulate the transit portion of long-duration missions to Mars and near-Earth asteroids (NEA). The ISS offers a unique environment for such simulations, providing researchers with a high-fidelity platform to study, enhance, and validate technologies and countermeasures for these long-duration missions. From a space life sciences perspective, two major categories of human research activities have been identified that will harness the various capabilities of the ISS during the proposed simulations. The first category includes studies that require the use of the ISS, typically because of the need for prolonged weightlessness. The ISS is currently the only available platform capable of providing researchers with access to a weightless environment over an extended duration. In addition, the ISS offers high fidelity for other fundamental space environmental factors, such as isolation, distance, and accessibility. The second category includes studies that do not require use of the ISS in the strictest sense, but can exploit its use to maximize their scientific return more efficiently and productively than in ground-based simulations. In addition to conducting Mars and NEA simulations on the ISS, increasing the current increment duration on the ISS from 6 months to a longer duration will provide opportunities for enhanced and focused research relevant to long-duration Mars and NEA missions. Although it is currently believed that increasing the ISS crew increment duration to 9 or even 12 months will pose little additional risk to crewmembers, additional medical monitoring capabilities may be required beyond those currently used for the ISS operations. The use of the ISS to simulate aspects of Mars and NEA missions seems practical, and it is recommended that planning begin soon, in close consultation with all international partners.

  17. 2010 Critical Success Factors for the North Carolina Community College System. Twenty First Annual Report

    Science.gov (United States)

    North Carolina Community College System (NJ1), 2010

    2010-01-01

    First mandated by the North Carolina General Assembly in 1989 (S.L. 1989; C. 752; S. 80), the Critical Success Factors report has evolved into the major accountability document for the North Carolina Community College System. This twenty first annual report on the critical success factors is the result of a process undertaken to streamline and…

  18. A First Look at the Upcoming SISO Space Reference FOM

    Science.gov (United States)

    Mueller, Bjorn; Crues, Edwin Z.; Dexter, Dan; Garro, Alfredo; Skuratovskiy, Anton; Vankov, Alexander

    2016-01-01

Spaceflight is difficult, dangerous and expensive; human spaceflight even more so. In order to mitigate some of the danger and expense, professionals in the space domain have relied, and continue to rely, on computer simulation. Simulation is used at every level including concept, design, analysis, construction, testing, training and ultimately flight. As space systems have grown more complex, new simulation technologies have been developed, adopted and applied. Distributed simulation is one of those technologies. Distributed simulation provides a base technology for segmenting these complex space systems into smaller, and usually simpler, component systems or subsystems. This segmentation also supports the separation of responsibilities between participating organizations. This segmentation is particularly useful for complex space systems like the International Space Station (ISS), which is composed of many elements from many nations along with visiting vehicles from many nations. This is likely to be the case for future human space exploration activities. Over the years, a number of distributed simulations have been built within the space domain. While many use the High Level Architecture (HLA) to provide the infrastructure for interoperability, HLA without a Federation Object Model (FOM) is insufficient by itself to ensure interoperability. As a result, the Simulation Interoperability Standards Organization (SISO) is developing a Space Reference FOM. The Space Reference FOM Product Development Group is composed of members from several countries. They contribute experiences from projects within NASA, ESA and other organizations and represent government, academia and industry. The initial version of the Space Reference FOM is focusing on time and space and will provide the following: (i) a flexible positioning system using reference frames for arbitrary bodies in space, (ii) naming conventions for well-known reference frames, (iii) definitions of common time scales
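A "flexible positioning system using reference frames" can be pictured as a tree of frames, each carrying a rotation and translation relative to its parent, with positions resolved by walking to the root. The sketch below is a generic illustration of that idea with invented names; it is not the Space Reference FOM's actual object model.

```python
import numpy as np

# Generic reference-frame tree: each frame holds a rotation matrix and a
# translation vector relative to its parent. Frame and attribute names here
# are illustrative assumptions, not the FOM's defined classes.
class Frame:
    def __init__(self, name, parent=None, rotation=None, translation=None):
        self.name = name
        self.parent = parent
        self.rotation = np.eye(3) if rotation is None else np.asarray(rotation)
        self.translation = np.zeros(3) if translation is None else np.asarray(translation)

    def to_root(self, p):
        """Express point p (given in this frame) in the root frame's coordinates."""
        p = self.rotation @ np.asarray(p, dtype=float) + self.translation
        return self.parent.to_root(p) if self.parent else p

# Example: a vehicle-fixed frame offset 7000 km along x from an Earth-centered root
root = Frame("EarthCenteredInertial")
vehicle = Frame("VehicleFrame", parent=root, translation=[7.0e6, 0.0, 0.0])
print(vehicle.to_root([0.0, 10.0, 0.0]))   # a point 10 m off the vehicle
```

Chaining transforms through parents is what lets federates describe positions relative to arbitrary bodies while still interoperating in a shared root frame.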

  19. Comparative evaluation of twenty pilot workload assessment measures using a psychomotor task in a moving base aircraft simulator

    Science.gov (United States)

    Connor, S. A.; Wierwille, W. W.

    1983-01-01

A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three-degree-of-freedom moving-base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of wind-gust disturbance level and simulated aircraft pitch stability. Six instrument-rated pilots participated in four sessions lasting approximately three hours each.

  20. Fusion energy from the Moon for the twenty-first century

    International Nuclear Information System (INIS)

    Kulcinski, G.L.; Cameron, E.N.; Santarius, J.F.; Sviatoslavsky, I.N.; Wittenberg, L.J.; Schmitt, H.H.

    1992-01-01

    It is shown in this paper that the D-He-3 fusion fuel cycle is not only credible from a physics standpoint, but that its breakeven and ignition characteristics could be developed on roughly the same time schedule as the DT cycle. It was also shown that the extremely low fraction of power in neutrons, the lack of significant radioactivity in the reactants, and the potential for very high conversion efficiencies, can result in definite advantages for the D-He-3 cycle with respect to DT fusion and fission reactors in the twenty-first century. More specifically, the D-He-3 cycle can accomplish the following: (1) eliminate the need for deep geologic waste burial facilities and the wastes can qualify for Class A, near-surface land burial; (2) allow inherently safe reactors to be built that, under the worst conceivable accident, cannot cause a civilian fatality or result in a significant (greater than 100 mrem) exposure to a member of the public; (3) reduce the radiation damage levels to a point where no scheduled replacement of reactor structural components is required, i.e., full reactor lifetimes (approximately 30 FPY) can be credibly claimed; (4) increase the reliability and availability of fusion reactors compared to DT systems because of the greatly reduced radioactivity, the low neutron damage, and the elimination of T breeding; and (5) greatly reduce the capital costs of fusion power plants (compared to DT systems) by as much as 50 percent and present the potential for a significant reduction on the COE. The concepts presented in this paper tie together two of the most ambitious high-technology endeavors of the twentieth century: the development of controlled thermonuclear fusion for civilian power applications and the utilization of outer space for the benefit of mankind on Earth

  1. Development of space simulation / net-laboratory system

    Science.gov (United States)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we develop a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the Internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners to start simulation study without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.

  2. Strategic Leader Competencies for the Twenty-First Century

    National Research Council Canada - National Science Library

    Becker, Bradley A

    2007-01-01

...: interpersonal skills, conceptual skills, and technical skills. From these three primary strategic leadership skills, there is a list of twenty-one competencies that a strategic leader should possess...

  3. Catholic school governance in the twenty-first century: continuity, incongruity and challenge

    OpenAIRE

    Storr, Christopher John

    2007-01-01

    This study has two main aspects: first, it reports the results of a survey of ninety nine governors working in Roman Catholic primary and secondary schools situated in four English Catholic dioceses, and publishes hitherto unknown information about them; and, second, it examines how, in seeking to maintain a distinctive educational ethos, these governors are responding both to the legislative changes of the last twenty years, and to changes in English social and cultural attitudes. It shows h...

  4. Strategies for Teaching Maritime Archaeology in the Twenty First Century

    Science.gov (United States)

    Staniforth, Mark

    2008-12-01

    Maritime archaeology is a multi-faceted discipline that requires both theoretical learning and practical skills training. In the past most universities have approached the teaching of maritime archaeology as a full-time on-campus activity designed for ‘traditional’ graduate students; primarily those in their early twenties who have recently come from full-time undergraduate study and who are able to study on-campus. The needs of mature-age and other students who work and live in different places (or countries) and therefore cannot attend lectures on a regular basis (or at all) have largely been ignored. This paper provides a case study in the teaching of maritime archaeology from Australia that, in addition to ‘traditional’ on-campus teaching, includes four main components: (1) learning field methods through field schools; (2) skills training through the AIMA/NAS avocational training program; (3) distance learning topics available through CD-ROM and using the Internet; and (4) practicums, internships and fellowships. The author argues that programs to teach maritime archaeology in the twenty first century need to be flexible and to address the diverse needs of students who do not fit the ‘traditional’ model. This involves collaborative partnerships with other universities as well as government underwater cultural heritage management agencies and museums, primarily through field schools, practicums and internships.

  5. Planetary and Space Simulation Facilities (PSI) at DLR

    Science.gov (United States)

    Panitz, Corinna; Rabbow, E.; Rettberg, P.; Kloss, M.; Reitz, G.; Horneck, G.

    2010-05-01

The Planetary and Space Simulation facilities at DLR offer the possibility to expose biological and physical samples, individually or integrated into space hardware, to defined and controlled space conditions such as ultra-high vacuum, low temperature and extraterrestrial UV radiation. An X-ray facility is available for simulating the ionizing radiation component. All of the simulation facilities are required for the preparation of space experiments: for testing of the newly developed space hardware; for investigating the effect of different space parameters on biological systems as a preparation for the flight experiment; for performing the 'Experiment Verification Tests' (EVT) for the specification of the test parameters; and for 'Experiment Sequence Tests' (EST), simulating sample assembly, exposure to selected space parameters, and sample disassembly. To test the compatibility of the different biological and chemical systems and their adaptation to the opportunities and constraints of space conditions, an extensive ground-support program has been developed, among many others for the ESA facilities of the ongoing missions EXPOSE-R and EXPOSE-E on board the International Space Station (ISS). Several experiment verification tests (EVTs) and an experiment sequence test (EST) have been conducted in the carefully equipped and monitored planetary and space simulation facilities (PSI) of the Institute of Aerospace Medicine at DLR in Cologne, Germany. These ground-based pre-flight studies allowed the investigation of a much wider variety of samples and the selection of the most promising organisms for the flight experiment. EXPOSE-E was attached to the outer balcony of the European Columbus module of the ISS in February 2008 and stayed for 1.5 years in space; EXPOSE-R was attached to the Russian Zvezda module of the ISS in spring 2009, and the mission duration will be approximately 1.5 years. The missions will give new insights into the survivability of terrestrial

  6. Theory and Simulation of the Physics of Space Charge Dominated Beams

    International Nuclear Information System (INIS)

    Haber, Irving

    2002-01-01

    This report describes modeling of intense electron and ion beams in the space charge dominated regime. Space charge collective modes play an important role in the transport of intense beams over long distances. These modes were first observed in particle-in-cell simulations. The work presented here is closely tied to the University of Maryland Electron Ring (UMER) experiment and has application to accelerators for heavy ion beam fusion
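The collective modes referred to here are the kind of behavior a particle-in-cell (PIC) code captures. The sketch below is a minimal 1D electrostatic PIC loop (deposit charge, solve the field, push particles) in normalized units where the plasma frequency is 1; it is a generic illustration, not the simulation code used in this work or in the UMER studies.

```python
import numpy as np

# Minimal 1D electrostatic PIC: electrons on a periodic grid over a fixed
# neutralizing ion background, seeded with a small velocity perturbation
# that drives a plasma oscillation. Normalized units; illustrative only.
ng, L, npart, dt, steps = 64, 2 * np.pi, 20000, 0.05, 100
dx = L / ng
x = np.linspace(0, L, npart, endpoint=False)   # uniform electron positions
v = 0.01 * np.sin(x)                           # small velocity perturbation
w = L / npart                                  # macro-particle weight

k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
k[0] = 1.0                                     # placeholder; k=0 mode zeroed below

for _ in range(steps):
    # Deposit: nearest-grid-point charge assignment, ions minus electrons
    cell = (x / dx).astype(int) % ng
    ne = np.bincount(cell, minlength=ng) * w / dx
    rho = 1.0 - ne
    # Solve: Gauss's law dE/dx = rho in Fourier space (periodic boundary)
    Ek = np.fft.fft(rho) / (1j * k)
    Ek[0] = 0.0
    E = np.real(np.fft.ifft(Ek))
    # Push: electron acceleration a = -E (charge -1, unit mass)
    v += -E[cell] * dt
    x = (x + v * dt) % L
```

With the tiny initial perturbation the system simply oscillates at the plasma frequency; space-charge-dominated beam studies use the same loop with realistic beam distributions, focusing fields and far more particles.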

  7. Twenty-first Century Space Science in The Urban High School Setting: The NASA/John Dewey High School Educational Outreach Partnership

    Science.gov (United States)

    Fried, B.; Levy, M.; Reyes, C.; Austin, S.

    2003-05-01

    A unique and innovative partnership has recently developed between NASA and John Dewey High School, infusing Space Science into the curriculum. This partnership builds on an existing relationship with MUSPIN/NASA and their regional center at the City University of New York based at Medgar Evers College. As an outgrowth of the success and popularity of our Remote Sensing Research Program, sponsored by the New York State Committee for the Advancement of Technology Education (NYSCATE), and the National Science Foundation and stimulated by MUSPIN-based faculty development workshops, our science department has branched out in a new direction - the establishment of a Space Science Academy. John Dewey High School, located in Brooklyn, New York, is an innovative inner city public school with students of a diverse multi-ethnic population and a variety of economic backgrounds. Students were recruited from this broad spectrum, which covers the range of learning styles and academic achievement. This collaboration includes students of high, average, and below average academic levels, emphasizing participation of students with learning disabilities. In this classroom without walls, students apply the strategies and methodologies of problem-based learning in solving complicated tasks. The cooperative learning approach simulates the NASA method of problem solving, as students work in teams, share research and results. Students learn to recognize the complexity of certain tasks as they apply Earth Science, Mathematics, Physics, Technology and Engineering to design solutions. Their path very much follows the NASA model as they design and build various devices. Our Space Science curriculum presently consists of a one-year sequence of elective classes taken in conjunction with Regents-level science classes. This sequence consists of Remote Sensing, Planetology, Mission to Mars (NASA sponsored research program), and Microbiology, where future projects will be astronomy related. This

  8. Nuclear energy into the twenty-first century

    International Nuclear Information System (INIS)

    Hammond, G.P.

    1996-01-01

    The historical development of the civil nuclear power generation industry is examined in the light of the need to meet conflicting energy-supply and environmental pressures over recent decades. It is suggested that fission (thermal and fast) reactors will dominate the market up to the period 2010-2030, with fusion being relegated to the latter part of the twenty-first century. A number of issues affecting the use of nuclear electricity generation in Western Europe are considered including its cost, industrial strategy needs, and the public acceptability of nuclear power. The contribution of nuclear power stations to achieving CO2 targets aimed at relieving global warming is discussed in the context of alternative strategies for sustainable development, including renewable energy sources and energy-efficiency measures. Trends in the generation of nuclear electricity from fission reactors are finally considered in terms of the main geopolitical groupings that make up the world in the mid-1990s. Several recent, but somewhat conflicting, forecasts of the role of nuclear power in the fuel mix to about 2020 are reviewed. It is argued that the only major expansion in generating capacity will take place on the Asia-Pacific Rim and not in the developing countries generally. Nevertheless, the global nuclear industry overall will continue to be dominated by a small number of large nuclear electricity generating countries; principally the USA, France and Japan. (UK)

  9. Ecological restoration should be redefined for the twenty-first century.

    Science.gov (United States)

    Martin, David M

    2017-09-24

    Forty years ago, ecological restoration was conceptualized through a natural science lens. Today, ecological restoration has evolved into a social and scientific concept. The duality of ecological restoration is acknowledged in guidance documents on the subject but is not apparent in its definition. Current definitions reflect our views about what ecological restoration does but not why we do it. This viewpoint does not give appropriate credit to contributions from social sciences, nor does it provide compelling goals for people with different motivating rationales to engage in or support restoration. In this study, I give a concise history of the conceptualization and definition of ecological restoration, and I propose an alternative definition and corresponding viewpoint on restoration goal-setting to meet twenty-first century scientific and public inquiry.

  10. Space science in the twenty-first century: imperatives for the decades 1995 to 2015 : life sciences

    National Research Council Canada - National Science Library

    1988-01-01

    Early in 1984, NASA asked the Space Science Board to undertake a study to determine the principal scientific issues that the disciplines of space science would face during the period from about 1995 to 2015...

  11. Evolution and modulation of tropical heating from the last glacial maximum through the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Hoyos, Carlos D.; Webster, Peter J. [Georgia Institute of Technology, School of Earth and Atmospheric Sciences, Atlanta, GA (United States)

    2012-04-15

    Twentieth century observations show that during the last 50 years the sea-surface temperature (SST) of the tropical oceans has increased by ~0.5 C, and the area of SST >26.5 and 28 C (arbitrarily referred to as the oceanic warm pool: OWP) by 15 and 50% respectively, in association with an increase in greenhouse gas concentrations, with natural variability that is not yet understood, or with a combination of both. Based on CMIP3 projections, the OWP is projected to double during the twenty-first century in a moderate CO2 forcing scenario (IPCC A1B scenario). However, during the observational period the area of positive atmospheric heating (referred to as the dynamic warm pool, DWP) has remained constant. The threshold SST (T_H), which demarks the region of net heating and cooling, has increased from 26.6 C in the 1950s to 27.1 C in the last decade, and it is projected to increase to ~28.5 C by 2100. Based on climate model simulations, the area of the DWP is projected to remain constant during the twenty-first century. Analysis of the paleoclimate model intercomparison project (PMIP I and II) simulations for the Last Glacial Maximum and the Mid-Holocene periods shows a very similar behaviour, with a larger OWP in periods of elevated tropical SST and an almost constant DWP associated with a varying T_H. The constancy of the DWP area, despite shifts in the background SST, is shown to be the result of a near exact matching between increases in the integrated convective heating within the DWP and the integrated radiative cooling outside the DWP as SST changes. Although the area of the DWP remains constant, the total tropical atmospheric heating is a strong function of the SST. For example, the net heating has increased by about 10% from 1950 to 2000 and is projected to increase by a further 20% by 2100. Such changes must be compensated by a more vigorous atmospheric circulation, with growth in convective heating within the warm pool, and an
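
As an illustration of the warm-pool area diagnostic described above, the sketch below computes the area fraction of a gridded SST field above a threshold, weighting grid cells by the cosine of latitude as a stand-in for grid-cell area. The grid, field values, and function name are invented for the example and are not taken from the study.

```python
import numpy as np

def warm_pool_fraction(sst, lats, threshold=28.0):
    """Area fraction of a gridded SST field exceeding `threshold` (deg C).

    sst  : 2-D array (nlat, nlon) of sea-surface temperatures
    lats : 1-D array (nlat,) of grid latitudes in degrees
    Grid-cell area is approximated by a cos(latitude) weight.
    """
    weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(sst)
    mask = sst > threshold
    return weights[mask].sum() / weights.sum()

# Toy field: a 3 x 4 grid whose equatorial row exceeds the 28 C threshold
lats = np.array([-30.0, 0.0, 30.0])
sst = np.full((3, 4), 26.0)
sst[1, :] = 29.0
frac = warm_pool_fraction(sst, lats)
```

Tracking `frac` through time for a warming SST field, or raising `threshold` with time as the abstract does for T_H, reproduces the qualitative OWP-versus-DWP bookkeeping in a few lines.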

  12. 76 FR 21741 - Twenty-First Century Communications and Video Programming Accessibility Act; Announcement of Town...

    Science.gov (United States)

    2011-04-18

    ... equipment distribution program for people who are deaf-blind. In addition, the law will fill accessibility... Programming Accessibility Act; Announcement of Town Hall Meeting AGENCY: Federal Communications Commission... The Twenty-First Century Communications and Video Programming Accessibility Act (the Act or CVAA...

  13. The twenty-first century challenges to sexuality and religion.

    Science.gov (United States)

    Turner, Yolanda; Stayton, William

    2014-04-01

    Clergy and religious leaders are facing a wide variety of sexual needs and concerns within their faith communities. Conflicts over sexual issues are growing across the entire spectrum of religious denominations, and clerics remain ill prepared to deal with them. As religious communities work to remain influential in public policy debates, clergy and the institutions that train them need to be properly prepared for twenty-first century challenges that impact sexuality and religion. Clergy are often the first point of contact for the sexual problems and concerns of their faith community members: complex issues centered on morals, spirituality, and ethics. Yet there still exists a significant lack of sexual curricula in the programs that are educating our future religious leaders. The resulting paucity of knowledge leaves these leaders unprepared to address the needs and concerns of their congregants. However, with accurate, relevant human sexuality curricula integrated into theological formation programs, future leaders will be equipped to competently serve their constituencies. This paper provides a rationale for the need for such training, an overview of the faith- and theology-based history of a pilot training project, and a description of how the Christian faith and the social sciences intersect in the pilot training project's impetus and process.

  14. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students

    Science.gov (United States)

    Gokcek, Gigi; Howard, Alison

    2013-01-01

    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  15. Virtual reality: teaching tool of the twenty-first century?

    Science.gov (United States)

    Hoffman, H; Vu, D

    1997-12-01

    Virtual reality (VR) is gaining recognition for its enormous educational potential. While not yet in the mainstream of academic medical training, many prototype and first-generation VR applications are emerging, with target audiences ranging from first- and second-year medical students to residents in advanced clinical training. Visualization tools that take advantage of VR technologies are being designed to provide engaging and intuitive environments for learning visually and spatially complex topics such as human anatomy, biochemistry, and molecular biology. These applications present dynamic, three-dimensional views of structures and their spatial relationships, enabling users to move beyond "real-world" experiences by interacting with or altering virtual objects in ways that would otherwise be difficult or impossible. VR-based procedural and surgical simulations, often compared with flight simulators in aviation, hold significant promise for revolutionizing medical training. Already a wide range of simulations, representing diverse content areas and utilizing a variety of implementation strategies, are either under development or in their early implementation stages. These new systems promise to make broad-based training experiences available for students at all levels, without the risks and ethical concerns typically associated with using animal and human subjects. Medical students could acquire proficiency and gain confidence in the ability to perform a wide variety of techniques long before they need to use them clinically. Surgical residents could rehearse and refine operative procedures, using an unlimited pool of virtual patients manifesting a wide range of anatomic variations, traumatic wounds, and disease states. Those simulated encounters, in combination with existing opportunities to work with real patients, could increase the depth and breadth of learners' exposure to medical problems, ensure uniformity of training experiences, and enhance the

  16. Status Report of Simulated Space Radiation Environment Facility

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Phil Hyun; Nho, Young Chang; Jeun, Joon Pyo; Choi, Jae Hak; Lim, Youn Mook; Jung, Chan Hee; Jeon, Young Kyu

    2007-11-15

    The technology for performance testing and improvement of materials that are durable in the space environment is militarily sensitive, closely guarded, and strictly regulated in advanced countries such as the US and Russia; this core technology cannot easily be transferred to other countries. It is therefore the most fundamental and necessary research area for the successful establishment of a space environment system. Since the task of evaluating the effects of space radiation on space materials and components plays an important role in extending satellite lifetime and decreasing failure rates, it is necessary to establish a simulated space radiation facility and a systematic testing procedure. This report deals with the status of the technology for simulating space environment effects, including the effect of space radiation on space materials. This information, covering fundamental knowledge of the space environment and the research status of various countries regarding the simulation of space environment effects on space materials, will be useful for research on the radiation hardness of these materials. Furthermore, it will help developers of space materials make better material choices, reduce design cycle time, and improve safety.

  17. Status Report of Simulated Space Radiation Environment Facility

    International Nuclear Information System (INIS)

    Kang, Phil Hyun; Nho, Young Chang; Jeun, Joon Pyo; Choi, Jae Hak; Lim, Youn Mook; Jung, Chan Hee; Jeon, Young Kyu

    2007-11-01

    The technology for performance testing and improvement of materials that are durable in the space environment is militarily sensitive, closely guarded, and strictly regulated in advanced countries such as the US and Russia; this core technology cannot easily be transferred to other countries. It is therefore the most fundamental and necessary research area for the successful establishment of a space environment system. Since the task of evaluating the effects of space radiation on space materials and components plays an important role in extending satellite lifetime and decreasing failure rates, it is necessary to establish a simulated space radiation facility and a systematic testing procedure. This report deals with the status of the technology for simulating space environment effects, including the effect of space radiation on space materials. This information, covering fundamental knowledge of the space environment and the research status of various countries regarding the simulation of space environment effects on space materials, will be useful for research on the radiation hardness of these materials. Furthermore, it will help developers of space materials make better material choices, reduce design cycle time, and improve safety.

  18. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility

    Science.gov (United States)

    Moehlmann, D.; Kochan, H.

    1992-01-01

    The Space Simulator of the German Aerospace Research Establishment (DLR) at Cologne, formerly used for testing satellites, has been, since 1987, the central unit within the research sub-program 'Comet-Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct we gained experience in sample-handling under simulated space conditions. In broadening the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory-Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on the thermophysical properties of the Martian surface and on energy transport (and related gas transport) through the surface. These laboratory simulation activities can support space missions in the typical pre-mission fashion and during the mission itself, aiding experiment design and operations (simulation in parallel). Post-mission experiments for confirmation and interpretation of results are of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.

  19. Synthesis of Carbon Nano tubes: A Revolution in Material Science for the Twenty-First Century

    International Nuclear Information System (INIS)

    Allaf, Abd. W.

    2003-01-01

    The aim of this work is to explain the preparation procedures for single-walled carbon nanotubes using the arc discharge technique. The optimum conditions for carbon nanotube synthesis are given. It should be pointed out that this sort of material could become one of the defining materials of the twenty-first century.

  20. Proceedings: Twenty years of energy policy: Looking toward the twenty-first century

    International Nuclear Information System (INIS)

    1992-01-01

    In 1973, immediately following the Arab Oil Embargo, the Energy Resources Center, University of Illinois at Chicago initiated an innovative annual public service program called the Illinois Energy Conference. The objective was to provide a public forum each year to address an energy or environmental issue critical to the state, region and nation. Twenty years have passed since that inaugural program, and during that period we have covered a broad spectrum of issues including energy conservation, nuclear power, Illinois coal, energy policy options, natural gas, alternative fuels, new energy technologies, utility deregulation and the National Energy Strategy

  1. Proceedings: Twenty years of energy policy: Looking toward the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

    In 1973, immediately following the Arab Oil Embargo, the Energy Resources Center, University of Illinois at Chicago initiated an innovative annual public service program called the Illinois Energy Conference. The objective was to provide a public forum each year to address an energy or environmental issue critical to the state, region and nation. Twenty years have passed since that inaugural program, and during that period we have covered a broad spectrum of issues including energy conservation, nuclear power, Illinois coal, energy policy options, natural gas, alternative fuels, new energy technologies, utility deregulation and the National Energy Strategy.

  2. Leadership for Twenty-First-Century Schools and Student Achievement: Lessons Learned from Three Exemplary Cases

    Science.gov (United States)

    Schrum, Lynne; Levin, Barbara B.

    2013-01-01

    The purpose of this research was to understand ways exemplary award winning secondary school leaders have transformed their schools for twenty-first-century education and student achievement. This article presents three diverse case studies and identifies ways that each school's leader and leadership team reconfigured its culture and expectations,…

  3. Speaking American: Comparing Supreme Court and Hollywood Racial Interpretation in the Early Twenty-First Century

    Science.gov (United States)

    Hawkins, Paul Henry

    2010-01-01

    Apprehending that race is social, not biological, this study examines U.S. racial formation in the early twenty-first century. In particular, Hollywood and Supreme Court texts are analyzed as media for gathering, shaping and transmitting racial ideas. Representing Hollywood, the 2004 film "Crash" is analyzed. Representing the Supreme Court, the…

  4. Teaching and Learning in the Twenty-First Century: What Is an "Institute of Education" for?

    Science.gov (United States)

    Husbands, Chris

    2012-01-01

    As we begin the twenty-first century, schools and teachers are subject to enormous pressures for change. The revolution in digital technologies, the pressure to develop consistently high-performing schools systems, and the drive between excellence and equity all combine to raise profound questions about the nature of successful teaching and…

  5. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
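
The abstract does not spell out the 10 comparison tests, but one elementary check of this kind can be sketched: the maximum position divergence between two trajectories sampled at identical epochs. The function name, the toy circular orbit, and the injected phase error below are all hypothetical, for illustration only.

```python
import numpy as np

def max_position_divergence(traj_a, traj_b):
    """Largest position difference between two trajectories sampled at the
    same epochs. Each input is an (n, 3) array of positions; the result is
    in the same units as the inputs."""
    d = np.linalg.norm(np.asarray(traj_a) - np.asarray(traj_b), axis=1)
    return d.max()

# Toy example: two 'simulations' of one revolution of a circular orbit,
# the second with a 0.1% phase-rate error that grows over the revolution
t = np.linspace(0.0, 2.0 * np.pi, 200)
r = 7000.0  # km, hypothetical orbit radius
a = np.column_stack([r * np.cos(t), r * np.sin(t), np.zeros_like(t)])
b = np.column_stack([r * np.cos(1.001 * t), r * np.sin(1.001 * t), np.zeros_like(t)])
worst = max_position_divergence(a, b)
```

In practice such a metric would be evaluated per test case and compared against an agreed tolerance before two distributed simulations report joint results.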

  6. Twenty-first century learning after school: the case of Junior Achievement Worldwide.

    Science.gov (United States)

    Box, John M

    2006-01-01

    Efforts to increase after-school programming indicate the nation's concern about how youth are engaged during out-of-school time. There are clear benefits to extending the learning that goes on during the school day. Research from the U.S. Departments of Education and Justice shows that after-school participants do better in school and have stronger expectations for the future than youth who are not occupied after school. And the need is evident: 14.3 million students return to an empty house after school, yet only 6.5 million children are currently enrolled in after-school programs. If an after-school program were available, parents of 15.3 million children would enroll them. JA Worldwide began in 1919 and has been rooted in the after-school arena since its origins. Its after-school programs teach students about the free enterprise system through curriculum focusing on business, citizenship, economics, entrepreneurship, ethics and character, financial literacy, and career development. At the same time, JA Worldwide incorporates hands-on learning and engagement with adults as role models, both key elements of a successful after-school program. Now focused on developing curriculum emphasizing skills needed for the twenty-first century, JA has adopted the key elements laid out for after-school programs by the Partnership for 21st Century Skills. To ensure that the next generation of students enters the workforce prepared, America's education system must provide the required knowledge, skills, and attitudes. Programs such as JA Worldwide serve as models of how to provide the twenty-first century skills that all students need to succeed.

  7. Neurogenetics in Child Neurology: Redefining a Discipline in the Twenty-first Century.

    Science.gov (United States)

    Kaufmann, Walter E

    2016-12-01

    Increasing knowledge of the genetic etiology of pediatric neurologic disorders is affecting the practice of the specialty. I review here the history of pediatric neurologic disorder classification and the role of genetics in the process. I also discuss the concept of clinical neurogenetics, with its role in clinical practice, education, and research. Finally, I propose a flexible model for clinical neurogenetics in child neurology in the twenty-first century. In combination with disorder-specific clinical programs, clinical neurogenetics can become a home for complex clinical issues, a repository of genetic diagnostic advances, an educational resource, and a research engine in child neurology.

  8. Autonomous Robotic Weapons: US Army Innovation for Ground Combat in the Twenty-First Century

    Science.gov (United States)

    2015-05-21

    1 Introduction Today the robot is an accepted fact, but the principle has not been pushed far enough. In the twenty-first century the...2013, accessed March 29, 2015, http://www.bbc.com/news/magazine-21576376?print=true. 113 Steven Kotler, “Say Hello to Comrade Terminator: Russia’s...of autonomous robotic weapons, black-marketed directed energy weapons, and/or commercially available software, potential adversaries may find

  9. Watershed-scale response to climate change through the twenty-first century for selected basins across the United States

    Science.gov (United States)

    Hay, Lauren E.; Markstrom, Steven; Ward-Garrison, Christian D.

    2011-01-01

    The hydrologic response to different climate-change emission scenarios for the twenty-first century was evaluated in 14 basins from different hydroclimatic regions across the United States using the Precipitation-Runoff Modeling System (PRMS), a process-based, distributed-parameter watershed model. This study involves four major steps: 1) setup and calibration of the PRMS model in 14 basins across the United States by local U.S. Geological Survey personnel; 2) statistical downscaling of the World Climate Research Programme’s Coupled Model Intercomparison Project phase 3 climate-change emission scenarios to create PRMS input files that reflect these emission scenarios; 3) running PRMS for the climate-change emission scenarios for the 14 basins; and 4) evaluation of the PRMS output. This paper presents an overview of this project, details of the methodology, results from the 14 basin simulations, and an interpretation of these results. A key finding is that the hydrological response of the different geographical regions of the United States to potential climate change may be very different, depending on the dominant physical processes of that particular region. Also considered is the tremendous amount of uncertainty present in the climate emission scenarios and how this uncertainty propagates through the hydrologic simulations. This paper concludes with a discussion of the lessons learned and the potential for future work.
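
Statistical downscaling (step 2 above) comes in many flavors. As an illustration only, and not necessarily the method used in the study, the sketch below shows the simplest 'delta change' variant: shift an observed daily series by the GCM-projected change for its calendar month. All names and values are invented.

```python
import numpy as np

def delta_change(obs_daily, months, gcm_hist, gcm_future):
    """Delta-change downscaling sketch: add the GCM-projected change for
    each calendar month to an observed daily series.

    obs_daily : 1-D array of observed daily values
    months    : 1-D int array, calendar month index (0-11) of each day
    gcm_hist, gcm_future : length-12 arrays of GCM monthly means
    """
    delta = np.asarray(gcm_future) - np.asarray(gcm_hist)
    return np.asarray(obs_daily) + delta[np.asarray(months)]

# Toy example: two January days and one July day; the GCM warms July by 3 C
obs = np.array([5.0, 6.0, 22.0])
mon = np.array([0, 0, 6])
hist = np.zeros(12)
future = np.zeros(12)
future[6] = 3.0
scen = delta_change(obs, mon, hist, future)
```

The downscaled series `scen` would then be written into the watershed model's forcing file, one scenario at a time.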

  10. Disordered crystals from first principles I: Quantifying the configuration space

    Science.gov (United States)

    Kühne, Thomas D.; Prodan, Emil

    2018-04-01

    This work represents the first chapter of a project on the foundations of first-principle calculations of the electron transport in crystals at finite temperatures. We are interested in the range of temperatures where most electronic components operate, that is, room temperature and above. The aim is a predictive first-principle formalism that combines ab-initio molecular dynamics and a finite-temperature Kubo formula for homogeneous thermodynamic phases. The input for this formula is the ergodic dynamical system (Ω, G, dP) defining the thermodynamic crystalline phase, where Ω is the configuration space for the atomic degrees of freedom, G is the space group acting on Ω, and dP is the ergodic Gibbs measure relative to the G-action. The present work develops an algorithmic method for quantifying (Ω, G, dP) from first principles. Using the silicon crystal as a working example, we find the Gibbs measure to be extremely well characterized by a multivariate normal distribution, which can be quantified using a small number of parameters. The latter are computed at various temperatures and communicated in the form of a table. Using this table, one can generate large and accurate thermally-disordered atomic configurations to serve, for example, as input for subsequent simulations of the electronic degrees of freedom.
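
The generation step described at the end of the abstract, drawing thermally-disordered configurations from a fitted multivariate normal, can be sketched in a few lines. The atom count, equilibrium positions, and covariance below are placeholders, not the silicon parameters tabulated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: mean (equilibrium) positions and a covariance
# matrix for the thermal displacements of N atoms (3N coordinates)
n_atoms = 4
mean_positions = np.zeros(3 * n_atoms)   # equilibrium lattice sites
cov = 0.01 * np.eye(3 * n_atoms)         # isotropic thermal spread (toy value)

# Draw disordered configurations as samples of the multivariate normal
configs = rng.multivariate_normal(mean_positions, cov, size=1000)
configs = configs.reshape(1000, n_atoms, 3)

# Sanity check: the sample variance should approach the prescribed 0.01
var = configs.reshape(1000, -1).var(axis=0).mean()
```

In the paper's setting the covariance would come from the tabulated temperature-dependent parameters, and each sampled configuration would feed a subsequent electronic-structure calculation.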

  11. Twenty-first century quantum mechanics Hilbert space to quantum computers mathematical methods and conceptual foundations

    CERN Document Server

    Fano, Guido

    2017-01-01

    This book is designed to make accessible to nonspecialists the still evolving concepts of quantum mechanics and the terminology in which these are expressed. The opening chapters summarize elementary concepts of twentieth century quantum mechanics and describe the mathematical methods employed in the field, with clear explanation of, for example, Hilbert space, complex variables, complex vector spaces and Dirac notation, and the Heisenberg uncertainty principle. After detailed discussion of the Schrödinger equation, subsequent chapters focus on isotropic vectors, used to construct spinors, and on conceptual problems associated with measurement, superposition, and decoherence in quantum systems. Here, due attention is paid to Bell’s inequality and the possible existence of hidden variables. Finally, progression toward quantum computation is examined in detail: if quantum computers can be made practicable, enormous enhancements in computing power, artificial intelligence, and secure communication will result...

  12. Essential Soft Skills for Success in the Twenty-First Century Workforce as Perceived by Business Educators

    Science.gov (United States)

    Mitchell, Geana W.; Skinner, Leane B.; White, Bonnie J.

    2010-01-01

    Background: Soft skills describe career attributes that individuals should possess, such as team skills, communication skills, ethics, time-management skills, and an appreciation for diversity. In the twenty-first century workforce, soft skills are important in every business sector. However, employers in business continuously report that new…

  13. Twenty-First Century Instructional Classroom Practices and Reading Motivation: Probing the Effectiveness of Interventional Reading Programs

    Science.gov (United States)

    Boulhrir, Taoufik

    2017-01-01

    Twenty-first century education has undoubtedly witnessed changes of the definition of literacy to cope with the economic, social, and intellectual trends. Technological advances, which include skills of communication, creativity, critical thinking, and collaboration have become key in education, especially when dealing with literacy and reading…

  14. The Space Nuclear Thermal Propulsion Program: Propulsion for the twenty first century

    International Nuclear Information System (INIS)

    Bleeker, G.; Moody, J.; Kesaree, M.

    1993-01-01

    As mission requirements approach the limits of chemical propulsion systems, new engines must be investigated that can meet the advanced mission requirements of higher payload fractions, higher velocities, and consequently higher specific impulses (Isp). The propulsion system that can meet these high demands is a nuclear thermal rocket engine. This engine generates thrust by expanding hydrogen, heated by the energy released in the reactor's fission process, through a nozzle. The Department of Defense (DoD), however, initiated a new nuclear rocket development program in 1987 for ballistic missile defense applications. The Space Nuclear Thermal Propulsion (SNTP) Program, which seeks to improve on the technology of ROVER/NERVA, grew out of this beginning and has been managed by the Air Force, with the involvement of DoE and NASA. The goal of the SNTP Program is to develop an engine to meet potential Air Force requirements for upper-stage engines, bimodal propulsion/power applications, and orbital transfer vehicles, as well as NASA requirements for possible missions to the Moon and Mars. During the entire life of the program, the DoD has considered safety to be of paramount importance, and is following all national environmental policies

  15. 25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection

    Science.gov (United States)

    Packard, Edward

    2008-01-01

    Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5kw, 20k, Helium Refrigerators; Affect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated

  16. Preparing Teacher-Students for Twenty-First-Century Learning Practices (PREP 21): A Framework for Enhancing Collaborative Problem-Solving and Strategic Learning Skills

    Science.gov (United States)

    Häkkinen, Päivi; Järvelä, Sanna; Mäkitalo-Siegl, Kati; Ahonen, Arto; Näykki, Piia; Valtonen, Teemu

    2017-01-01

    With regard to the growing interest in developing teacher education to match the twenty-first-century skills, while many assumptions have been made, there has been less theoretical elaboration and empirical research on this topic. The aim of this article is to present our pedagogical framework for the twenty-first-century learning practices in…

  17. Monte Carlo simulation of a medical linear accelerator for generation of phase spaces

    International Nuclear Information System (INIS)

    Oliveira, Alex C.H.; Santana, Marcelo G.; Lima, Fernando R.A.; Vieira, Jose W.

    2013-01-01

    Radiotherapy uses various techniques and equipment for local treatment of cancer. The equipment most often used in radiotherapy for patient irradiation is the linear accelerator (Linac), which produces beams of X-rays in the range 5-30 MeV. Among the many algorithms developed over recent years for the evaluation of dose distributions in radiotherapy planning, those based on Monte Carlo (MC) methods have proven very promising in terms of accuracy, providing more realistic results. MC methods allow simulating the transport of ionizing radiation in complex configurations, such as detectors, Linacs and phantoms. MC simulations for applications in radiotherapy are divided into two parts. In the first, the production of the radiation beam by the Linac is simulated and the phase space is generated. The phase space contains information such as the energy, position and direction of millions of particles (photons, electrons, positrons). In the second part, the transport of particles (sampled from the phase space) in a given configuration of the irradiation field is simulated to assess the dose distribution in the patient (or phantom). The objective of this work is to create a computational model of a 6 MeV Linac using the MC code Geant4 for the generation of phase spaces. From the phase space, information was obtained to assess beam quality (photon and electron spectra and the two-dimensional distribution of energy) and to analyze the physical processes involved in producing the beam. (author)
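The two-stage structure described above (first generate a phase space at a scoring plane, then resample it as the particle source for the second transport stage) can be sketched in Python. The record fields and the toy falling spectrum below are illustrative assumptions for the sketch, not the Geant4 model of the paper:

```python
import random

random.seed(1)

# Stage 1 (toy stand-in for the Linac-head simulation): generate a phase space
# as a list of particle records (type, energy, position, direction).
def generate_phase_space(n_histories, e_max_mev=6.0):
    records = []
    for _ in range(n_histories):
        # Crude bremsstrahlung-like spectrum: density highest at low energies.
        energy = e_max_mev * random.random() ** 2
        x, y = random.gauss(0.0, 0.1), random.gauss(0.0, 0.1)  # cm, scoring plane
        theta = abs(random.gauss(0.0, 0.05))                   # rad, forward-peaked
        records.append({"type": "photon", "E": energy, "pos": (x, y), "theta": theta})
    return records

# Stage 2: resample the stored phase space as the source term for the
# patient/phantom transport calculation.
def sample_source(records, n):
    return [random.choice(records) for _ in range(n)]

phase_space = generate_phase_space(10000)
mean_e = sum(r["E"] for r in phase_space) / len(phase_space)
```

The point of the split is reuse: the expensive accelerator-head simulation is run once, and many treatment-field calculations can then draw particles from the stored records.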

  18. Index to the Twenty-first Semiannual Report of the Commission to the Congress. July 1956 - December 1956

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    This volume contains a name and subject index for the twenty-first semiannual report of the United States Atomic Energy Commission to Congress. The full semiannual report covers the major unclassified activities of the Commission from July 1956 through December 1956.

  19. Public Health in Colonial and Post-Colonial Ghana: Lesson-Drawing for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Adu-Gyamfi, Samuel

    2017-06-01

    Public health in twenty-first century Ghana is mired in several issues, ranging from the inadequacy of public health facilities, improper settlement planning and insanitary conditions to the inadequacy of laws and their implementation. This situation stands in direct contrast to the colonial era. Developments from the pre-colonial to the colonial era made the prevention of diseases a priority of the colonial administration, beginning with the establishment of the health branch in 1909 as a response to the bubonic plague that was fast spreading in the colony. From then on, public health policies and strategies were enacted to support disease prevention. Various public health boards, the medical research institute or laboratory branch, the waste management department, the use of preventive medicine and the maintenance of good settlement planning and sanitation were public health measures of the colonial era. This research analyses the public health system of the colonial era in order to draw basic lessons for twenty-first century Ghana. Archival data and other secondary sources are reviewed and analysed to help draw these lessons. Richard Rose's lesson-drawing approach was used to draw the lessons.

  20. Space headache on Earth: head-down-tilted bed rest studies simulating outer-space microgravity.

    Science.gov (United States)

    van Oosterhout, W P J; Terwindt, G M; Vein, A A; Ferrari, M D

    2015-04-01

    Headache is a common symptom during space travel, both in isolation and as part of space motion syndrome. Head-down-tilted bed rest (HDTBR) studies are used to simulate outer-space microgravity on Earth, and allow countermeasure interventions such as artificial gravity and training protocols, aimed at restoring microgravity-induced physiological changes. The objectives of this article are to assess headache incidence and characteristics during HDTBR, and to evaluate the effects of countermeasures. In a randomized cross-over design by the European Space Agency (ESA), 22 healthy male subjects, without a history of primary headache, underwent three periods of -6-degree HDTBR. In two of these periods countermeasure protocols were added, with either centrifugation or aerobic exercise training. Headache occurrence and characteristics were assessed daily using a specially designed questionnaire. In total, 14/22 (63.6%) subjects reported a headache during ≥1 of the three HDTBR periods; in 12/14 (85.7%) the headache was non-specific and in 2/14 (14.3%) migraine. The occurrence of headache did not differ between HDTBR with and without countermeasures (12/22 (54.5%) vs. 8/22 (36.4%) subjects, p = 0.20; 13/109 (11.9%) vs. 36/213 (16.9%) headache days, p = 0.24). During countermeasures, however, headaches were more often mild (p = 0.03) and had fewer associated symptoms (p = 0.008). Simulated microgravity during HDTBR induces headache episodes, mostly on the first day. Countermeasures are useful in reducing headache severity and associated symptoms. A reversible, microgravity-induced cephalic fluid shift may cause headache, also on Earth. HDTBR can be used to study space headache on Earth.

  1. Evaluation of spontaneous space closure and development of permanent dentition after extraction of hypomineralized permanent first molars.

    Science.gov (United States)

    Jälevik, Birgitta; Möller, Marie

    2007-09-01

    The aim of this study was to evaluate spontaneous space closure, development of the permanent dentition, and the need for orthodontic treatment after extraction of permanent first molars due to severe molar-incisor hypomineralization (MIH). Twenty-seven children aged 5.6-12.7 (median 8.2) years had one to four permanent first molars extracted due to severe MIH. Each case was followed up on individual indications 3.8-8.3 (median 5.7) years after the extractions. The eruption of the permanent dentition and space closure were documented by orthopantomograms, casts, photographs, and/or bitewings. Fifteen children were judged to have a favourable spontaneous development of their permanent dentition without any orthodontic intervention. Seven children were or would be subjected to orthodontic treatment for other reasons registered prior to the extraction. Five children were judged to need treatment caused, at least in part, by the extractions, but three of them abstained because of no subjective treatment need. Extraction of permanent first molars severely affected by MIH is a good treatment alternative. Favourable spontaneous space closure and development of the permanent dentition can be expected without any intervention in the majority of cases in which extraction occurs prior to the eruption of the second molar.

  2. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century"

    Science.gov (United States)

    Brandt, Steffen

    2010-01-01

    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose the application of a test design--the duplex design (which was proposed in 1988 by Bock and Mislevy) for application in current accountability assessments.…

  3. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory

    Science.gov (United States)

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.

    2017-01-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  4. Sub-Saharan Northern African climate at the end of the twenty-first century: forcing factors and climate change processes

    Energy Technology Data Exchange (ETDEWEB)

    Patricola, C.M. [Cornell University, Department of Earth and Atmospheric Sciences, Ithaca, NY (United States); Texas A and M University, Department of Atmospheric Sciences, College Station, TX (United States); Cook, K.H. [The University of Texas at Austin, Department of Geological Sciences, Jackson School of Geosciences, Austin, TX (United States)

    2011-09-15

    A regional climate model, the Weather Research and Forecasting (WRF) Model, is forced with increased atmospheric CO{sub 2} and anomalous SSTs and lateral boundary conditions derived from nine coupled atmosphere-ocean general circulation models to produce an ensemble set of nine future climate simulations for northern Africa at the end of the twenty-first century. A well validated control simulation, agreement among ensemble members, and a physical understanding of the future climate change enhance confidence in the predictions. The regional model ensembles produce consistent precipitation projections over much of northern tropical Africa. A moisture budget analysis is used to identify the circulation changes that support future precipitation anomalies. The projected midsummer drought over the Guinean Coast region is related partly to weakened monsoon flow. Since the rainfall maximum demonstrates a southward bias in the control simulation in July-August, this may be indicative of future summer drying over the Sahel. Wetter conditions in late summer over the Sahel are associated with enhanced moisture transport by the West African westerly jet, a strengthening of the jet itself, and moisture transport from the Mediterranean. Severe drought in East Africa during August and September is accompanied by a weakened Indian monsoon and Somali jet. Simulations with projected and idealized SST forcing suggest that overall SST warming in part supports this regional model ensemble agreement, although changes in SST gradients are important over West Africa in spring and fall. Simulations which isolate the role of individual climate forcings suggest that the spatial distribution of the rainfall predictions is controlled by the anomalous SST and lateral boundary conditions, while CO{sub 2} forcing within the regional model domain plays an important secondary role and generally produces wetter conditions. (orig.)

  5. Can crawl space temperature and moisture conditions be calculated with a whole-building hygrothermal simulation tool?

    DEFF Research Database (Denmark)

    Vanhoutteghem, Lies; Morelli, Martin; Sørensen, Lars Schiøtt

    2017-01-01

    The hygrothermal behaviour of an outdoor ventilated crawl space with two different designs of the floor structure was investigated. The first design had 250 mm insulation and visible wooden beams towards the crawl space. The second design had 300 mm insulation and no visible wooden beams. One year of measurements was compared with simulations of temperature and moisture conditions in the floor structure and crawl space. The measurements showed that the extra 50 mm insulation placed below the beams reduced moisture content in the beams below 20 weight% all year. A reasonable agreement between the measurements and simulations was found; however, the evaporation from the soil was a dominant parameter affecting the hygrothermal response in the crawl space and floor structure.

  6. Historical Approach to the Role of Women in the Legislation of Iran: A Case Study on the Twenty-First Parliament

    Directory of Open Access Journals (Sweden)

    Sarah Sheibani

    2017-01-01

    One hundred and ten years ago, men and women took up constitutionalism to achieve justice in Iran. The National Council was the result of the Iranian people's struggle for justice, both women and men. From the beginning of legislation, men's policies classed women with minors, lunatics and bankrupts, and barred them from voting. However, the Constitutional Revolution, as a turning point and a national revolution, played a key role in changing attitudes toward women and provided a structural context for their participation. In this paper, using descriptive-analytical as well as quantitative methods, we sought to answer the question of what the position of women was in the twenty-first Parliament. The results of this study suggest that once Iranian women were allowed to participate in politics, they demonstrated their ability in it, as seen in the twenty-first Parliament, in which women had twenty-two percent participation.

  7. Nonlinear Pedagogy and Its Role in Encouraging Twenty-First Century Competencies through Physical Education: A Singapore Experience

    Science.gov (United States)

    Lee, Miriam Chang Yi; Chow, Jia Yi; Button, Chris; Tan, Clara Wee Keat

    2017-01-01

    Nonlinear Pedagogy is an exploratory approach to teaching and learning Physical Education that can be potentially effective to help children acquire relevant twenty-first century competencies. Underpinned by Ecological Dynamics, the focus of Nonlinear Pedagogy is on the learner and includes the provision of less prescriptive instructions and…

  8. Rethinking Teaching and Learning Pedagogy for Education in the Twenty-First Century: Blended Learning in Music Education

    Science.gov (United States)

    Crawford, Renée

    2017-01-01

    In an increasingly technologically driven world, there is proliferate discussion among education and government authorities about the necessity to rethink education in the twenty-first century. The evolution of technology and its pervasive influence on the needs and requirements of society is central to this mindset. Innovations in online…

  9. Monte Carlo simulation of continuous-space crystal growth

    International Nuclear Information System (INIS)

    Dodson, B.W.; Taylor, P.A.

    1986-01-01

    We describe a method, based on Monte Carlo techniques, of simulating the atomic growth of crystals without the discrete lattice space assumed by conventional Monte Carlo growth simulations. Since no lattice space is assumed, problems involving epitaxial growth, heteroepitaxy, phonon-driven mechanisms, surface reconstruction, and many other phenomena incompatible with the lattice-space approximation can be studied. Also, use of the Monte Carlo method circumvents to some extent the extreme limitations on simulated timescale inherent in crystal-growth techniques which might be proposed using molecular dynamics. The implementation of the new method is illustrated by studying the growth of strained-layer superlattice (SLS) interfaces in two-dimensional Lennard-Jones atomic systems. Despite the extreme simplicity of such systems, the qualitative features of SLS growth seen here are similar to those observed experimentally in real semiconductor systems
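A minimal illustration of lattice-free Metropolis Monte Carlo for a 2D Lennard-Jones cluster, in the spirit of the continuous-space method described above. The cluster size, step length and temperature below are arbitrary choices for the sketch, not the parameters of the original study:

```python
import math
import random

random.seed(0)

def lj(r2):
    # Lennard-Jones pair energy as a function of squared distance (sigma = eps = 1);
    # the minimum value -1 occurs at r = 2**(1/6), i.e. r2 = 2**(1/3).
    inv6 = 1.0 / r2 ** 3
    return 4.0 * (inv6 * inv6 - inv6)

def total_energy(pos):
    e = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            e += lj(dx * dx + dy * dy)
    return e

def metropolis_step(pos, beta, delta=0.05):
    # Displace one atom by a continuous random amount -- no lattice sites assumed.
    i = random.randrange(len(pos))
    old = pos[i]
    e_old = total_energy(pos)
    pos[i] = (old[0] + random.uniform(-delta, delta),
              old[1] + random.uniform(-delta, delta))
    e_new = total_energy(pos)
    # Metropolis acceptance: always accept downhill, uphill with Boltzmann weight.
    if e_new > e_old and random.random() >= math.exp(-beta * (e_new - e_old)):
        pos[i] = old  # reject: restore the old position
    return pos

# Start from a slightly perturbed 3x3 square cluster and relax it.
positions = [(x + random.uniform(-0.05, 0.05), y + random.uniform(-0.05, 0.05))
             for x in range(3) for y in range(3)]
e_start = total_energy(positions)
for _ in range(500):
    metropolis_step(positions, beta=5.0)
e_end = total_energy(positions)
```

Because atoms move in continuous coordinates, the same loop can in principle capture relaxation toward off-lattice equilibrium spacings (e.g. strained interfaces), which a fixed-lattice Monte Carlo model cannot represent.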

  10. Navigation simulator for the Space Tug vehicle

    Science.gov (United States)

    Colburn, B. K.; Boland, J. S., III; Peters, E. G.

    1977-01-01

    A general simulation program (GSP) for state estimation of a nonlinear space vehicle flight navigation system is developed and used as a basis for evaluating the performance of a Space Tug navigation system. An explanation of the iterative guidance mode (IGM) guidance law, derivation of the dynamics, coordinate frames and state estimation routines are given in order to clarify the assumptions and approximations made. A number of simulation and analytical studies are used to demonstrate the operation of the Tug system. Included in the simulation studies are (1) initial offset vector parameter study; (2) propagation time vs accuracy; (3) measurement noise parametric study and (4) reduction in computational burden of an on-board implementable scheme. From the results of these studies, conclusions and recommendations concerning future areas of practical and theoretical work are presented.

  11. Niels Bohr and the philosophy of physics twenty-first century perspectives

    CERN Document Server

    Folse, Henry

    2017-01-01

    Niels Bohr and Philosophy of Physics: Twenty-First Century Perspectives examines the philosophical views, influences and legacy of the Nobel Prize physicist and philosophical spokesman of the quantum revolution, Niels Bohr. The sixteen contributions in this collection by some of the best contemporary philosophers and physicists writing on Bohr's philosophy today all carefully distinguish his subtle and unique interpretation of quantum mechanics from views often imputed to him under the banner of the “Copenhagen Interpretation.” With respect to philosophical influences on Bohr's outlook, the contributors analyse prominent similarities between his viewpoint and Kantian ways of thinking, the views of the Danish philosopher Harald Høffding, and themes characteristic of American pragmatism. In recognizing the importance of Bohr's epistemological naturalism they examine his defence of the indispensability of classical concepts from a variety of different perspectives. This collection shows us that Bohr's int...

  12. A history of meniscal surgery: from ancient times to the twenty-first century.

    Science.gov (United States)

    Di Matteo, B; Moran, C J; Tarabella, V; Viganò, A; Tomba, P; Marcacci, M; Verdonk, R

    2016-05-01

    The science and surgery of the meniscus have evolved significantly over time. Surgeons and scientists always enjoy looking forward to novel therapies. However, as part of the ongoing effort at optimizing interventions and outcomes, it may also be useful to reflect on important milestones from the past. The aim of the present manuscript was to explore the history of meniscal surgery across the ages, from ancient times to the twenty-first century. Herein, some of the investigations of the pioneers in orthopaedics are described, to underline how their work has influenced the management of the injured meniscus in modern times. Level of evidence V.

  13. A needs assessment for DOE's packaging and transportation activities - a look into the twenty-first century

    International Nuclear Information System (INIS)

    Pope, R.; Turi, G.; Brancato, R.; Blalock, L.; Merrill, O.

    1995-01-01

    The U.S. Department of Energy (DOE) has performed a department-wide scoping of its packaging and transportation needs and has arrived at a projection of these needs for well into the twenty-first century. The assessment, known as the Transportation Needs Assessment (TNA) was initiated during August 1994 and completed in December 1994. The TNA will allow DOE to better prepare for changes in its transportation requirements in the future. The TNA focused on projected, quantified shipping needs based on forecasts of inventories of materials which will ultimately require transport by the DOE for storage, treatment and/or disposal. In addition, experts provided input on the growing needs throughout DOE resulting from changes in regulations, in DOE's mission, and in the sociopolitical structure of the United States. Through the assessment, DOE's transportation needs have been identified for a time period extending from the present through the first three decades of the twenty-first century. The needs assessment was accomplished in three phases: (1) defining current packaging, shipping, resource utilization, and methods of managing packaging and transportation activities; (2) establishing the inventory of materials which DOE will need to transport on into the next century and scenarios which project when, from where, and to where these materials will need to be transported; and (3) developing requirements and projected changes for DOE to accomplish the necessary transport safely and economically

  14. New and newer [The New Physics for the Twenty-First Century]

    Energy Technology Data Exchange (ETDEWEB)

    Clark, C. [Electron and Optical Physics Division, National Institute of Standards and Technology, MD (United States)]. E-mail: clark@mail.nist.gov

    2006-09-15

    Stephen Hawking's inaugural lecture as Lucasian Professor of Mathematics at Cambridge University in 1980 caused quite a stir. Its title - 'Is the end in sight for theoretical physics?' - raised the prospect of a unified 'theory of everything'. Hawking suggested that there was a good chance of resolving the remaining inconsistencies between the two big 'theories of something' - quantum mechanics and general relativity - before the turn of the century. My first impression on reading The New Physics for the Twenty-First Century, a collection of essays edited by science journalist Gordon Fraser, is that a theory of everything may still be attainable by the turn of the century. However, there is now 20 times more of everything in the universe than there was in the past century, 95% of which no-one has ever actually seen, or had even heard of until a few years ago - as summarized in articles by Wendy Freedman, Edward Kolb and Ronald Adler. Despite this, Michael Green describes amazing developments in string theory that could tie everything together, if one could just figure out which, if any, of the apparently infinite varieties of string theory applies to our world, and why. (U.K.)

  15. China Accomplished Its First Space Rendezvous and Docking

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoli

    2011-01-01

    At 1:36 am on November 3, China's Shenzhou 8 unmanned spaceship and the Tiangong 1 space lab accomplished the country's first space docking and coupling, at more than 343 km above Earth's surface, marking a great leap in China's space program.

  16. Proceedings of the twenty-first symposium of atomic energy research on WWER physics and reactor safety

    International Nuclear Information System (INIS)

    Vidovszky, I.

    2011-10-01

    The present volume contains 61 papers presented at the twenty-first symposium of atomic energy research, held in Dresden, Germany, 19-23 September 2011. The papers are presented in their original form, i.e. no corrections or modifications were carried out. The content of this volume is divided into thematic groups: Improvement, extension and validation of parameterized few-group libraries for WWER-440 and WWER-1000.

  17. Virtual simulation. First clinical results in patients with prostate cancer

    International Nuclear Information System (INIS)

    Buchali, A.; Dinges, S.; Koswig, S.; Rosenthal, P.; Salk, S.; Harder, C.; Schlenger, L.; Budach, V.

    1998-01-01

    Investigation of the options of virtual simulation in patients with localized prostate cancer. Twenty-four patients suffering from prostate cancer were virtually simulated. The clinical target volume was contoured and the planning target volume was defined after the CT scan. The isocenter of the planning target volume was determined and marked on the patient's skin. The precision of the patient marking was controlled with conventional simulation after physical radiation treatment planning. Mean differences of the patient's mark between the two simulations were around 1 mm in all room axes. The organs at risk were visualized in the digitally reconstructed radiographs. The precise marking of the isocenter by virtual simulation allows the conventional simulation to be skipped. The visualization of the organs at risk makes the application of contrast medium unnecessary and further relieves the patient. The personnel requirement is not higher in virtual simulation than in conventional CT-based radiation treatment planning. (orig./MG) [de

  18. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    Science.gov (United States)

    Li, Zuqun

    2011-01-01

    Modeling and simulation plays a very important role in mission design. It not only reduces design cost, but also prepares astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team from NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing attributes, and packing and unpacking fixed record data. The dynamics model of the lunar shuttle had three degrees of freedom, and state propagation obeyed the laws of two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descent orbit in 2D space, and then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would start descending a second after joining the execution. VPN software from SonicWall was used to connect the federates with the RTI during testing.
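Two-body state propagation of the kind the shuttle federate relies on can be sketched with a fixed-step RK4 integrator in the orbital plane. The lunar gravitational parameter, step size, and the 100 km circular orbit below are assumed illustrative values, not details taken from the federate itself:

```python
import math

MU = 4902.8  # km^3/s^2, lunar gravitational parameter (assumed value)

def deriv(state):
    # State is (x, y, vx, vy); returns its time derivative under two-body gravity.
    x, y, vx, vy = state
    r = math.hypot(x, y)
    a = -MU / r ** 3
    return (vx, vy, a * x, a * y)

def rk4_step(state, dt):
    # Classical fourth-order Runge-Kutta step.
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Circular lunar orbit at roughly 100 km altitude (R_moon ~ 1737 km).
r0 = 1837.0                       # km
v0 = math.sqrt(MU / r0)           # circular orbital speed, km/s
state = (r0, 0.0, 0.0, v0)
for _ in range(1000):             # propagate 1000 s in 1 s steps
    state = rk4_step(state, 1.0)
radius = math.hypot(state[0], state[1])
```

For a circular orbit the radius and speed should stay essentially constant, which makes a convenient sanity check on the integrator before flying descent trajectories.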

  19. Report of the twenty-first session, London, 18-22 February 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The Joint Group of Experts on the Scientific Aspects of Marine Pollution (GESAMP) held its twenty-first session at the Headquarters of the International Maritime Organization (IMO), London, from 18 to 22 February 1991. Marine pollution is primarily linked to coastal development. The most serious problems are those associated with inadequately controlled coastal development and intensive human settlement of the coastal zone. GESAMP emphasizes the importance of the following problems and issues: State of the marine environment; comprehensive framework for the assessment and regulation of waste disposal in the marine environment; information on preparations for the United Nations Conference on Environment and Development; review of potentially harmful substances: 1. Carcinogenic substances. 2. Mutagenic substances. 3. Teratogenic substances. 4. Organochlorine compounds. 5. Oil, and other hydrocarbons including used lubricating oils, oil spill dispersants and chemicals used in offshore oil exploration and exploitation; environmental impacts of coastal aquaculture; global change and the air/sea exchange of chemicals; future work programme

  20. Golf science research at the beginning of the twenty-first century.

    Science.gov (United States)

    Farrally, M R; Cochran, A J; Crews, D J; Hurdzan, M J; Price, R J; Snow, J T; Thomas, P R

    2003-09-01

    At the beginning of the twenty-first century, there are 30,000 golf courses and 55 million people who play golf worldwide. In the USA alone, the value of golf club memberships sold in the 1990s was US$3.2 billion. Underpinning this significant human activity is a wide variety of people researching and applying science to sustain and develop the game. The 11 golf science disciplines recognized by the World Scientific Congress of Golf have reported 311 papers at four world congresses since 1990. Additionally, scientific papers have been published in discipline-specific peer-reviewed journals, research has been sponsored by the two governing bodies of golf, the Royal and Ancient Golf Club of St. Andrews and the United States Golf Association, and confidential research is undertaken by commercial companies, especially equipment manufacturers. This paper reviews much of this human endeavour and points the way forward for future research into golf.

  1. Twenty-first century learning in states: the case of the Massachusetts educational system.

    Science.gov (United States)

    Driscoll, David P

    2006-01-01

    A current crisis in education is leaving students less prepared to succeed in the working world than any generation before them. Increasingly complex external, nonacademic pressures have an impact on many of today's students, often causing them to drop out of school. Only 76 percent of Massachusetts high school students graduate, and only 29 percent earn a college degree. National figures are worse. Most educational institutions share a common goal to support students in becoming skilled, productive, successful members of society, but the author argues that this goal is not being met. Despite the constant changes in the world, educational practices have remained static. Most public schools are not adapting to meet the shifting needs of students. Universities are not able to prepare the right mix of prospective employees for the demands of the job market; for example, schools are graduating only 10 percent of the needed engineers. Institutions of higher learning cannot keep up with employers' needs in an evolving global market: strong math, science, and writing abilities; critical thinking skills; and the ability to work in teams. The author draws on exemplary efforts at work in his home state of Massachusetts--whose improvements in student achievement outcomes have been some of the best in the nation--to suggest there is promise in twenty-first century learning. Middle school students involved in a NASA-funded project write proposals, work in teams, and engage in peer review. Older students participate in enhanced, hands-on cooperative school-to-work and after-school programs. Schools are starting to offer expanded day learning, increasing the number of hours they are engaged in formal learning. Yet such programs have not reached significant levels of scale. The author calls for a major shift in education to help today's students be successful in the twenty-first century.

  2. From School to Cafe and Back Again: Responding to the Learning Demands of the Twenty-First Century

    Science.gov (United States)

    McWilliam, Erica

    2011-01-01

    This paper traces the historical origins of formal and informal lifelong learning to argue that optimal twenty-first-century education can and should draw on the traditions of both the school and the coffee house or cafe. For some time now, educational policy documents and glossy school brochures have come wrapped in the mantle of lifelong…

  3. A Simulation and Modeling Framework for Space Situational Awareness

    International Nuclear Information System (INIS)

    Olivier, S.S.

    2008-01-01

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated
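As one illustration of the kind of physics-based component such a framework chains together, the orbital propagation step can be sketched as a simple two-body integrator. This is a deliberate simplification for illustration only; a real SSA propagator would add J2 and higher-order gravity terms, atmospheric drag, and solar radiation pressure.

```python
import numpy as np

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body_accel(r):
    """Point-mass gravitational acceleration (km/s^2) at position r (km)."""
    return -MU_EARTH * r / np.linalg.norm(r) ** 3

def rk4_step(r, v, dt):
    """Advance position/velocity one step with classical RK4."""
    def deriv(y):
        return np.concatenate([y[3:], two_body_accel(y[:3])])
    y = np.concatenate([r, v])
    k1 = deriv(y)
    k2 = deriv(y + 0.5 * dt * k1)
    k3 = deriv(y + 0.5 * dt * k2)
    k4 = deriv(y + dt * k3)
    y = y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y[:3], y[3:]

# Propagate a circular ~400 km LEO orbit for one revolution, 1 s steps
a = 6778.0                                   # orbit radius, km
r = np.array([a, 0.0, 0.0])
v = np.array([0.0, np.sqrt(MU_EARTH / a), 0.0])
period = 2.0 * np.pi * np.sqrt(a**3 / MU_EARTH)
for _ in range(int(period)):                 # dt = 1 s
    r, v = rk4_step(r, v, 1.0)
```

With 1 s steps the radius of the circular orbit is conserved to well under a kilometre over one revolution, which is the sort of baseline accuracy check such frameworks run before adding perturbation models.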

  4. Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach

    Science.gov (United States)

    Nelson, Bron; Yang Yang, Fan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Henze, Chris; Karacalioglu, Arif Goktug; O'Toole, Conor; Swenson, Jason; Stupl, Jan

    2015-01-01

    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time-steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space
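The close-encounter (conjunction) screening the abstract alludes to can be sketched, under a strong simplifying assumption (straight-line relative motion over a short encounter window), as a time-of-closest-approach computation. The function name and the example states are invented for illustration.

```python
import numpy as np

def closest_approach(r1, v1, r2, v2, window=60.0):
    """Time of closest approach (s, clamped to [0, window]) and miss
    distance (km) for two objects, assuming straight-line relative
    motion over the short encounter window."""
    dr = r2 - r1                      # relative position, km
    dv = v2 - v1                      # relative velocity, km/s
    # minimise |dr + t*dv|^2  =>  t* = -(dr . dv) / |dv|^2
    t = -np.dot(dr, dv) / np.dot(dv, dv)
    t = min(max(t, 0.0), window)      # clamp to the screening window
    return t, float(np.linalg.norm(dr + t * dv))

# Two objects on nearly head-on tracks, 75 km apart along-track
r1 = np.array([7000.0, 0.0, 0.0]);  v1 = np.array([0.0,  7.5, 0.0])
r2 = np.array([7000.0, 75.0, 1.0]); v2 = np.array([0.0, -7.5, 0.0])
tca, miss = closest_approach(r1, v1, r2, v2)  # tca = 5.0 s, miss = 1.0 km
```

Because the efficiency of an avoidance maneuver depends on geometry at this level of detail, a simulation with day-scale time steps cannot resolve it; this is exactly why the approach above propagates every object with time steps as small as one second.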

  5. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  6. Planetary and Space Simulation Facilities PSI at DLR for Astrobiology

    Science.gov (United States)

    Rabbow, E.; Rettberg, P.; Panitz, C.; Reitz, G.

    2008-09-01

Ground-based experiments, conducted in the controlled planetary and space environment simulation facilities PSI at DLR, are used to investigate astrobiological questions and to complement the corresponding experiments in LEO, for example on free-flying satellites or on space exposure platforms on the ISS. In-orbit exposure facilities can only accommodate a limited number of experiments for exposure to space parameters like high vacuum, intense radiation of galactic and solar origin, and microgravity, sometimes also technically adapted to simulate extraterrestrial planetary conditions like those on Mars. Ground-based experiments in carefully equipped and monitored simulation facilities allow the investigation of the effects of simulated single environmental parameters, and selected combinations of them, on a much wider variety of samples. In PSI at DLR, international science consortia have performed astrobiological investigations and space experiment preparations, exposing organic compounds and a wide range of organisms, from bacterial spores to complex microbial communities, lichens and even animals like tardigrades, to simulated planetary or space environment parameters in pursuit of exobiological questions on resistance to extreme environments and the origin and distribution of life. The Planetary and Space Simulation Facilities PSI of the Institute of Aerospace Medicine at DLR in Köln, Germany, provide high vacuum of controlled residual composition, ionizing radiation from an X-ray tube, polychromatic UV radiation in the range of 170-400 nm, VIS and IR or individual monochromatic UV wavelengths, and temperature regulation from -20°C to +80°C at the sample site, applied individually or in selected combinations in 9 modular facilities of varying sizes; the facilities are presented here together with selected experiments performed within them.

  7. Thinking Like Twenty-First Century Learners: An Exploration of Blog Use in a Skills-Based Counselor Education Course

    Science.gov (United States)

    Buono, Lisa L.

    2011-01-01

    Twenty-first century learners and millennial generation students have integrated technology into their personal lives; there is a growing expectation for technology to be integrated into their classroom experiences as well. Incorporating technology, including the use of blogs, into teaching and learning is receiving attention in the literature.…

  8. Transformative Pedagogy, Leadership and School Organisation for the Twenty-First-Century Knowledge-Based Economy: The Case of Singapore

    Science.gov (United States)

    Dimmock, Clive; Goh, Jonathan W. P.

    2011-01-01

    Singapore has a high performing school system; its students top international tests in maths and science. Yet while the Singapore government cherishes its world class "brand", it realises that in a globally competitive world, its schools need to prepare students for the twenty-first-century knowledge-based economy (KBE). Accordingly,…

  9. Causes and impacts of changes in the Arctic freshwater budget during the twentieth and twenty-first centuries in an AOGCM

    Energy Technology Data Exchange (ETDEWEB)

    Arzel, Olivier [University of New South Wales, Climate and Environmental Dynamics Laboratory, School of Mathematics and Statistics, Sydney, NSW (Australia); Fichefet, Thierry; Goosse, Hugues [Universite Catholique de Louvain, Institut d' Astronomie et de Geophysique G. Lemaitre, Louvain-la-Neuve (Belgium); Dufresne, Jean-Louis [Institut Pierre Simon Laplace UPMC/CNRS, Laboratoire de Meteorologie Dynamique, Paris (France)

    2008-01-15

The fourth version of the atmosphere-ocean general circulation model (AOGCM) developed at the Institut Pierre-Simon Laplace (IPSL-CM4) is used to investigate the mechanisms influencing the Arctic freshwater balance in response to anthropogenic greenhouse gas forcing. The freshwater influence on the interannual variability of deep winter oceanic convection in the Nordic Seas is also studied on the basis of correlation and regression analyses of detrended variables. The model shows that the Fram Strait outflow, which is an important source of freshwater for the northern North Atlantic, experiences a rapid and strong transition from a weak state toward a relatively strong state during 1990-2010. The authors propose that this climate shift is triggered by the retreat of sea ice in the Barents Sea during the late twentieth century. This sea ice reduction initiates a positive feedback in the atmosphere-sea ice-ocean system that alters both the atmospheric and oceanic circulations in the Greenland-Iceland-Norwegian (GIN)-Barents Seas sector. Around the year 2080, the model predicts a second transition threshold beyond which the Fram Strait outflow is restored toward its original weak value. The long-term freshening of the GIN Seas is invoked to explain this rapid transition. It is further found that the mechanisms of interannual change in deep mixing differ fundamentally between the twentieth and twenty-first centuries. This difference is caused by the dominant influence of freshwater over the twenty-first century. In the GIN Seas, the interannual changes in the liquid freshwater export out of the Arctic Ocean through Fram Strait combined with the interannual changes in the liquid freshwater import from the North Atlantic are shown to have a major influence in driving the interannual variability of the deep convection during the twenty-first century. South of Iceland, the other region of deep water renewal in the model, changes in freshwater import from the North Atlantic

  10. A Farewell to Innocence? African Youth and Violence in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Charles Ugochukwu Ukeje

    2012-12-01

Full Text Available This is a broad examination of the issue of youth violence in twenty-first-century Africa, looking at the context within which a youth culture of violence has evolved and attempting to understand the underlying discourses of hegemony and power that drive it. The article focuses specifically on youth violence as a political response to the dynamics of (dis)empowerment, exclusion, and economic crisis and uses (post)conflict states like Liberia, Sierra Leone, and Nigeria to explain not just the overall challenge of youth violence but also the nature of responses that it has elicited from established structures of authority. Youth violence is in many ways an expression of youth agency in the context of a social and economic system that provides little opportunity.

  11. Civil Rights Laws as Tools to Advance Health in the Twenty-First Century.

    Science.gov (United States)

    McGowan, Angela K; Lee, Mary M; Meneses, Cristina M; Perkins, Jane; Youdelman, Mara

    2016-01-01

    To improve health in the twenty-first century, to promote both access to and quality of health care services and delivery, and to address significant health disparities, legal and policy approaches, specifically those focused on civil rights, could be used more intentionally and strategically. This review describes how civil rights laws, and their implementation and enforcement, help to encourage health in the United States, and it provides examples for peers around the world. The review uses a broad lens to define health for both classes of individuals and their communities--places where people live, learn, work, and play. Suggestions are offered for improving health and equity broadly, especially within societal groups and marginalized populations. These recommendations include multisectorial approaches that focus on the social determinants of health.

  12. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    Science.gov (United States)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks
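The RF link characteristics that the astronautical analysis tools feed into the network simulator boil down, at their simplest, to a link budget. A minimal sketch of that calculation follows, using the standard free-space path loss formula; the function names and the GEO example numbers are illustrative, not taken from the paper.

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB for distance in km, frequency in GHz."""
    return 20.0 * math.log10(distance_km) + 20.0 * math.log10(freq_ghz) + 92.45

def received_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_km, freq_ghz):
    """Simple link budget: Pr = Pt + Gt + Gr - FSPL (all terms in dB)."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_km, freq_ghz)

# Illustrative GEO downlink: 35786 km slant range at 8 GHz
pr = received_power_dbm(40.0, 30.0, 45.0, 35786.0, 8.0)   # about -86.6 dBm
```

In an integrated toolchain, the orbital-dynamics tool supplies the time-varying distance (and antenna pointing), and the network simulator turns the resulting received power into per-link data rates and error rates.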

  13. Between vanguard and exclusion- young people of the twenty-first century

    Directory of Open Access Journals (Sweden)

    Agnieszka Gil

    2011-12-01

Full Text Available This study has been narrowed down to reveal a paradox: the vanguard of culture and civilization, the young people of the twenty-first century, is embroiled in a discourse of exclusion from economic, political and cultural life. In secondary school and high school programs we do not find specific references and studies, based primarily on students' needs, about the theory of popular culture and cultural education in the area of pop culture. The exclusion of mainstream culture from educational discourse is a schizophrenic paradox. As the political exclusion of twenty-first-century youth I count all the disparaging scientific discourse that overlooks young people's actual media and communication competence. Prosumers, the cognitariat, digital natives, Generation C: for the modern "Silicon Valley" economy their market power is already unstoppable. In other areas it remains to be considered whether excluding young people from cultural discourse will deprive us of the future teachers and translators of the next civilizational revolution in social reality...

  14. A Dialogue Worth Having: Vocational Competence, Career Identity and a Learning Environment for Twenty-First Century Success at Work

    NARCIS (Netherlands)

    Meijers, Frans; Lengelle, Reinekke; Winters, Annemie; Kuijpers, Marinka

    2018-01-01

    The cultivation of intrinsic motivation is key in the twenty first century, but most students in Dutch vocational education lack this quality. To foster intrinsic motivation, a strong career-learning environment is needed that enables students to develop career competencies and a career identity.

  15. James Van Allen The First Eight Billion Miles

    CERN Document Server

    Foerstner, Abigail

    2009-01-01

    Astrophysicist and space pioneer James Van Allen (1914-2006), for whom the Van Allen radiation belts were named, was among the principal scientific investigators for twenty-four space missions, including Explorer I in 1958, the first successful U.S. satellite; Mariner 2's 1962 flyby of Venus, the first successful mission to another planet; and the 1970's Pioneer 10 and Pioneer 11, missions that surveyed Jupiter and Saturn. Abigail Foerstner blends space science, drama, military agenda's, cold war politics, and the events of Van Allen's lengthy career to create the first biography of this highl

  16. Civil engineering at the crossroads in the twenty-first century.

    Science.gov (United States)

    Ramírez, Francisco; Seco, Andres

    2012-12-01

The twenty-first century presents a major challenge for civil engineering. The magnitude and future importance of some of the problems perceived by society are directly related to the field of the civil engineer, implying an inescapable burden of responsibility for a group whose technical soundness, rational approach and efficiency are highly valued and respected by the citizen. However, the substantial changes in society and in the way it perceives the problems that it considers important call for a thorough review of our structures, both professional and educational, so that our profession, with its undeniable historical prestige, may modernize certain approaches and attitudes in order to continue to be a reliable instrument in the service of society, giving priority from an ethical standpoint to its actions in pursuit of "the public good". It possesses important tools to facilitate this work (new technologies, the development of communications, the transmission of scientific thought…); but there is nevertheless a need for deep reflection on the very essence of civil engineering: what we want it to be in the future, and the ability and willingness to take the lead at a time when society needs disinterested messages, technically supported, reasonably presented and dispassionately transmitted.

  17. The Return of "Patrimonial Capitalism": A Review of Thomas Piketty's Capital in the Twenty-First Century

    OpenAIRE

    Branko Milanovic

    2014-01-01

    Capital in the Twenty-First Century by Thomas Piketty provides a unified theory of the functioning of the capitalist economy by linking theories of economic growth and functional and personal income distributions. It argues, based on the long-run historical data series, that the forces of economic divergence (including rising income inequality) tend to dominate in capitalism. It regards the twentieth century as an exception to this rule and proposes policies that would make capitalism sustain...

  18. 26th Space Simulation Conference Proceedings. Environmental Testing: The Path Forward

    Science.gov (United States)

    Packard, Edward A.

    2010-01-01

Topics covered include: A Multifunctional Space Environment Simulation Facility for Accelerated Spacecraft Materials Testing; Exposure of Spacecraft Surface Coatings in a Simulated GEO Radiation Environment; Gravity-Offloading System for Large-Displacement Ground Testing of Spacecraft Mechanisms; Microscopic Shutters Controlled by cRIO in Sounding Rocket; Application of a Physics-Based Stabilization Criterion to Flight System Thermal Testing; Upgrade of a Thermal Vacuum Chamber for 20 Kelvin Operations; A New Approach to Improve the Uniformity of Solar Simulator; A Perfect Space Simulation Storm; A Planetary Environmental Simulator/Test Facility; Collimation Mirror Segment Refurbishment inside ESA's Large Space; Space Simulation of the CBERS 3 and 4 Satellite Thermal Model in the New Brazilian 6x8m Thermal Vacuum Chamber; The Certification of Environmental Chambers for Testing Flight Hardware; Space Systems Environmental Test Facility Database (SSETFD), Website Development Status; Wallops Flight Facility: Current and Future Test Capabilities for Suborbital and Orbital Projects; Force Limited Vibration Testing of JWST NIRSpec Instrument Using Strain Gages; Investigation of Acoustic Field Uniformity in Direct Field Acoustic Testing; Recent Developments in Direct Field Acoustic Testing; Assembly, Integration and Test Centre in Malaysia: Integration between Building Construction Works and Equipment Installation; Complex Ground Support Equipment for Satellite Thermal Vacuum Test; Effect of Charging Electron Exposure on 1064nm Transmission through Bare Sapphire Optics and SiO2 over HfO2 AR-Coated Sapphire Optics; Environmental Testing Activities and Capabilities for Turkish Space Industry; Integrated Circuit Reliability Simulation in Space Environments; Micrometeoroid Impacts and Optical Scatter in Space Environment; Overcoming Unintended Consequences of Ambient Pressure Thermal Cycling Environmental Tests; Performance and Functionality Improvements to Next Generation

  19. Grounding by Attention Simulation in Peripersonal Space: Pupils Dilate to Pinch Grip But Not Big Size Nominal Classifier.

    Science.gov (United States)

    Lobben, Marit; Bochynska, Agata

    2018-03-01

    Grammatical categories represent implicit knowledge, and it is not known if such abstract linguistic knowledge can be continuously grounded in real-life experiences, nor is it known what types of mental states can be simulated. A former study showed that attention bias in peripersonal space (PPS) affects reaction times in grammatical congruency judgments of nominal classifiers, suggesting that simulated semantics may include reenactment of attention. In this study, we contrasted a Chinese nominal classifier used with nouns denoting pinch grip objects with a classifier for nouns with big object referents in a pupil dilation experiment. Twenty Chinese native speakers read grammatical and ungrammatical classifier-noun combinations and made grammaticality judgment while their pupillary responses were measured. It was found that their pupils dilated significantly more to the pinch grip classifier than to the big object classifier, indicating attention simulation in PPS. Pupil dilations were also significantly larger with congruent trials on the whole than in incongruent trials, but crucially, congruency and classifier semantics were independent of each other. No such effects were found in controls. Copyright © 2017 Cognitive Science Society, Inc.

  20. De-individualized psychophysiological strain assessment during a flight simulation test—Validation of a space methodology

    Science.gov (United States)

    Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen

For the evaluation of an operator's skill reliability, indicators of work quality as well as of psychophysiological states during the work have to be considered. The herein presented methodology and measurement equipment were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. However, in this study the method was applied to a comparable terrestrial task—the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for a statistical verification of the space methodology. For the evaluation of the strain level of applicants during the FST, psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV), which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load. Its changes and increases will be interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT) and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method will possibly promote a wide range of other future applications in aviation and space psychology.
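The abstract does not specify how the "psychophysiological arousal vector" (PAV) is computed. As a loose illustration of the general idea it describes (calibration against an individual baseline, then a combination of autonomic channels weighted by that person's reaction pattern), one hypothetical scheme might look like the following; the channel choices, weights, and the function `arousal_index` are all invented for illustration.

```python
import numpy as np

def arousal_index(sample, baseline_mean, baseline_std, weights):
    """Z-score each physiological channel against the individual's
    calibration baseline, then combine with per-person weights that
    reflect the dominant autonomic reaction pattern."""
    z = (np.asarray(sample, float) - baseline_mean) / baseline_std
    w = np.asarray(weights, float)
    return float(np.dot(z, w / w.sum()))

# Hypothetical channels: heart rate (bpm), skin conductance (uS),
# respiration rate (breaths/min); baseline from a calibration day
baseline_mean = np.array([70.0, 2.0, 14.0])
baseline_std  = np.array([ 5.0, 0.5,  2.0])
weights       = np.array([ 0.5, 0.3,  0.2])   # individual reactivity
score = arousal_index([80.0, 2.5, 16.0], baseline_mean, baseline_std, weights)
```

Increases in such a combined score over the task, relative to the calibration day, would then be read as "strain" in the sense used above.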

  1. HUMAN SPACE FLIGHTS: FACTS AND DREAMS

    OpenAIRE

    Mariano Bizzarri; Enrico Saggese

    2011-01-01

Manned space flight has been the great human and technological adventure of the past half-century. By putting people into places and situations unprecedented in history, it has stirred the imagination while expanding and redefining the human experience. However, space exploration obliges humans to confront a hostile environment of cosmic radiation, microgravity, isolation and changes in the magnetic field. Any space traveler is therefore exposed to significant health threats. In the twenty-first ...

  2. Learning Spaces in Academic Libraries--A Review of the Evolving Trends

    Science.gov (United States)

    Turner, Arlee; Welch, Bernadette; Reynolds, Sue

    2013-01-01

    This paper presents a review of the professional discourse regarding the evolution of information and learning spaces in academic libraries, particularly in the first decade of the twenty-first century. It investigates the evolution of academic libraries and the development of learning spaces focusing on the use of the terms which have evolved…

  3. A Vision for ARES in the Twenty-First Century: The Virtual Community of Real Estate Thought Leaders

    OpenAIRE

    Stephen E. Roulac

    1996-01-01

    In the twenty-first century the American Real Estate Society (ARES) is a virtual community of real estate thought leaders, electronically interconnected and linked through the International Real Estate Society to counterpart organizations on all major continents as well as numerous country-specific societies. ARES growth is attributable to its emphasis on rigorous applied microeconomic decisionmaking and an inclusive, open style. The initiatives of the Strategic Planning Task Force, whose rep...

  4. Predicting climate change impacts on native and invasive tree species using radial growth and twenty-first century climate scenarios

    NARCIS (Netherlands)

    González-Muñoz, N.; Linares, J.C.; Castro-Díez, P.; Sass-Klaassen, U.G.W.

    2014-01-01

    The climatic conditions predicted for the twenty-first century may aggravate the extent and impacts of plant invasions, by favouring those invaders more adapted to altered conditions or by hampering the native flora. We aim to predict the fate of native and invasive tree species in the oak forests

  5. The first decade of commercial space tourism

    Science.gov (United States)

    Chang, Yi-Wei

    2015-03-01

In order to provide a basis for assessing the future prospects and challenges of space tourism, this paper begins with a brief overview of the history of space tourism. This is followed by a discussion on market demand and current developments in the academic community, as well as the status of traffic tools, regulations and legalization. On market demand, although studies conducted in the 1990s assumed the possibility of 500,000 space tourists per year and several billion USD of annual revenue, in 2008 a relatively modest 13,000 per year was predicted. At this time traffic transport tools including the Soyuz system, CST-100, DragonRider and International Space Station (ISS) can only provide a few tens of spare seats for space tourists per year, compared to the projected 20,000-plus seat capacity of the Lynx, Dream Chaser and SpaceShipTwo (SS2) fleets, which have the potential to conduct their first full suborbital test flight and first commercial flight within the coming decade. Added to this, the US government has only a regulatory regime that supports privately owned suborbital space tourism (SST) and no government-funded orbital space tourism (OST). This evidence reveals a very high and advantageous potential for SST to form a space tourism industry in the coming decade, whereas the possibility of OST is relatively low. However, even though the prosperity of SST in the coming years is to be expected, its maturity, reliability and safety still need to win the confidence of the general public. For example, the announcement of changes to the fuel used in the SS2 rocket engine in May 2014 and the crash of one SS2 during a test flight on 31 October 2014 indicate the need for much careful preparation, as any accident in commercial operation could seriously damage or even kill its future prospects.

  6. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    International Nuclear Information System (INIS)

    Fletcher, C.D.

    1986-01-01

The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes. 6 refs., 17 figs., 1 tab

  7. The Challenges of Teaching and Learning about Science in the Twenty-First Century: Exploring the Abilities and Constraints of Adolescent Learners

    Science.gov (United States)

    Anderman, Eric M.; Sinatra, Gale M.; Gray, DeLeon L.

    2012-01-01

    In this article, we critically examine skills that are necessary for the effective learning of science in adolescent populations. We argue that a focus on twenty-first-century skills among adolescents within the context of science instruction must be considered in light of research on cognitive and social development. We first review adolescents'…

  8. Psychosocial value of space simulation for extended spaceflight

    Science.gov (United States)

    Kanas, N.

    1997-01-01

There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues that have been reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal

  9. Thermally Induced Vibrations of the Hubble Space Telescope's Solar Array 3 in a Test Simulated Space Environment

    Science.gov (United States)

    Early, Derrick A.; Haile, William B.; Turczyn, Mark T.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    NASA Goddard Space Flight Center and the European Space Agency (ESA) conducted a disturbance verification test on a flight Solar Array 3 (SA3) for the Hubble Space Telescope using the ESA Large Space Simulator (LSS) in Noordwijk, the Netherlands. The LSS cyclically illuminated the SA3 to simulate orbital temperature changes in a vacuum environment. Data acquisition systems measured signals from force transducers and accelerometers resulting from thermally induced vibrations of the SA3. The LSS with its seismic mass boundary provided an excellent background environment for this test. This paper discusses the analysis performed on the measured transient SA3 responses and provides a summary of the results.
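
    A minimal, hypothetical sketch of the kind of transient-response analysis described above: extracting the dominant vibration frequency from an accelerometer record with an FFT. The signal parameters (a 1.2 Hz decaying oscillation, 100 Hz sampling) are invented for illustration and are not from the SA3 test.

    ```python
    import numpy as np

    np.random.seed(0)

    def dominant_frequency(signal, sample_rate):
        """Return the peak frequency (Hz) of a transient record via an FFT."""
        windowed = signal * np.hanning(len(signal))   # Hann window reduces leakage
        spectrum = np.abs(np.fft.rfft(windowed))
        spectrum[0] = 0.0                             # ignore the DC component
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum)]

    # Synthetic accelerometer record: a decaying 1.2 Hz oscillation plus noise,
    # standing in for a thermally induced disturbance response.
    rate = 100.0                                      # samples per second
    t = np.arange(0, 60, 1.0 / rate)
    accel = np.exp(-t / 30.0) * np.sin(2 * np.pi * 1.2 * t) + 0.01 * np.random.randn(t.size)

    print(round(dominant_frequency(accel, rate), 2))  # → 1.2
    ```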

  10. Remapping simulated halo catalogues in redshift space

    OpenAIRE

    Mead, Alexander; Peacock, John

    2014-01-01

    We discuss the extension to redshift space of a rescaling algorithm, designed to alter the effective cosmology of a pre-existing simulated particle distribution or catalogue of dark matter haloes. The rescaling approach was initially developed by Angulo & White and was adapted and applied to halo catalogues in real space in our previous work. This algorithm requires no information other than the initial and target cosmological parameters, and it contains no tuned parameters. It is shown here ...

  11. Laboratory simulation of space plasma phenomena*

    Science.gov (United States)

    Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.

    2017-12-01

    Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated in detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to investigations such as plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and diagnostic development. *This work supported by the NRL Base Program.

  12. Solving the problems we face: the United States Environmental Protection Agency, sustainability, and the challenges of the twenty-first century

    Science.gov (United States)

    Addressing the problems of the twenty-first century will require new initiatives that complement traditional regulatory activities. Existing regulations, such as the Clean Air Act and Clean Water Act are important safety nets in the United States for protecting human health and t...

  13. First principles simulations

    International Nuclear Information System (INIS)

    Palummo, M.; Reining, L.; Ballone, P.

    1993-01-01

    In this paper we outline the major features of the ''ab-initio'' simulation scheme of Car and Parrinello, focusing on the physical ideas and computational details at the basis of its efficiency and success. We briefly review the main applications of the method. We discuss the limitations of the standard scheme, as well as recent developments proposed in order to extend the reach of the method. Moreover, we consider two specific subjects in more detail. First, we describe a simple improvement (Gradient Corrections) on the basic approximation of the ''ab-initio'' simulation, i.e. the Local Density Approximation. These corrections can be easily and efficiently included in the Car-Parrinello code, bringing computed structural and cohesive properties significantly closer to their experimental values. Finally, we discuss the choice of the pseudopotential, with special attention to the possibilities and limitations of the last generation of soft pseudopotentials. (orig.)

  14. The renaissance of word-of-mouth marketing: A new standard in twenty-first century marketing management?!

    OpenAIRE

    Meiners, Norbert H.; Schwarting, Ulf; Seeberger, Bernd

    2010-01-01

    In this paper the importance of word of mouth for marketing management in the twenty-first century will be discussed. After a short introduction, there will be a focus on the demarcations and problems of traditional marketing. Then, in the third section, word of mouth (WOM) and word-of-mouth marketing (WOMM) as a 'new' standard in modern marketing are described. The fourth section broaches the importance of word of mouth and word-of-mouth marketing from the point of view of business and consu...

  15. Challenges and Opportunities for Occupational Epidemiology in the Twenty-first Century.

    Science.gov (United States)

    Stayner, L T; Collins, J J; Guo, Y L; Heederik, D; Kogevinas, M; Steenland, K; Wesseling, C; Demers, P A

    2017-09-01

    There are many opportunities and challenges for conducting occupational epidemiologic studies today. In this paper, we summarize the discussion of a symposium held at the Epidemiology in Occupational Health (EPICOH) conference, Chicago 2014, on challenges for occupational epidemiology in the twenty-first century. The increasing number of publications and attendance at our conferences suggests that worldwide interest in occupational epidemiology has been growing. There are clearly abundant opportunities for new research in occupational epidemiology. Areas ripe for further work include developing improved methods for exposure assessment, statistical analysis, studying migrant workers and other vulnerable populations, the use of biomarkers, and new hazards. Several major challenges are also discussed such as the rapidly changing nature and location of work, lack of funding, and political/legal conflicts. As long as work exists there will be occupational diseases that demand our attention, and a need for epidemiologic studies designed to characterize these risks and to support the development of preventive strategies. Despite the challenges and given the important past contribution in this field, we are optimistic about the importance and continued vitality of the research field of occupational epidemiology.

  16. Deep Space Navigation and Timing Architecture and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm will develop a deep space navigation and timing architecture and associated simulation, incorporating state-of-the art radiometric, x-ray pulsar, and laser...

  17. Twenty-one lectures on complex analysis a first course

    CERN Document Server

    Isaev, Alexander

    2017-01-01

    At its core, this concise textbook presents standard material for a first course in complex analysis at the advanced undergraduate level. This distinctive text will prove most rewarding for students who have a genuine passion for mathematics as well as certain mathematical maturity. Primarily aimed at undergraduates with working knowledge of real analysis and metric spaces, this book can also be used to instruct a graduate course. The text uses a conversational style with topics purposefully apportioned into 21 lectures, providing a suitable format for either independent study or lecture-based teaching. Instructors are invited to rearrange the order of topics according to their own vision. A clear and rigorous exposition is supported by engaging examples and exercises unique to each lecture; a large number of exercises contain useful calculation problems. Hints are given for a selection of the more difficult exercises. This text furnishes the reader with a means of learning complex analysis as well as a subtl...

  18. Issues in visual support to real-time space system simulation solved in the Systems Engineering Simulator

    Science.gov (United States)

    Yuen, Vincent K.

    1989-01-01

    The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible, windows and CCTVs, with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.

  19. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    Science.gov (United States)

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members, and well-defined countermeasures do not yet exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field around the crew was modeled. This is due to the large production of secondary radiation taking place in the material surrounding the habitat. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  20. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Science.gov (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    The attitude information acquisition of unresolved space objects, such as micro/nano satellites and GEO objects observed with ground-based optical telescopes, is a challenge for space surveillance. In this paper, a method is proposed to estimate the space object (SO) attitude state from simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics can be used for attitude inversion, providing a new approach to space object identification.
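
    As a hedged illustration of the simulation approach this abstract describes (shape model plus BRDF yielding a photometric curve), the sketch below generates the light curve of a single flat, Lambertian facet spinning at a fixed rate. The one-facet geometry, Lambertian BRDF, and viewing directions are assumptions for illustration, not the paper's actual model.

    ```python
    import numpy as np

    def plate_lightcurve(spin_period, times, sun_dir, obs_dir):
        """Relative brightness of a flat Lambertian plate spinning about the z axis.

        Brightness ~ max(0, n.s) * max(0, n.o): the facet must face both the
        Sun (s) and the observer (o) to contribute (a crude one-facet model).
        """
        phase = 2 * np.pi * times / spin_period
        normals = np.stack([np.cos(phase), np.sin(phase), np.zeros_like(phase)], axis=1)
        return np.clip(normals @ sun_dir, 0, None) * np.clip(normals @ obs_dir, 0, None)

    t = np.linspace(0, 20, 400)                      # seconds
    sun = np.array([1.0, 0.0, 0.0])                  # Sun direction (unit vector)
    obs = np.array([np.cos(np.radians(30)), np.sin(np.radians(30)), 0.0])
    curve = plate_lightcurve(spin_period=10.0, times=t, sun_dir=sun, obs_dir=obs)
    # One glint per rotation: the light-curve period equals the 10 s spin
    # period, so the spin state leaves a direct signature in the photometry.
    ```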

  1. Research and development at the Marshall Space Flight Center Neutral Buoyancy Simulator

    Science.gov (United States)

    Kulpa, Vygantas P.

    1987-01-01

    The Neutral Buoyancy Simulator (NBS), a facility designed to imitate zero-gravity conditions, was used to test the Experimental Assembly of Structures in Extravehicular Activity (EASE) and the Assembly Concept for Construction of Erectable Space Structures (ACCESS). Neutral Buoyancy Simulator applications and operations; early space structure research; development of the EASE/ACCESS experiments; and improvement of NBS simulation are summarized.

  2. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced by an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  3. Formatively Assessing Teamwork in Technology-Enabled Twenty-First Century Classrooms: Exploratory Findings of a Teamwork Awareness Programme in Singapore

    Science.gov (United States)

    Koh, Elizabeth; Hong, Helen; Tan, Jennifer Pei-Ling

    2018-01-01

    Teamwork, one of the core competencies for the twenty-first century learner, is a critical skill for work and learning. However, assessing teamwork is complex, in particular, developing a measure of teamwork that is domain-generic and applicable across a wide range of learners. This paper documents one such study that leverages technology to help…

  4. Generation of initial kinetic distributions for simulation of long-pulse charged particle beams with high space-charge intensity

    Directory of Open Access Journals (Sweden)

    Steven M. Lund

    2009-11-01

    Self-consistent Vlasov-Poisson simulations of beams with high space-charge intensity often require specification of initial phase-space distributions that reflect properties of a beam that is well adapted to the transport channel—both in terms of low-order rms (envelope) properties and the higher-order phase-space structure. Here, we first review broad classes of kinetic distributions commonly in use as initial Vlasov distributions in simulations of unbunched or weakly bunched beams with intense space-charge fields, including the following: the Kapchinskij-Vladimirskij (KV) equilibrium, continuous-focusing equilibria with specific detailed examples, and various nonequilibrium distributions, such as the semi-Gaussian distribution and distributions formed from specified functions of linear-field Courant-Snyder invariants. Important practical details necessary to specify these distributions in terms of standard accelerator inputs are presented in a unified format. Building on this presentation, a new class of approximate initial kinetic distributions is constructed using transformations that preserve linear-focusing single-particle Courant-Snyder invariants to map initial continuous-focusing equilibrium distributions to a form more appropriate for noncontinuous focusing channels. Self-consistent particle-in-cell simulations are employed to show that the approximate initial distributions generated in this manner are better adapted to the focusing channels for beams with high space-charge intensity. This improved capability enables simulations that more precisely probe intrinsic stability properties and machine performance.
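
    To make one of the distribution classes named above concrete, here is a small sketch (not from the paper) that samples a transverse semi-Gaussian beam: spatially uniform over a disk, with an isotropic Gaussian velocity spread. The radius and velocity-spread values are arbitrary illustration choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_semi_gaussian(n, radius, v_rms):
        """Sample a transverse semi-Gaussian beam distribution: spatially
        uniform within a circle of the given radius, with an isotropic
        Gaussian velocity spread of rms v_rms per axis."""
        # radius ~ sqrt(u) gives uniform areal density over the disk
        r = radius * np.sqrt(rng.random(n))
        theta = 2 * np.pi * rng.random(n)
        x, y = r * np.cos(theta), r * np.sin(theta)
        vx, vy = v_rms * rng.standard_normal(n), v_rms * rng.standard_normal(n)
        return x, y, vx, vy

    x, y, vx, vy = sample_semi_gaussian(100_000, radius=1.0, v_rms=0.5)
    # For a uniform disk the rms size per axis is radius / 2, so both
    # std(x) and std(vx) should come out near 0.5 here.
    print(round(np.std(x), 2), round(np.std(vx), 2))
    ```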

  5. DataSpaces: An Interaction and Coordination Framework for Coupled Simulation Workflows

    International Nuclear Information System (INIS)

    Docan, Ciprian; Klasky, Scott A.; Parashar, Manish

    2010-01-01

    Emerging high-performance distributed computing environments are enabling new end-to-end formulations in science and engineering that involve multiple interacting processes and data-intensive application workflows. For example, current fusion simulation efforts are exploring coupled models and codes that simultaneously simulate separate application processes, such as the core and the edge turbulence, and run on different high performance computing resources. These components need to interact, at runtime, with each other and with services for data monitoring, data analysis and visualization, and data archiving. As a result, they require efficient support for dynamic and flexible couplings and interactions, which remains a challenge. This paper presents DataSpaces, a flexible interaction and coordination substrate that addresses this challenge. DataSpaces essentially implements a semantically specialized virtual shared space abstraction that can be associatively accessed by all components and services in the application workflow. It enables live data to be extracted from running simulation components, indexes this data online, and then allows it to be monitored, queried and accessed by other components and services via the space using semantically meaningful operators. The underlying data transport is asynchronous, low-overhead and largely memory-to-memory. The design, implementation, and experimental evaluation of DataSpaces using a coupled fusion simulation workflow is presented.

  6. Simulating cosmic microwave background maps in multiconnected spaces

    International Nuclear Information System (INIS)

    Riazuelo, Alain; Uzan, Jean-Philippe; Lehoucq, Roland; Weeks, Jeffrey

    2004-01-01

    This paper describes the computation of cosmic microwave background (CMB) anisotropies in a universe with multiconnected spatial sections and focuses on the implementation of the topology in standard CMB computer codes. The key ingredient is the computation of the eigenmodes of the Laplacian with boundary conditions compatible with multiconnected space topology. The correlators of the coefficients of the decomposition of the temperature fluctuation in spherical harmonics are computed and examples are given for spatially flat spaces and one family of spherical spaces, namely, the lens spaces. Under the hypothesis of Gaussian initial conditions, these correlators encode all the topological information of the CMB and suffice to simulate CMB maps
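
    The key ingredient named above, a discrete Laplacian eigenmode spectrum imposed by the topology, can be illustrated for the simplest flat multiconnected space, a cubic 3-torus: periodic boundary conditions restrict the eigenmodes to wavevectors k = 2*pi*n/L with integer n. This is an illustrative sketch, not the paper's code.

    ```python
    import numpy as np
    from itertools import product

    def torus_wavenumbers(L, n_max):
        """Allowed eigenmode wavenumbers |k| on a cubic 3-torus of side L.

        Periodicity quantizes k = 2*pi*n/L (n an integer vector), so the
        spectrum is discrete with an infrared cutoff at |k| = 2*pi/L.
        """
        ks = set()
        for n in product(range(-n_max, n_max + 1), repeat=3):
            if n != (0, 0, 0):
                ks.add(round(2 * np.pi * np.linalg.norm(n) / L, 10))
        return sorted(ks)

    modes = torus_wavenumbers(L=1.0, n_max=2)
    print(round(modes[0] / (2 * np.pi), 3))  # → 1.0 : lowest mode at |k| = 2*pi/L
    ```

    The infrared cutoff is what suppresses large-angle CMB power in such models, which is why the mode spectrum suffices (with Gaussian initial conditions) to simulate the maps.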

  7. Two-Dimensional Electronic Spectroscopy of Benzene, Phenol, and Their Dimer: An Efficient First-Principles Simulation Protocol.

    Science.gov (United States)

    Nenov, Artur; Mukamel, Shaul; Garavelli, Marco; Rivalta, Ivan

    2015-08-11

    First-principles simulations of two-dimensional electronic spectroscopy in the ultraviolet region (2DUV) require computationally demanding multiconfigurational approaches that can resolve doubly excited and charge transfer states, the spectroscopic fingerprints of coupled UV-active chromophores. Here, we propose an efficient approach to reduce the computational cost of accurate simulations of 2DUV spectra of benzene, phenol, and their dimer (i.e., the minimal models for studying electronic coupling of UV-chromophores in proteins). We first establish the multiconfigurational recipe with the highest accuracy by comparison with experimental data, providing reference gas-phase transition energies and dipole moments that can be used to construct exciton Hamiltonians involving high-lying excited states. We show that by reducing the active spaces and the number of configuration state functions within restricted active space schemes, the computational cost can be significantly decreased without loss of accuracy in predicting 2DUV spectra. The proposed recipe has been successfully tested on a realistic model proteic system in water. Accounting for line broadening due to thermal and solvent-induced fluctuations allows for direct comparison with experiments.

  8. Gendering inequality: a note on Piketty's Capital in the twenty-first century.

    Science.gov (United States)

    Perrons, Diane

    2014-12-01

    Thomas Piketty's Capital in the Twenty-First Century is remarkable for moving inequality from the margins to mainstream debate through detailed analysis of longitudinal statistics and, for an economist, by advocating an interdisciplinary perspective and writing in a witty and accessible style. With reference to the post 1970 period, when wage increases are largely responsible for the increase in inequality, Piketty shows how patrimonial capitalists (elite managers) in the top decile and centile of the distribution appropriate a growing share of social wealth as a consequence of their 'power to set their own remuneration' in the context of tolerant social norms rather than through their productive contributions. Piketty raises but defers the question of where these social norms come from to other disciplines. A Feminist Economics perspective indicates that these questions are central to a more inclusive form of economic analysis and such an approach would enrich Piketty's analysis in two main ways. First, by paying greater attention to the processes and social norms through which inequalities are produced and justified and second by highlighting the ways in which inequality is experienced differently depending not only on class, but also on other aspects of identity including gender. This approach also suggests that it is necessary to supplement the ex-post redistributive policies recommended by Piketty: a global wealth tax and more steeply progressive income tax, with ex-ante measures to stop the rise in wage inequality in the first place, especially by bridging the huge gulf that exists between those who care for people and those who manage money. © London School of Economics and Political Science 2014.

  9. Sea-level rise and its possible impacts given a ‘beyond 4°C world’ in the twenty-first century

    NARCIS (Netherlands)

    Nicholls, R.; Marinova, N.A.; Lowe, J.; Brown, S.; Vellinga, P.

    2011-01-01

    The range of future climate-induced sea-level rise remains highly uncertain with continued concern that large increases in the twenty-first century cannot be ruled out. The biggest source of uncertainty is the response of the large ice sheets of Greenland and west Antarctica. Based on our analysis,

  10. Next Generation Simulation Framework for Robotic and Human Space Missions

    Science.gov (United States)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  11. First-ever evening public engine test of a Space Shuttle Main Engine

    Science.gov (United States)

    2001-01-01

    Thousands of people watch the first-ever evening public engine test of a Space Shuttle Main Engine at NASA's John C. Stennis Space Center. The spectacular test marked Stennis Space Center's 20th anniversary celebration of the first Space Shuttle mission.

  12. The First Superconductivity Experiment in Space

    International Nuclear Information System (INIS)

    Polturak, E.; Koren, G.

    1999-01-01

    One of the most promising applications of high Tc superconductors is in the field of satellite communications. In view of the rapidly increasing demand for satellite communication channels due to the formation of global networks of cellular phones, internet, etc., one needs to develop more efficient ways of dividing the finite frequency band into more and more channels without paying for it with excessive interference or the increasingly large weight of conventional filters. Superconductive components can save an order of magnitude in the weight and volume of such filters, a very important factor in satellite design. Yet, until now superconductors had never been tested in space. We present the design and performance of the first such experiment to reach space. The experiment consists of a thin film HTSC device integrated with a miniature cryocooler. It was launched into space in July 1998 aboard the TechSAT-II microsatellite. We present data obtained from this experiment to date. Long-term survivability of HTSC devices in space is also discussed.

  13. Changing ideas in forestry: A comparison of concepts in Swedish and American forestry journals during the early twentieth and twenty-first centuries.

    Science.gov (United States)

    Mårald, Erland; Langston, Nancy; Sténs, Anna; Moen, Jon

    2016-02-01

    By combining digital humanities text-mining tools and a qualitative approach, we examine changing concepts in forestry journals in Sweden and the United States (US) in the early twentieth and early twenty-first centuries. Our first hypothesis is that foresters at the beginning of the twentieth century were more concerned with production and less concerned with ecology than foresters at the beginning of the twenty-first century. Our second hypothesis is that US foresters in the early twentieth century were less concerned with local site conditions than Swedish foresters. We find that early foresters in both countries had broader-and often ecologically focused-concerns than hypothesized. Ecological concerns in the forestry literature have increased, but in the Nordic countries, production concerns have increased as well. In both regions and both time periods, timber management is closely connected to concerns about governance and state power, but the forms that governance takes have changed.

  14. Global threats from invasive alien species in the twenty-first century and national response capacities

    Science.gov (United States)

    Early, Regan; Bradley, Bethany A.; Dukes, Jeffrey S.; Lawler, Joshua J.; Olden, Julian D.; Blumenthal, Dana M.; Gonzalez, Patrick; Grosholz, Edwin D.; Ibañez, Ines; Miller, Luke P.; Sorte, Cascade J. B.; Tatem, Andrew J.

    2016-01-01

    Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion. PMID:27549569

  15. How Has Elderly Migration Changed in the Twenty-First Century? What the Data Can-and Cannot-Tell Us.

    Science.gov (United States)

    Conway, Karen Smith; Rork, Jonathan C

    2016-08-01

    Interstate elderly migration has strong implications for state tax policies and health care systems, yet little is known about how it has changed in the twenty-first century. Its relative rarity requires a large data set with which to construct reliable measures, and the replacement of the U.S. Census long form (CLF) with the American Community Survey (ACS) has made such updates difficult. Two commonly used alternative migration data sources, the Current Population Survey (CPS) and the Statistics of Income (SOI) program of the Internal Revenue Service (IRS), suffer serious limitations in studying the migration of any subpopulation, including the elderly. Our study informs migration research in the post-2000 era by identifying methodological differences between data sources and devising strategies for reconciling the CLF and ACS. Our investigation focusing on the elderly suggests that the ACS can generate comparable migration data that reveal a continuation of previously identified geographic patterns as well as changes unique to the 2000s. However, its changed definition of residence and survey timing leaves us unable to construct a comparable national migration rate, suggesting that one must use national trends in the smaller CPS to investigate whether elderly migration has increased or decreased in the twenty-first century.

  16. Extremophiles survival to simulated space conditions: an astrobiology model study.

    Science.gov (United States)

    Mastascusa, V; Romano, I; Di Donato, P; Poli, A; Della Corte, V; Rotundi, A; Bussoletti, E; Quarto, M; Pugliese, M; Nicolaus, B

    2014-09-01

    In this work we investigated the ability of four extremophilic strains from the Archaea and Bacteria domains to resist the space environment by exposing them to extreme conditions of temperature, UV radiation, and desiccation, coupled with the low pressure generated in a Mars conditions simulator. All the investigated extremophilic strains (namely Sulfolobus solfataricus, Haloterrigena hispanica, Thermotoga neapolitana and Geobacillus thermantarcticus) showed good resistance to the simulated temperature variation in space; on the other hand, irradiation with UV at 254 nm only slightly affected the growth of H. hispanica, G. thermantarcticus and S. solfataricus; finally, exposure to simulated Mars conditions showed that H. hispanica and G. thermantarcticus were resistant to desiccation and low pressure.

  17. A Simulation-Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    NASA's human space program has developed considerable experience with near-Earth space operations. Although NASA has experience with deep space robotic missions, it has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will involve durations measured in months and communication latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these mission durations and communication latencies for vehicle design, mission design, and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between a spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission, between July 18 and August 3, 2016. 
The NEEMO

  18. Bruce's Magnificent Quartet: Inquiry, Community, Technology and Literacy--Implications for Renewing Qualitative Research in the Twenty-First Century

    Science.gov (United States)

    Davidson, Judith

    2014-01-01

    Bruce and Bishop's community informatics work brings forward four critical concepts: inquiry, community, technology, and literacy. These four terms serve as the basis for a discussion of qualitative research in the twenty-first century--what is lacking and what is needed. The author suggests that to resolve the tensions or challenges…

  19. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (DSES)

    Science.gov (United States)

    Phillips, Robert; Dexter, Dan; Hasan, David; Crues, Edwin Z.

    2007-01-01

    This document describes the federate initialization process that was developed at the NASA Johnson Space Center with the H-II Transfer Vehicle Flight Controller Trainer (HTV FCT) simulations and refined in the Distributed Space Exploration Simulation (DSES). These simulations use the High Level Architecture (HLA) IEEE 1516 standard to provide communication and coordination between the distributed parts of the simulation. The purpose of the paper is to describe a generic initialization sequence that can be used to create a federate that can: (1) properly initialize all HLA objects, object instances, interactions, and time management; (2) check for the presence of all federates; (3) coordinate startup with other federates; and (4) robustly initialize and share initial object instance data with other federates.
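The four-step startup sequence described above can be illustrated with a toy synchronization model. This is a minimal sketch under stated assumptions: the `Federate` class, the barrier-based sync point, and the shared dictionary are hypothetical stand-ins for the HLA federation services, not the DSES or HLA API.

```python
import threading

# Hypothetical sketch of the four-step federate startup sequence. `Federate`,
# the Barrier, and the shared dict stand in for HLA services; they are not
# the actual DSES API.
class Federate:
    def __init__(self, name, barrier, shared):
        self.name = name
        self.barrier = barrier    # stands in for an HLA synchronization point
        self.shared = shared      # stands in for HLA object instance data
        self.initialized = False

    def run(self):
        # Step 1: initialize local objects, instances, and time management.
        self.initialized = True
        # Steps 2-3: wait until all expected federates are present, then
        # coordinate startup (every federate blocks here until all arrive).
        self.barrier.wait()
        # Step 4: share initial object instance data with the federation.
        self.shared[self.name] = {"ready": True}

shared = {}
barrier = threading.Barrier(3)                 # three federates expected
feds = [Federate(n, barrier, shared) for n in ("a", "b", "c")]
threads = [threading.Thread(target=f.run) for f in feds]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(shared))                          # ['a', 'b', 'c']
```

The barrier captures the key property of the coordinated startup: no federate publishes its initial data until every federate has finished local initialization.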

  20. Twenty-First Century Instructional Classroom Practices and Reading Motivation: Probing the Effectiveness of Interventional Reading Programs

    Directory of Open Access Journals (Sweden)

    Taoufik Boulhrir

    2017-07-01

    Twenty-first century education has undoubtedly witnessed changes in the definition of literacy to cope with economic, social, and intellectual trends. Technological advances, along with skills of communication, creativity, critical thinking, and collaboration, have become key in education, especially when dealing with literacy and reading motivation. As motivation hinges on two major theoretical approaches, intrinsic and extrinsic, numerous studies argue that the first is more sustainable in enhancing reading motivation. Accordingly, many research-based interventional programs have emerged since the late nineties with increasing popularity to offer answers to the dwindling rates of reading among youth. This article discusses traits of 21st century education in light of current trends and challenges as it probes the effectiveness of some interventional programs that are meant, and argued, to enhance literacy skills and reading motivation.

  1. Accelerators for the twenty-first century - a review

    International Nuclear Information System (INIS)

    Wilson, E.J.N.

    1990-01-01

    Modern synchrotrons and storage rings are based upon the electrical technology of the 1900s boosted by the microwave radar techniques of World War II. This method of acceleration now seems to be approaching its practical limit. It is high time that we seek a new physical acceleration mechanism to provide the higher energies and luminosities needed to continue particle physics beyond the machines now on the stocks. Twenty years is a short time in which to invent, develop, and construct such a device. Without it, high-energy physics may well come to an end. Particle physicists and astrophysicists are invited to join accelerator specialists in the hunt for this new principle. This report analyses the present limitations of colliders and explores some of the directions in which one might look to find a new principle. Chapters cover proton colliders, electron-positron colliders, linear colliders, and two-beam accelerators; transverse fields, wake-field and beat-wave accelerators, ferroelectric crystals, and acceleration in astrophysics. (orig.)

  2. Desdemona and a ticket to space; training for space flight in a 3g motion simulator

    NARCIS (Netherlands)

    Wouters, M.

    2014-01-01

    On October 5, 2013, Marijn Wouters and two other contestants of a nation-wide competition ‘Nederland Innoveert’ underwent a space training exercise. One by one, the trainees were pushed to their limits in the Desdemona motion simulator, an experience that mimicked the Space Expedition Corporation

  3. An analysis of players\\' performances in the first cricket Twenty20 ...

    African Journals Online (AJOL)

    The purpose of this paper is to show how batting and bowling performance measures for one-day internationals can be adapted for use in Twenty20 matches, specifically in the case of a very small number of matches played. These measures are then used to give rankings of the batsmen and bowlers who performed best in ...

  4. Future projections of synoptic weather types over the Arabian Peninsula during the twenty-first century using an ensemble of CMIP5 models

    KAUST Repository

    El Kenawy, Ahmed M.; McCabe, Matthew

    2016-01-01

    An assessment of future change in synoptic conditions over the Arabian Peninsula throughout the twenty-first century was performed using 20 climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. We employed the mean

  5. Effects of repeated simulated removal activities on feral swine movements and space use

    Science.gov (United States)

    Fischer, Justin W.; McMurtry , Dan; Blass, Chad R.; Walter, W. David; Beringer, Jeff; VerCauterren, Kurt C.

    2016-01-01

    Abundance and distribution of feral swine (Sus scrofa) in the USA have increased dramatically during the last 30 years. Effective measures are needed to control and eradicate feral swine populations without displacing animals over wider areas. Our objective was to investigate the effects of repeated simulated removal activities on feral swine movements and space use. We analyzed location data from 21 feral swine that we fitted with Global Positioning System harnesses in southern Missouri, USA. Various removal activities were applied over time to eight feral swine before lethal removal, including trapping and release, chasing with dogs, chasing with a hunter, and chasing with a helicopter. We found that core space-use areas were reduced following the first removal activity, whereas overall space-use areas and diurnal movement distances increased following the second removal activity. Mean geographic centroid shifts did not differ between pre- and post-periods for either the first or second removal activity. Our information on feral swine movements and space use precipitated by human removal activities, such as hunting, trapping, and chasing with dogs, helps fill a knowledge void and will aid wildlife managers. Strategies to optimize management are needed to reduce feral swine populations while preventing enlarged home ranges and displaced individuals, which could lead to increased disease transmission risk and human-feral swine conflict in adjacent areas.

  6. An alternative phase-space distribution to sample initial conditions for classical dynamics simulations

    International Nuclear Information System (INIS)

    Garcia-Vela, A.

    2002-01-01

    A new quantum-type phase-space distribution is proposed in order to sample initial conditions for classical trajectory simulations. The phase-space distribution is obtained as the modulus of a quantum phase-space state of the system, defined as the direct product of the coordinate and momentum representations of the quantum initial state. The distribution is tested by sampling initial conditions which reproduce the initial state of the Ar-HCl cluster prepared by ultraviolet excitation, and by simulating the photodissociation dynamics by classical trajectories. The results are compared with those of a wave packet calculation, and with a classical simulation using an initial phase-space distribution recently suggested. A better agreement is found between the classical and the quantum predictions with the present phase-space distribution, as compared with the previous one. This improvement is attributed to the fact that the phase-space distribution propagated classically in this work resembles more closely the shape of the wave packet propagated quantum mechanically
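The sampling idea can be illustrated with a minimal sketch, assuming a 1-D harmonic oscillator ground state in place of the Ar-HCl cluster: for a Gaussian state, the product of the coordinate and momentum densities is a separable Gaussian in phase space, which is easy to sample and to propagate classically. All numerical values below (hbar = m = omega = 1) are illustrative assumptions.

```python
import math
import random

# Illustrative sketch (not the Ar-HCl system): sample classical initial
# conditions from a separable phase-space density given by the product of
# the coordinate and momentum densities of a 1-D harmonic oscillator ground
# state. With hbar = m = omega = 1, sigma_x = sigma_p = 1/sqrt(2).
random.seed(0)
SIGMA = 1.0 / math.sqrt(2.0)

def sample_initial_conditions(n):
    """Draw (x, p) pairs from |psi(x)|^2 * |phi(p)|^2."""
    return [(random.gauss(0.0, SIGMA), random.gauss(0.0, SIGMA))
            for _ in range(n)]

def propagate(x, p, dt=0.01, steps=1000):
    """Classical trajectory in V(x) = x^2 / 2 using velocity Verlet."""
    for _ in range(steps):
        p -= 0.5 * dt * x          # force = -dV/dx = -x
        x += dt * p
        p -= 0.5 * dt * x
    return x, p

ensemble = sample_initial_conditions(5000)
# Ensemble-average classical energy; for this density it should lie near
# the quantum zero-point value <H> = 0.5 in these units.
energy = sum(0.5 * p * p + 0.5 * x * x for x, p in ensemble) / len(ensemble)
```

Each sampled pair can then be fed to `propagate` to build a swarm of classical trajectories whose statistics are compared against the quantum wave-packet result, as in the abstract.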

  7. Time-Accurate Unsteady Pressure Loads Simulated for the Space Launch System at Wind Tunnel Conditions

    Science.gov (United States)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, William L.; Glass, Christopher E.; Streett, Craig L.; Schuster, David M.

    2015-01-01

    A transonic flow field about a Space Launch System (SLS) configuration was simulated with the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics (CFD) code at wind tunnel conditions. Unsteady, time-accurate computations were performed using second-order Delayed Detached Eddy Simulation (DDES) for up to 1.5 physical seconds. The surface pressure time history was collected at 619 locations, 169 of which matched locations on a 2.5 percent wind tunnel model that was tested in the 11 ft. x 11 ft. test section of the NASA Ames Research Center's Unitary Plan Wind Tunnel. Comparisons between computation and experiment showed that the peak surface pressure RMS level occurs behind the forward attach hardware, and good agreement for frequency and power was obtained in this region. Computational domain, grid resolution, and time step sensitivity studies were performed, including an investigation of pseudo-time sub-iteration convergence. Using these sensitivity studies and experimental data comparisons, a set of best practices to date has been established for FUN3D simulations for SLS launch vehicle analysis. To the author's knowledge, this is the first time DDES has been used in a systematic approach to establish the simulation time needed to analyze unsteady pressure loads on a space launch vehicle such as the NASA SLS.

  8. Diverging seasonal extremes for ocean acidification during the twenty-first century

    Science.gov (United States)

    Kwiatkowski, Lester; Orr, James C.

    2018-01-01

    How ocean acidification will affect marine organisms depends on changes in both the long-term mean and the short-term temporal variability of carbonate chemistry [1-8]. Although the decadal-to-centennial response to atmospheric CO2 and climate change is constrained by observations and models [1,9], little is known about corresponding changes in seasonality [10-12], particularly for pH. Here we assess the latter by analysing nine earth system models (ESMs) forced with a business-as-usual emissions scenario [13]. During the twenty-first century, the seasonal cycle of surface-ocean pH was attenuated by 16 ± 7%, on average, whereas that for hydrogen ion concentration [H+] was amplified by 81 ± 16%. Simultaneously, the seasonal amplitude of the aragonite saturation state (Ωarag) was attenuated except in the subtropics, where it was amplified. These contrasting changes derive from regionally varying sensitivities of these variables to atmospheric CO2 and climate change and to diverging trends in seasonal extremes in the primary controlling variables (temperature, dissolved inorganic carbon and alkalinity). Projected seasonality changes will tend to exacerbate the impacts of increasing [H+] on marine organisms during the summer and ameliorate the impacts during the winter, although the opposite holds in the high latitudes. Similarly, over most of the ocean, impacts from declining Ωarag are likely to be intensified during the summer and dampened during the winter.

  9. Thermal System Upgrade of the Space Environment Simulation Test Chamber

    Science.gov (United States)

    Desai, Ashok B.

    1997-01-01

    The paper deals with the refurbishing and upgrade of the thermal system for the existing thermal vacuum test facility, the Space Environment Simulator, at NASA's Goddard Space Flight Center. The chamber is the largest such facility at the center. This upgrade is the third phase of the long-range upgrade of the chamber that has been underway for the last few years. The first phase dealt with its vacuum system; the second phase involved the GHe subsystem. The paper describes the design philosophy options considered for the thermal system; the approaches taken and methodology applied in evaluating the remaining "life" in the chamber shrouds and related equipment through special tests and studies; the feasibility and extent of automation, using computer interfaces and Programmable Logic Controllers in the control system; and, finally, the matching of old components to new ones in an integrated, highly reliable and cost-effective thermal system for the facility. This multi-year project has just started, and the paper deals mainly with the plans and approaches to implement it successfully within schedule and costs.

  10. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    Science.gov (United States)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers, where each participating center has a specific role and specialization. In this vision, the Constellation Program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems of various parameter sets can be simulated. 
Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In

  11. Use of Comics to Enhance Students' Learning for the Development of the Twenty-First Century Competencies in the Mathematics Classroom

    Science.gov (United States)

    Toh, Tin Lam; Cheng, Lu Pien; Ho, Siew Yin; Jiang, Heng; Lim, Kam Ming

    2017-01-01

    This paper discusses the use of comics in teaching mathematics in the secondary mathematics classroom. We explicate how the use of comics in teaching mathematics can prepare students for the twenty-first century competencies. We developed an alternative teaching package using comics for two lower secondary mathematics topics. This alternative…

  12. Saving time in a space-efficient simulation algorithm

    NARCIS (Netherlands)

    Markovski, J.

    2011-01-01

    We present an efficient algorithm for computing the simulation preorder and equivalence for labeled transition systems. The algorithm builds on an existing space-efficient algorithm and improves its time complexity by employing a variant of the stability condition and exploiting properties of the

  13. The era of the wandering mind? Twenty-first century research on self-generated mental activity

    Directory of Open Access Journals (Sweden)

    Felicity eCallard

    2013-12-01

    The first decade of the twenty-first century was characterized by renewed scientific interest in self-generated mental activity (activity largely generated by the individual, rather than in response to experimenters' instructions or specific external sensory inputs). To understand this renewal of interest, we interrogated the peer-reviewed literature from 2003-2012 (i) to explore recent changes in the use of terms for self-generated mental activity; (ii) to investigate changes in the topics on which mind wandering research, specifically, focuses; and (iii) to visualize co-citation communities amongst researchers working on self-generated mental activity. Our analyses demonstrated that there has been a dramatic increase in use of the term mind wandering, and a significant crossing-over of psychological investigations of mind wandering, specifically, into cognitive neuroscience. If this is, indeed, the 'era of the wandering mind', our paper calls for more explicit reflection by mind wandering researchers on the terms they use, the topics and brain regions they focus on, and the research literatures that they implicitly foreground or ignore as not relevant.

  14. Thermography During Thermal Test of the Gaia Deployable Sunshield Assembly Qualification Model in the ESTEC Large Space Simulator

    Science.gov (United States)

    Simpson, R.; Broussely, M.; Edwards, G.; Robinson, D.; Cozzani, A.; Casarosa, G.

    2012-07-01

    The National Physical Laboratory (NPL) and The European Space Research and Technology Centre (ESTEC) have performed for the first time successful surface temperature measurements using infrared thermal imaging in the ESTEC Large Space Simulator (LSS) under vacuum and with the Sun Simulator (SUSI) switched on during thermal qualification tests of the GAIA Deployable Sunshield Assembly (DSA). The thermal imager temperature measurements, with radiosity model corrections, show good agreement with thermocouple readings on well characterised regions of the spacecraft. In addition, the thermal imaging measurements identified potentially misleading thermocouple temperature readings and provided qualitative real-time observations of the thermal and spatial evolution of surface structure changes and heat dissipation during hot test loadings, which may yield additional thermal and physical measurement information through further research.

  15. Virtual Reality: Teaching Tool of the Twenty-First Century?

    Science.gov (United States)

    Hoffman, Helene; Vu, Dzung

    1997-01-01

    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  16. State-Space Equations and the First-Phase Algorithm for Signal Control of Single Intersections

    Institute of Scientific and Technical Information of China (English)

    LI Jinyuan; PAN Xin; WANG Xiqin

    2007-01-01

    State-space equations were applied to formulate the queuing and delay of traffic at a single intersection in this paper. The signal control of a single intersection was then modeled as a discrete-time optimal control problem, with consideration of the constraints of stream conflicts, saturation flow rate, minimum green time, and maximum green time. The problem cannot be solved directly due to the nonlinear constraints. However, the results of qualitative analysis were used to develop a first-phase signal control algorithm. Simulation results show that the algorithm substantially reduces the total delay compared to fixed-time control.
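The kind of discrete-time queue dynamics described above can be sketched as follows. This is an illustrative toy model, not the paper's formulation: the arrival rates, saturation flow, and greedy "serve the longest queue" phase rule are all assumptions made for the example.

```python
# Toy discrete-time state-space queue model for a two-stream intersection.
# q[k+1] = max(0, q[k] + a[k] - s * g[k]), with g[k] = 1 for the green phase.
# All numbers are illustrative, not taken from the paper.
def step(queues, arrivals, green_phase, saturation=4):
    """Advance the queue state vector by one control interval."""
    return [max(0, q + a - (saturation if i == green_phase else 0))
            for i, (q, a) in enumerate(zip(queues, arrivals))]

def first_phase_control(queues):
    """Greedy stand-in for a first-phase rule: green to the longest queue."""
    return max(range(len(queues)), key=lambda i: queues[i])

queues = [6, 2]                     # two conflicting traffic streams
for _ in range(5):
    phase = first_phase_control(queues)
    queues = step(queues, (1, 1), phase)
print(queues)                       # queues drain toward [1, 0]
```

Because only one conflicting stream can hold green at a time, the controller trades off which queue (and hence whose delay) grows during each interval, which is exactly the structure the optimal control formulation captures.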

  17. Indication to Open Anatrophic Nephrolithotomy in the Twenty-First Century: A Case Report

    Directory of Open Access Journals (Sweden)

    Alfredo Maria Bove

    2012-01-01

    Introduction. Advances in endourology have greatly reduced indications for open surgery in the treatment of staghorn kidney stones. Nevertheless, in our experience, open surgery still represents the treatment of choice in rare cases. Case Report. A 71-year-old morbidly obese female patient, complaining of occasional left flank pain and recurrent cystitis for many years, presented with bilateral staghorn kidney stones. Comorbidities were obesity (BMI 36.2), hypertension, type II diabetes, chronic obstructive pulmonary disease (COPD), and hyperlipidemia. Due to these comorbidities, endoscopic and laparoscopic approaches were not indicated. We offered the patient staged open anatrophic nephrolithotomy. Results. Operative time was 180 minutes. Blood loss was 500 cc, requiring one unit of packed red blood cells. Hospital stay was 7 days. The renal function was unaffected based on preoperative and postoperative serum creatinine levels. Stone-free status of the left kidney was confirmed after surgery with CT scan. Conclusions. Open surgery can represent a valid alternative in the treatment of staghorn kidney stones in very selected cases. A discussion of the current indications in the twenty-first century is presented.

  18. Three-dimensional space changes after premature loss of a maxillary primary first molar.

    Science.gov (United States)

    Park, Kitae; Jung, Da-Woon; Kim, Ji-Yeon

    2009-11-01

    A space maintainer is generally preferred when a primary first molar is lost before or during active eruption of the first permanent molars in order to prevent space loss. However, controversy prevails regarding the space loss after eruption of the permanent first molars. The purpose of this study was to examine spatial changes subsequent to premature loss of a maxillary primary first molar after the eruption of the permanent first molars. Thirteen children, five girls and eight boys, expecting premature extraction of a maxillary primary first molar because of caries and/or failed pulp therapy, were selected. Spatial changes were investigated using a three-dimensional laser scanner by comparing the primary molar space, arch width, arch length, and arch perimeter before and after the extraction of a maxillary primary first molar. Also, the inclination and angulation changes in the maxillary primary canines, primary second molars, and permanent first molars adjacent to the extraction site were investigated before and after the extraction of the maxillary primary first molar in order to examine the source of space loss. There was no statistically significant space loss on the extraction side compared to the control side (P = 0.33). No consistent findings were seen on the inclination and angulation changes on the extraction side. The premature loss of a maxillary primary first molar, in cases with class I molar relationship, has limited influence on the space in permanent dentition.

  19. The restructuring of the Argentine Navy between the end of the twentieth century and the early twenty-first.

    Directory of Open Access Journals (Sweden)

    Germán Soprano

    2017-07-01

    The definition of a policy of national defense and internal security in democracy created the conditions to advance the process of restructuring the Argentine Navy, introducing changes in its organization and functions. In this article we analyze this process by focusing, on the one hand, on the relationship between the definitions of defense policy and the configuration of the naval military instrument between the end of the twentieth century and the early twenty-first century; and, on the other hand, on understanding its development in the case of two components of the force: the marine corps and the maritime patrol division.

  20. Simulated Space Environment Effects on a Candidate Solar Sail Material

    Science.gov (United States)

    Kang, Jin Ho; Bryant, Robert G.; Wilkie, W. Keats; Wadsworth, Heather M.; Craven, Paul D.; Nehls, Mary K.; Vaughn, Jason A.

    2017-01-01

    For long duration missions of solar sails, the sail material needs to survive harsh space environments, and the degradation of the sail material controls operational lifetime. Therefore, understanding the effects of the space environment on the sail membrane is essential for mission success. In this study, we investigated the effects of simulated space environment exposure (ionizing radiation and thermal aging) and simulated potential damage on the mechanical, thermal and optical properties of a commercial off-the-shelf (COTS) polyester solar sail membrane to assess the degradation mechanisms of a feasible solar sail. The solar sail membrane was exposed to high energy electrons (about 70 keV and 10 nA/cm2), and the physical properties were characterized. After a dose of about 8.3 Grad, the tensile modulus, tensile strength and failure strain of the sail membrane decreased by about 20-95%. The aluminum reflective layer was damaged and partially delaminated, but it did not show any significant change in solar absorptance or thermal emittance. The effect on mechanical properties of a pre-cracked sample, simulating potential impact damage of the sail membrane, as well as thermal aging effects on metallized PEN (polyethylene naphthalate) film, will be discussed.

  1. Taking Up Space: Museum Exploration in the Twenty-First Century

    Science.gov (United States)

    Sutton, Tiffany

    2007-01-01

    Museums have become a crucible for questions of the role that traditional art and art history should play in contemporary art. Friedrich Nietzsche argued in the nineteenth century that museums can be no more than mausoleums for effete (fine) art. Over the course of the twentieth century, however, curators dispelled such blanket pessimism by…

  2. Simulation of space charge effects and transition crossing in the Fermilab Booster

    International Nuclear Information System (INIS)

    Lucas, P.; MacLachlan, J.

    1987-03-01

    The longitudinal phase space program ESME, modified for space charge and wall impedance effects, has been used to simulate transition crossing in the Fermilab Booster. The simulations yield results in reasonable quantitative agreement with measured parameters. They further indicate that a transition jump scheme currently under construction will significantly reduce emittance growth, while attempts to alter machine impedance are less obviously beneficial. In addition to presenting results, this paper points out a serious difficulty, related to statistical fluctuations, in the space charge calculation. False indications of emittance growth can appear if care is not taken to minimize this problem

  3. Twenty-First-Century Kids, Twenty-First-Century Librarians

    Science.gov (United States)

    Walter, Virginia A.

    2010-01-01

    Inspired by a new generation of librarians and children, Walter reconsiders the legacy passed on by the matriarchs of children's services and examines more recent trends and challenges growing out of changes in educational philosophy and information technology. This thoroughly researched book includes the current issues and trends of: (1)…

  4. The politics of space mining - An account of a simulation game

    Science.gov (United States)

    Paikowsky, Deganit; Tzezana, Roey

    2018-01-01

    Celestial bodies like the Moon and asteroids contain materials and precious metals which are valuable for human activity on Earth and beyond. Space mining has mainly been relegated to the realm of science fiction and has not been treated seriously by the international community. Private industry is beginning to organize towards space mining, and success on this front would have a major impact on all nations. We present in this paper a review of current space mining ventures and the international legislation which could stand in their way - or aid them in their mission. Following that, we present the results of a role-playing simulation in which the roles of several important nations were played by students of international relations. The results of the simulation are used as a basis for forecasting the potential initial responses of the nations of the world to a successful space mining operation in the future.

  5. Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models

    Science.gov (United States)

    Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas

    2017-02-01

    A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge on to the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally
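    The nested, shrinking data-approximating regions at the heart of ABC-SubSim can be illustrated, in greatly simplified form, by a sequential ABC rejection sampler whose tolerance decreases level by level. The sketch below is not the ABC-SubSim algorithm itself (it replaces the component-wise Metropolis step with crude jittering of surviving samples), and the one-parameter model, tolerance schedule, and sample sizes are all invented for illustration:

    ```python
    import random
    import statistics

    random.seed(1)

    # Hypothetical model: y = theta + Gaussian noise; "observed" data use theta_true = 2.0
    theta_true = 2.0
    y_obs = [theta_true + random.gauss(0, 0.5) for _ in range(50)]

    def simulate(theta):
        return [theta + random.gauss(0, 0.5) for _ in range(50)]

    def distance(y_sim, y_ref):
        # Distance between simulated and observed outputs (here: |difference of means|)
        return abs(statistics.mean(y_sim) - statistics.mean(y_ref))

    # Nested decreasing sequence of data-approximating regions (tolerances)
    tolerances = [1.0, 0.5, 0.2, 0.1]
    samples = [random.uniform(-5, 5) for _ in range(2000)]  # draws from a flat prior

    for eps in tolerances:
        # Keep only parameter values whose predicted output falls in the current region
        samples = [th for th in samples if distance(simulate(th), y_obs) <= eps]
        # Replenish by jittering survivors (a crude stand-in for the MCMC step)
        while len(samples) < 500:
            th = random.choice(samples) + random.gauss(0, 0.2)
            if distance(simulate(th), y_obs) <= eps:
                samples.append(th)

    posterior_mean = statistics.mean(samples)
    print(round(posterior_mean, 1))  # close to theta_true = 2.0
    ```

    Each pass plays the role of one ABC-SubSim level: the tolerance defines the data-approximating region, and only parameter values whose predictions land inside it survive to seed the next, tighter level.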

  6. Operation and evaluation of the Terminal Configured Vehicle Mission Simulator in an automated terminal area metering and spacing ATC environment

    Science.gov (United States)

    Houck, J. A.

    1980-01-01

    This paper describes the work being done at the National Aeronautics and Space Administration's Langley Research Center on the development of a mission simulator for use in the Terminal Configured Vehicle Program. A brief description of the goals and objectives of the Terminal Configured Vehicle Program is presented. A more detailed description of the Mission Simulator, in its present configuration, and its components is provided. Finally, a description of the first research study conducted in the Mission Simulator is presented along with a discussion of some preliminary results from this study.

  7. Changes in seasonal and diurnal precipitation types during summer over South Korea in the late twenty-first century (2081-2100) projected by the RegCM4.0 based on four RCP scenarios

    Science.gov (United States)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2018-01-01

    Changes in seasonal and diurnal precipitation types over South Korea during summer in the late twenty-first century (2081-2100) were projected under four RCP scenarios using the Regional Climate Model (RegCM4.0) with a horizontal resolution of 12.5 km. Two boundary conditions, ERA-Interim and HadGEM2-AO, were used to drive the RegCM4.0 (jointly named RG4_ERA and RG4_HG2, respectively). In general, the RegCM4.0 reproduces the spatial distribution of summer precipitation over Northeast Asia for the current climate (1989-2008) reasonably well. The RG4_HG2 shows larger dry biases over South Korea, when compared with observations, than does the RG4_ERA. These strong dry biases result from the underestimation of convective precipitation (CPR) and are particularly noticeable in late afternoons during July and August; this is related to the performance of HadGEM2-AO, which simulated weak southwesterly winds at those times. However, interestingly, the RG4_HG2 simulates similar increases in the contribution of CPR to total precipitation after mid-July, resulting in comparable performance in the reproduction of heavy precipitation. In the late twenty-first century, a significant increase (decrease) in CPR (NCPR) is generally projected over South Korea, particularly under RCP8.5. During June, the total precipitation is affected primarily by changes in NCPR under RCP2.6 and RCP6.0. After mid-July, increasing total precipitation is primarily caused by the distinct increases in CPR in the late afternoons; this pattern is particularly noticeable under RCP8.5 and is associated with more destabilized atmospheric conditions during July and August. Light and heavy precipitation are projected to decrease and increase, respectively, under RCP8.5.

  8. Drone Warfare: Twenty-First Century Empire and Communications

    Directory of Open Access Journals (Sweden)

    Kevin Howley

    2017-02-01

    Full Text Available This paper, part of a larger project that examines drones from a social-construction of technology perspective, considers drone warfare in light of Harold Innis’s seminal work on empire and communication. Leveraging leading-edge aeronautics with advanced optics, data processing, and networked communication, drones represent an archetypal “space-biased” technology. Indeed, by allowing remote operators and others to monitor, select, and strike targets from half a world away, and in real-time, these weapon systems epitomize the “pernicious neglect of time” Innis sought to identify and remedy in his later writing. With Innis’s time-space dialectic as a starting point, then, the paper considers drones in light of a longstanding paradox of American culture: the impulse to collapse the geographical distance between the United States and other parts of the globe, while simultaneously magnifying the cultural difference between Americans and other peoples and societies. In the midst of the worldwide proliferation of drones, this quintessentially sublime technology embodies this (dis)connect in important, profound, and ominous ways.

  9. First-principles simulations of heat transport

    Science.gov (United States)

    Puligheddu, Marcello; Gygi, Francois; Galli, Giulia

    2017-11-01

    Advances in understanding heat transport in solids were recently reported by both experiment and theory. However an efficient and predictive quantum simulation framework to investigate thermal properties of solids, with the same complexity as classical simulations, has not yet been developed. Here we present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics at close to equilibrium conditions, which only requires calculations of first-principles trajectories and atomic forces, thus avoiding direct computation of heat currents and energy densities. In addition the method requires much shorter sequential simulation times than ordinary molecular dynamics techniques, making it applicable within density functional theory. We discuss results for a representative oxide, MgO, at different temperatures and for ordered and nanostructured morphologies, showing the performance of the method in different conditions.

  10. Space-based infrared sensors of space target imaging effect analysis

    Science.gov (United States)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    The target identification problem is one of the core problems of ballistic missile defense systems, and infrared imaging simulation is an important means of studying target detection and recognition. This paper first establishes a space-based infrared-sensor imaging model of a ballistic target as a point source above the planet's atmosphere; it then simulates the infrared imaging of the ballistic target from two aspects, the space-based sensor's camera parameters and the target's characteristics, and analyzes the imaging effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target.

  11. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a magnetically clean spacecraft cost-effectively. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation and analysis activities, used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  12. Development of automation and robotics for space via computer graphic simulation methods

    Science.gov (United States)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
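    As a toy illustration of the kinematics and inverse kinematics such a simulator must generate, consider a planar two-link arm. This is the generic textbook formulation, not the simulator's actual model; the link lengths and target point below are arbitrary:

    ```python
    import math

    def forward(theta1, theta2, l1=1.0, l2=1.0):
        """Forward kinematics: joint angles -> end-effector position."""
        x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
        y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
        return x, y

    def inverse(x, y, l1=1.0, l2=1.0):
        """Inverse kinematics (one of two solutions) via the law of cosines."""
        d2 = x * x + y * y
        c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
        theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
        theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                               l1 + l2 * math.cos(theta2))
        return theta1, theta2

    # Round-trip check: IK followed by FK should recover the target point
    target = (1.2, 0.7)
    t1, t2 = inverse(*target)
    print(forward(t1, t2))  # approximately (1.2, 0.7)
    ```

    A real simulator solves the same problem for six or more joints, usually numerically via the Jacobian, but the round-trip check above remains the basic sanity test.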

  13. First Principle simulations of electrochemical interfaces - a DFT study

    DEFF Research Database (Denmark)

    Ahmed, Rizwan

    In this thesis, I have looked beyond the computational hydrogen electrode (CHE) model and focused on first-principles simulations which treat the electrode-electrolyte interface explicitly. Since obtaining a realistic electrode-electrolyte interface was difficult, I aimed to address various challenges regarding first-principles electrochemical interface modeling in order to bridge the gap between the model interface used in simulations and the real catalyst at operating conditions. Atomic-scale insight into the processes and reactions that occur at the electrochemical interface presents a challenge… I have also contributed to the model which accounts for pH in first-principles electrode-electrolyte interface simulations. This is an important step forward, since the electrochemical reaction rate and barrier for charge transfer can… for the whole system to qualify as a proper electrochemical interface.

  14. Simulated Space Environmental Effects on Thin Film Solar Array Components

    Science.gov (United States)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon

    2017-01-01

    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low-mass, low-volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests was conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design," or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials was exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials was exposed to 1 MeV electrons. A third set of samples was exposed to 50, 100, 500, and 700 keV protons, and a fourth set was exposed to >2,000 hours of near-ultraviolet (NUV) radiation. A final set was rapidly thermally cycled between -55 and +125 °C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.

  15. Educating for the Twenty-First Century

    Science.gov (United States)

    Ramaley, Judith A.

    2013-01-01

    In his first inaugural speech, President Obama declared that "our schools fail too many" and an essential component of laying "a new foundation for growth" will be "to transform our schools and colleges and universities to meet the demands of a new age." Concerns about our nation's position in the global education race have led to a focus on…

  16. Integrated visualization of simulation results and experimental devices in virtual-reality space

    International Nuclear Information System (INIS)

    Ohtani, Hiroaki; Ishiguro, Seiji; Shohji, Mamoru; Kageyama, Akira; Tamura, Yuichi

    2011-01-01

    We succeeded in integrating the visualization of both simulation results and experimental device data in virtual-reality (VR) space using a CAVE system. Simulation results are shown using Virtual LHD software, which can show magnetic field lines, particle trajectories, and isosurfaces of plasma pressure of the Large Helical Device (LHD) based on data from a magnetohydrodynamics equilibrium simulation. A three-dimensional mouse, or wand, interactively determines the initial position and pitch angle of a drift particle or the starting point of a magnetic field line in the VR space. The trajectory of a particle and the stream-line of the magnetic field are calculated using the Runge-Kutta-Huta integration method once the initial condition has been specified. The LHD vessel is objectively visualized based on CAD data. Using these results and data, the simulated LHD plasma can be drawn interactively within the objective description of the LHD experimental vessel. Through this integrated visualization, it is possible to grasp the three-dimensional positional relationship between the device and the plasma in the VR space, opening a new path for future research. (author)

  17. Optimizing grade-control drillhole spacing with conditional simulations

    Directory of Open Access Journals (Sweden)

    Adrian Martínez-Vargas

    2017-01-01

    Full Text Available This paper summarizes a method to determine the optimum spacing of grade-control drillholes drilled with reverse circulation. The optimum drillhole spacing was defined as the spacing whose cost equals the cost of misclassifying ore and waste in selective mining units (SMUs). The cost of misclassification for a given drillhole spacing is equal to the cost of processing waste misclassified as ore (Type I error) plus the value of the ore misclassified as waste (Type II error). Type I and Type II errors were deduced by comparing true and estimated grades at SMUs, in relation to a cutoff grade value and assuming free ore selection. True grades at SMUs and grades at drillhole samples were generated with conditional simulations. A set of estimated grades at each SMU, one per drillhole spacing, was generated with ordinary kriging. This method was used to determine the optimum drillhole spacing in a gold deposit. The results showed that the cost of misclassification is sensitive to extreme block values, which tend to be overrepresented. Capping lost SMU values and implementing diggability constraints were recommended to improve calculations of total misclassification costs.
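    The cost balance described above can be sketched as follows. All grades, unit costs, and the spacing-to-estimation-error mapping are invented for illustration, and simple random noise stands in for the conditional-simulation and kriging steps of the actual method:

    ```python
    import random

    random.seed(7)

    CUTOFF = 1.0          # g/t, hypothetical cutoff grade
    COST_TYPE_I = 20.0    # $ per misclassified SMU: waste processed as ore
    COST_TYPE_II = 35.0   # $ per misclassified SMU: ore value lost as waste

    def misclassification_cost(estimation_error_sd, n_smu=10000):
        """Total misclassification cost for one drillhole spacing.

        Wider spacings give noisier estimates, modeled here as a larger error
        standard deviation around the (simulated) true SMU grade.
        """
        cost = 0.0
        for _ in range(n_smu):
            true_grade = random.lognormvariate(0, 0.5)     # stand-in for conditional simulation
            est_grade = true_grade + random.gauss(0, estimation_error_sd)  # stand-in for kriging
            if est_grade >= CUTOFF and true_grade < CUTOFF:
                cost += COST_TYPE_I    # Type I error
            elif est_grade < CUTOFF and true_grade >= CUTOFF:
                cost += COST_TYPE_II   # Type II error
        return cost

    # Hypothetical spacing (m) -> drilling cost and estimation error;
    # the optimum spacing minimizes drilling plus misclassification cost.
    drilling_cost = {10: 50000.0, 20: 25000.0, 40: 12500.0}
    error_sd = {10: 0.10, 20: 0.25, 40: 0.45}
    for spacing in sorted(drilling_cost):
        total = drilling_cost[spacing] + misclassification_cost(error_sd[spacing])
        print(spacing, round(total))
    ```

    The trade-off is visible directly: tighter spacing costs more to drill but misclassifies fewer SMUs, and the optimum sits where the two costs balance.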

  18. Multicultural Ground Teams in Space Programs

    Science.gov (United States)

    Maier, M.

    2012-01-01

    In the early years of space flight only two countries had access to space. In the last twenty years, there have been major changes in how we conduct space business. With the fall of the iron curtain and the growth of the European Union, more and more players were able to join the space business and space science. By the end of the last century, numerous countries, agencies and companies had earned the right to be equal partners in space projects. This paper investigates the impact of multicultural teams in the space arena. Fortunately, in manned spaceflight, especially for long-duration missions, there are several studies and simulations reporting on multicultural team impact. These data have not been as well explored for the team interactions within the ground crews. The focus of this paper is the teams working on the ISS project. Hypotheses will be drawn from the results of space crew research to determine parallels and differences for this vital segment of success in space missions. The key source of data will be structured interviews with managers and other ground crews on the ISS project.

  19. The General-Use Nodal Network Solver (GUNNS) Modeling Package for Space Vehicle Flow System Simulation

    Science.gov (United States)

    Harvey, Jason; Moore, Michael

    2013-01-01

    The General-Use Nodal Network Solver (GUNNS) is a modeling software package that combines nodal analysis and the hydraulic-electric analogy to simulate fluid, electrical, and thermal flow systems. GUNNS is developed by L-3 Communications under the TS21 (Training Systems for the 21st Century) project for NASA Johnson Space Center (JSC), primarily for use in space vehicle training simulators at JSC. It has sufficient compactness and fidelity to model the fluid, electrical, and thermal aspects of space vehicles in real-time simulations running on commodity workstations, for vehicle crew and flight controller training. It has a reusable and flexible component and system design, and a Graphical User Interface (GUI), providing capability for rapid GUI-based simulator development, ease of maintenance, and associated cost savings. GUNNS is optimized for NASA's Trick simulation environment, but can be run independently of Trick.
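    The core of any such nodal-network solver is assembling a conductance matrix from the links and solving for the unknown node potentials (pressures or voltages, by the hydraulic-electric analogy). Below is a minimal sketch with a made-up three-link series network; it bears no relation to GUNNS's actual classes or data formats:

    ```python
    # Minimal nodal-analysis sketch (not GUNNS itself): each link carries flow
    # proportional to the potential difference across it, and we solve
    # G * p = s for the unknown node potentials.

    def solve_linear(a, b):
        """Gaussian elimination with partial pivoting (small dense systems)."""
        n = len(b)
        m = [row[:] + [b[i]] for i, row in enumerate(a)]
        for col in range(n):
            pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
            m[col], m[pivot] = m[pivot], m[col]
            for r in range(col + 1, n):
                f = m[r][col] / m[col][col]
                for c in range(col, n + 1):
                    m[r][c] -= f * m[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
        return x

    # Network: node 0 held at a known potential, nodes 1-2 unknown, -1 = ground.
    links = [(0, 1, 2.0), (1, 2, 1.0), (2, -1, 0.5)]  # (node_a, node_b, conductance)
    p0 = 100.0  # boundary potential at node 0 (e.g. supply pressure)

    # Assemble the conductance matrix for the unknown nodes
    unknowns = [1, 2]
    idx = {n: i for i, n in enumerate(unknowns)}
    G = [[0.0] * len(unknowns) for _ in unknowns]
    s = [0.0] * len(unknowns)
    for a, b, g in links:
        for u, v in ((a, b), (b, a)):
            if u in idx:
                G[idx[u]][idx[u]] += g
                if v in idx:
                    G[idx[u]][idx[v]] -= g
                elif v == 0:
                    s[idx[u]] += g * p0  # known-potential neighbor moves to the RHS
    p = solve_linear(G, s)
    print([round(v, 2) for v in p])  # [85.71, 57.14]
    ```

    The printed values match the hand calculation for a series chain (total resistance 3.5, flow 100/3.5), which is the usual sanity check for such a solver.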

  20. Catastrophic Disruption of Asteroids: First Simulations with Explicit Formation of Spinning Rigid and Semi-rigid Aggregates

    Science.gov (United States)

    Michel, Patrick; Richardson, D. C.

    2007-10-01

    We have made major improvements in simulations of asteroid disruption by computing explicitly the formation of aggregates during the gravitational reaccumulation of small fragments, allowing us to obtain information on their spin and shape. First results will be presented taking as examples asteroid families that we reproduced successfully with previous, less sophisticated simulations. In recent years, we have successfully simulated the formation of asteroid families using an SPH hydrocode to compute the fragmentation following the impact of a projectile on the parent body, and the N-body code pkdgrav to compute the mutual interactions of the fragments. We found that fragments generated by the disruption of a km-size asteroid can have large enough masses to be attracted by each other during their ejection. Consequently, many reaccumulations take place. Eventually most large fragments correspond to gravitational aggregates formed by reaccumulation of smaller ones. Moreover, formation of satellites occurs around the largest and other big remnants. In these previous simulations, when fragments reaccumulate, they merge into a single sphere whose mass is the sum of their masses. Thus, no information is obtained on the actual shape of the aggregates, their spin, ... For the first time, we have now simulated the disruption of a family parent body by computing explicitly the formation of aggregates, along with the above-mentioned properties. Once formed, these aggregates can interact and/or collide with each other and break up during their evolution. We will present these first simulations and their possible implications for the properties of asteroids generated by disruption. Results can, for instance, be compared with the data on the asteroid Itokawa provided by the Japanese space mission Hayabusa; Itokawa is now understood to be a reaccumulated fragment from a larger parent body. Acknowledgments: PM and DCR acknowledge support from the French Programme National de Planétologie and grants

  1. First Space VLBI Observations and Images Using the VLBA and VSOP

    Science.gov (United States)

    Romney, J. D.; Benson, J. M.; Claussen, M. J.; Desai, K. M.; Flatters, C.; Mioduszewski, A. J.; Ulvestad, J. S.

    1997-12-01

    The National Radio Astronomy Observatory (NRAO) is a participant in the VSOP Space VLBI mission, an international collaboration led by Japan's Institute of Space and Astronautical Science. NRAO has committed up to 30% of scheduled observing time on the Very Long Baseline Array (VLBA), and corresponding correlation resources, to Space VLBI observations. The NRAO Space VLBI Project, funded by NASA, has been working for several years to complete the necessary enhancements to the VLBA correlator and the AIPS image processing system. These developments were completed by the time of the successful launch of the VSOP mission's Halca spacecraft on 1997 February 12. As part of the in-orbit checkout phase, the first Space VLBI fringes from a VLBA observation were detected on 1997 June 12, and the VSOP mission's first images, in both the 1.6- and 5-GHz bands, were obtained shortly thereafter. In-orbit test observations continued through early September, with the first General Observing Time (GOT) scientific observations beginning in July. Through mid-October, a total of 20 Space VLBI observations, comprising 190 hours, had been completed at the VLBA correlator. This paper reviews the unique features of correlation and imaging of Space VLBI observations. These include, for correlation, the ephemeris for an orbiting VLBI ``station'' which is not fixed on the surface of the earth, and the requirement to close the loop on the phase-transfer process from a frequency standard on the ground to the spacecraft. Images from a number of early tests and scientific observations are presented. NRAO's user-support program, providing expert assistance in data analysis to Space VLBI observers, is also described.

  2. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    Science.gov (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with ``very extreme'' constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field at the ground to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  3. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON

    International Nuclear Information System (INIS)

    BEEBE - WANG, J.; LUCCIO, A.U.; D IMPERIO, N.; MACHIDA, S.

    2002-01-01

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problem, the most effective way of investigating its effect is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.

  4. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON.

    Energy Technology Data Exchange (ETDEWEB)

    BEEBE - WANG,J.; LUCCIO,A.U.; D IMPERIO,N.; MACHIDA,S.

    2002-06-03

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problem, the most effective way of investigating its effect is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.
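    To give a flavor of what these space-charge modules compute, here is a deliberately minimal 1D periodic particle-in-cell step: deposit macro-particle charge on a grid, integrate for the field, and interpolate it back to the particles. Units are scaled, the Gaussian bunch is hypothetical, and real codes like ORBIT use 2D/3D solvers with proper boundary conditions:

    ```python
    import random

    random.seed(3)

    N_PART, N_GRID, LENGTH = 10000, 64, 1.0
    dx = LENGTH / N_GRID
    x = [random.gauss(0.5, 0.08) % LENGTH for _ in range(N_PART)]  # a bunched beam

    # 1) Charge deposition: cloud-in-cell (linear) weighting onto the grid
    rho = [0.0] * N_GRID
    for xi in x:
        s = xi / dx
        j = int(s) % N_GRID
        w = s - int(s)
        rho[j] += (1.0 - w) / dx
        rho[(j + 1) % N_GRID] += w / dx

    # 2) Field solve: dE/dx = rho - <rho> in scaled units, periodic domain
    mean_rho = sum(rho) / N_GRID
    E = [0.0] * N_GRID
    for j in range(1, N_GRID):
        E[j] = E[j - 1] + (rho[j - 1] - mean_rho) * dx
    mean_E = sum(E) / N_GRID
    E = [e - mean_E for e in E]  # enforce zero net field around the ring

    # 3) Kick: interpolate E back to each particle with the same CIC weights
    def field_at(xi):
        s = xi / dx
        j = int(s) % N_GRID
        w = s - int(s)
        return (1.0 - w) * E[j] + w * E[(j + 1) % N_GRID]

    # Space charge is defocusing: the field points away from the bunch center
    print(field_at(0.40) < 0.0 < field_at(0.60))
    ```

    The statistical-fluctuation issue the benchmark papers worry about is visible even here: with too few macro-particles, grid noise in rho feeds back through the kick and can masquerade as emittance growth.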

  5. Simulation for memory effect of Fick's first law†

    Indian Academy of Sciences (India)

    Administrator

    The memory effect of Fick's first law is confirmed by means of 3D Monte Carlo simulation, where τ is the relaxation time, J is the flux of the diffusing particles, and D is … Keywords: Fick's first law; memory effects; relaxation time; second sound; Monte Carlo simulation.
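    In the classical, memoryless limit that the paper's relaxation-time correction extends, Fick's law follows from random-walk statistics: an unbiased walk with unit steps has diffusion coefficient D = 1/2 in lattice units, recoverable from the Einstein relation ⟨x²⟩ = 2Dt. A quick Monte Carlo sketch (walker count and step count arbitrary):

    ```python
    import random

    random.seed(5)

    # Unbiased 1D random walk: one +/-1 step per unit time, so D = 1/2 in lattice units
    N_WALKERS, N_STEPS = 20000, 100
    positions = [0] * N_WALKERS
    for _ in range(N_STEPS):
        positions = [x + random.choice((-1, 1)) for x in positions]

    msd = sum(x * x for x in positions) / N_WALKERS  # mean squared displacement
    D_est = msd / (2 * N_STEPS)                      # Einstein relation: <x^2> = 2 D t
    print(round(D_est, 2))  # close to 0.50
    ```

    The memory effect studied in the paper appears only on time scales comparable to τ, where the flux lags the gradient; on the long time scales sampled here the classical estimate is recovered.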

  6. Analysis of the Thermo-Elastic Response of Space Reflectors to Simulated Space Environment

    Science.gov (United States)

    Allegri, G.; Ivagnes, M. M.; Marchetti, M.; Poscente, F.

    2002-01-01

    The evaluation of space environment effects on materials and structures is a key matter in developing a proper design for long-duration missions: since a large part of the satellites operating in the earth orbital environment are employed for telecommunications, the development of space antennas and reflectors featuring high dimensional stability under space environment interactions represents a major challenge for designers. The structural layout of state-of-the-art space antennas and reflectors is very complex, since several different sensitive elements and materials are employed: particular care must be taken in evaluating the actual geometrical configuration of reflectors operating in the space environment, since very limited distortions of the designed layout can severely degrade the quality of the signal, both received and transmitted, especially for antennas operating at high frequencies. The effects of thermal loads due to direct sunlight exposure and to earth and moon albedo can easily be taken into account employing the standard methods of structural analysis; on the other hand, thermal cycling and exposure to the vacuum environment produce a long-term damage accumulation which affects the whole structure. Typical effects of this exposure are the outgassing of polymeric materials and the contamination of the exposed surfaces, which can sensibly affect the thermo-mechanical properties of the materials themselves and, therefore, the global structural response. The main aim of the present paper is to evaluate the synergistic effects of thermal cycling and of exposure to a high-vacuum environment on an innovative antenna developed by Alenia Spazio S.p.a.: for this purpose, both experimental and numerical research activities have been developed. A complete prototype of the antenna has been exposed to the space environment simulated by the SAS facility: the latter consists of a high-vacuum chamber, equipped by

  7. Modeling and Simulation for Multi-Missions Space Exploration Vehicle

    Science.gov (United States)

    Chang, Max

    2011-01-01

    Asteroids and Near-Earth Objects [NEOs] are of great interest for future space missions. The Multi-Mission Space Exploration Vehicle [MMSEV] is being considered for future Near Earth Object missions and requires detailed planning and study of its Guidance, Navigation, and Control [GNC]. A possible mission of the MMSEV to a NEO would be to navigate the spacecraft to a stationary orbit with respect to the rotating asteroid and proceed to anchor into the surface of the asteroid with robotic arms. The Dynamics and Real-Time Simulation [DARTS] laboratory develops reusable models and simulations for the design and analysis of missions. In this paper, the development of guidance and anchoring models are presented together with their role in achieving mission objectives and relationships to other parts of the simulation. One important aspect of guidance is in developing methods to represent the evolution of kinematic frames related to the tasks to be achieved by the spacecraft and its robot arms. In this paper, we compare various types of mathematical interpolation methods for position and quaternion frames. Subsequent work will be on analyzing the spacecraft guidance system with different movements of the arms. With the analyzed data, the guidance system can be adjusted to minimize the errors in performing precision maneuvers.
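    One standard candidate for the quaternion-frame interpolation compared in such studies is spherical linear interpolation (slerp), which rotates at a constant angular rate between two attitude quaternions. The sketch below is a generic textbook implementation, not the DARTS code:

    ```python
    import math

    def slerp(q0, q1, t):
        """Spherical linear interpolation between unit quaternions q0 and q1."""
        dot = sum(a * b for a, b in zip(q0, q1))
        if dot < 0.0:                       # take the shorter arc
            q1, dot = [-c for c in q1], -dot
        dot = min(1.0, dot)
        theta = math.acos(dot)
        if theta < 1e-9:                    # nearly parallel: linear interpolation
            return tuple(a + t * (b - a) for a, b in zip(q0, q1))
        s0 = math.sin((1 - t) * theta) / math.sin(theta)
        s1 = math.sin(t * theta) / math.sin(theta)
        return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

    # Interpolate a 90-degree rotation about z: halfway should be 45 degrees
    q_id = (1.0, 0.0, 0.0, 0.0)                                   # (w, x, y, z)
    q_90z = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
    q_half = slerp(q_id, q_90z, 0.5)
    print(round(2 * math.degrees(math.acos(q_half[0]))))  # rotation angle: 45
    ```

    Unlike component-wise linear interpolation, slerp keeps the result on the unit sphere and the angular velocity constant, which is why it is the usual baseline when comparing interpolation methods for attitude frames.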

  8. The first metatarsal web space: its applied anatomy and usage in tracing the first dorsal metatarsal artery in thumb reconstruction.

    Science.gov (United States)

    Xu, Yong-Qing; Li, Jun; Zhong, Shi-Zhen; Xu, Da-Chuan; Xu, Xiao-Shan; Guo, Yuan-Fa; Wang, Xin-Min; Li, Zhu-Yi; Zhu, Yue-Liang

    2004-12-01

    To clarify the anatomical relationship of the structures in the first toe webbing space for better dissection of toes in thumb reconstruction. The first dorsal metatarsal artery, the first deep transverse metatarsal ligament and the extensor expansion were observed on 42 adult cadaveric lower extremities. Clinically the method of tracing the first dorsal metatarsal artery around the space of the extensor expansion was used in 36 cases of thumb reconstruction. The distal segments of the first dorsal metatarsal artery of Gilbert types I and II were located superficially to the extensor expansion. The harvesting time of a toe was shortened from 90 minutes to 50 minutes with 100% survival of reconstructed fingers. The distal segment of the first dorsal metatarsal artery lies constantly at the superficial layer of the extensor expansion. Most of the first metatarsal arteries of Gilbert types I and II can be easily located via the combined sequential and reverse dissection around the space of the extensor expansion.

  9. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    Science.gov (United States)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  10. High School Students' Perceptions of the Effects of International Science Olympiad on Their STEM Career Aspirations and Twenty-First Century Skill Development

    Science.gov (United States)

    Sahin, Alpaslan; Gulacar, Ozcan; Stuessy, Carol

    2015-12-01

    Social cognitive theory guided the design of a survey to investigate high school students' perceptions of factors affecting their career contemplations and beliefs regarding the influence of their participation in the international Science Olympiad on their subject interests and twenty-first century skills. In addition, gender differences in students' choice of competition category were studied. Mixed methods analysis of survey returns from 172 Olympiad participants from 31 countries showed that students' career aspirations were affected most by their teachers, personal interests, and parents, respectively. Students also indicated that they believed that their participation in the Olympiad reinforced their plan to choose a science, technology, engineering, and mathematics (STEM) major at college and assisted them in developing and improving their twenty-first century skills. Furthermore, female students' responses indicated that their project choices were less likely to be in the engineering category and more likely to be in the environment or energy categories. Findings are discussed in the light of increasing the awareness of the role and importance of Science Olympiads in STEM career choice and finding ways to attract more female students into engineering careers.

  11. Project Mercury: NASA's first manned space programme

    Science.gov (United States)

    Catchpole, John

Project Mercury offers a developmental account of the first American manned spaceflight programme and its associated infrastructure, including accounts of space launch vehicles. The book highlights the differences between Redstone and Atlas technology, drawing similar comparisons between ballistic capsules and alternative types of spacecraft. It also covers astronaut selection and training, as well as tracking systems, flight control, the basic principles of spaceflight, and detailed accounts of individual flights.

  12. The daylighting dashboard - A simulation-based design analysis for daylit spaces

    Energy Technology Data Exchange (ETDEWEB)

    Reinhart, Christoph F. [Harvard University, Graduate School of Design, 48 Quincy Street, Cambridge, MA 02138 (United States); Wienold, Jan [Fraunhofer Institute for Solar Energy Systems, Heidenhofstrasse 2, 79110 Freiburg (Germany)

    2011-02-15

This paper presents a vision of how state-of-the-art computer-based analysis techniques can be effectively used during the design of daylit spaces. Following a review of recent advances in dynamic daylight computation capabilities, climate-based daylighting metrics, occupant behavior and glare analysis, a fully integrated design analysis method is introduced that simultaneously considers annual daylight availability, visual comfort and energy use: Annual daylight glare probability profiles are combined with an occupant behavior model in order to determine annual shading profiles and visual comfort conditions throughout a space. The shading profiles are then used to calculate daylight autonomy plots, energy loads, operational energy costs and greenhouse gas emissions. The paper then shows how simulation results for a sidelit space can be visually presented to simulation non-experts using the concept of a daylighting dashboard. The paper ends with a discussion of how the daylighting dashboard could be practically implemented using technologies that are available today. (author)
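The daylight autonomy metric mentioned in the abstract reduces, in its simplest form, to the fraction of occupied hours in which daylight alone meets a target illuminance. A minimal sketch (the 300 lx threshold and the hourly input lists are illustrative assumptions, not values from the paper):

```python
def daylight_autonomy(illuminance, occupied, threshold=300.0):
    """Share of occupied hours in which daylight alone reaches the
    target illuminance (lux). The 300 lx default is a commonly used
    office target, chosen here for illustration."""
    occupied_hours = [e for e, occ in zip(illuminance, occupied) if occ]
    if not occupied_hours:
        return 0.0
    return sum(e >= threshold for e in occupied_hours) / len(occupied_hours)
```

In a real workflow the illuminance series would come from an annual climate-based simulation and the occupancy series from the behavior model discussed in the paper.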

  13. Designing Social Production Models to Support Producer-Consumer Collaboration and Innovation in Digital Social Spaces

    Science.gov (United States)

    Arakji, Reina Y.

    2009-01-01

    The first decade of the twenty-first century has seen dramatic advances in Internet technologies. Digital social spaces have emerged as popular Internet applications that are radically changing how firms and consumers of digital content interact. In the first chapter "Research Agenda" I introduce my research and the context within which it is…

  14. An IBM PC-based math model for space station solar array simulation

    Science.gov (United States)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
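The report describes a linear dc model driven by five parameters but the abstract does not give its exact equations; one plausible reading is a piecewise-linear I-V curve through the short-circuit point (0, Isc), the maximum-power point (Vmp, Imp), and the open-circuit point (Voc, 0). The sketch below makes that assumption explicit (orbit inclination, the fifth parameter, sets illumination and is omitted):

```python
def cell_current(v, isc, voc, vmp, imp):
    """Piecewise-linear dc I-V curve (assumed form, not the report's
    published equations): straight segments joining (0, isc),
    (vmp, imp) and (voc, 0)."""
    if v <= 0.0:
        return isc                           # short-circuit plateau
    if v >= voc:
        return 0.0                           # beyond open circuit
    if v <= vmp:                             # segment (0, isc) -> (vmp, imp)
        return isc + (imp - isc) * v / vmp
    return imp * (voc - v) / (voc - vmp)     # segment (vmp, imp) -> (voc, 0)
```

Array performance would then be obtained by scaling this cell curve by the series/parallel string counts of the IOC reference configuration.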

  15. Numerical simulation of a cabin ventilation subsystem in a space station oriented real-time system

    Directory of Open Access Journals (Sweden)

    Zezheng QIU

    2017-12-01

Full Text Available An environment control and life support system (ECLSS) is an important system in a space station. The ECLSS is a typical complex system, and real-time simulation technology can help to accelerate its research process by using a distributed hardware-in-the-loop simulation system. An implicit fixed-time-step numerical integration method is recommended for a real-time simulation system with time-varying parameters. However, its computational efficiency is too low to satisfy real-time data interaction, especially for the complex ECLSS running on a PC cluster. The instability problem of an explicit method strongly limits its application in ECLSS real-time simulation, although it has high computational efficiency. This paper proposes an improved numerical simulation method, based on the explicit Euler method, to overcome the instability problem. A temperature and humidity control subsystem (THCS) is first established, and its numerical stability is analyzed by using eigenvalue estimation theory. Furthermore, an adaptive operator is proposed to avoid the potential instability problem. The stability and accuracy of the proposed method are investigated carefully. Simulation results show that the proposed method provides a good way for some complex time-variant systems to run their real-time simulations on a PC cluster. Keywords: Numerical integration method, Real-time simulation, Stability, THCS, Time-variant system
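The stability constraint on the explicit Euler method can be illustrated with the linear test system x' = Ax: the method is stable only while |1 + h·λ| ≤ 1 for every eigenvalue λ of A. The paper's adaptive operator is more elaborate; the eigenvalue-based sub-stepping sketch below is our simplified stand-in:

```python
import numpy as np

def stable_explicit_euler(A, x0, dt, steps):
    """Integrate x' = A x with explicit Euler, sub-stepping each output
    interval dt so that the stability bound h < 2/|lambda|_max (valid
    for real, negative spectra) holds. Simplified stand-in for the
    paper's adaptive operator, for illustration only."""
    A = np.asarray(A, dtype=float)
    lam = np.linalg.eigvals(A)
    h_max = 2.0 / np.abs(lam.real).max()          # largest stable step
    n_sub = max(1, int(np.ceil(dt / h_max)) + 1)  # sub-steps per interval
    h = dt / n_sub
    x = np.array(x0, dtype=float)
    for _ in range(steps * n_sub):
        x = x + h * (A @ x)                       # one explicit Euler step
    return x
```

With a stiff eigenvalue such as λ = -100 and a frame step dt = 0.1, a single plain Euler step diverges, while the sub-stepped version decays as the true solution does.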

  16. Macrofilament simulation of high current beam transport

    International Nuclear Information System (INIS)

    Hayden, R.J.; Jakobson, M.J.

    1985-01-01

Macrofilament simulation of high current beam transport through a series of solenoids has been used to investigate the sensitivity of such calculations to the initial beam distribution and to the number of filaments used in the simulation. The transport line was tuned to approximately 105° phase advance per cell at zero current, with a tune depression of 65° due to the space charge. Input distributions with the filaments randomly uniform throughout a four-dimensional ellipsoid and K-V input distributions have been studied. The behavior of the emittance is similar to that published for quadrupoles with like tune depression. The emittance demonstrated little growth in the first twelve solenoids, a rapid rate of growth for the next twenty, and a subsequent slow rate of growth. A few hundred filaments were sufficient to show the character of the instability. The number of filaments utilized is an order of magnitude fewer than has been utilized previously for similar instabilities. The previously published curves for simulations with fewer than a thousand particles show a rather constant emittance growth. If the solenoid transport line magnetic field is increased a few percent, emittance growth curves are obtained not unlike those curves. Collision growth effects are less important than indicated in the previously published results for quadrupoles.
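The emittance growth tracked in such simulations is normally the RMS emittance of the macrofilament ensemble, ε = sqrt(⟨x²⟩⟨x′²⟩ − ⟨xx′⟩²), evaluated about the centroid. A minimal computation (our formulation; the paper does not give its diagnostic code):

```python
import numpy as np

def rms_emittance(x, xp):
    """RMS emittance sqrt(<x^2><x'^2> - <x x'>^2) of a macroparticle
    ensemble in one transverse plane, computed about the centroid."""
    x = np.asarray(x, dtype=float)
    xp = np.asarray(xp, dtype=float)
    x = x - x.mean()                  # remove centroid offset
    xp = xp - xp.mean()               # remove centroid divergence
    return np.sqrt(np.mean(x * x) * np.mean(xp * xp) - np.mean(x * xp) ** 2)
```

A perfectly linearly correlated beam (x' proportional to x) has zero RMS emittance; space-charge-driven filamentation decorrelates the ensemble and makes this quantity grow, which is the behavior the abstract describes along the solenoid channel.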

  17. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt the spatial resolution of a model.
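The non-spatial core that ML-Space extends is Gillespie's direct method: draw an exponentially distributed waiting time from the total propensity, then fire one reaction. A minimal single-reaction sketch (our illustration, not ML-Space code):

```python
import random

def gillespie_decay(n0, k, t_end, seed=0):
    """Gillespie direct method for the single reaction A -> 0 with
    propensity k*n: sample the waiting time to the next event from an
    exponential distribution with rate equal to the total propensity,
    then fire that event. Returns the (time, count) trajectory."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    history = [(0.0, n0)]
    while n > 0:
        tau = rng.expovariate(k * n)   # time to the next decay event
        if t + tau > t_end:
            break                      # no further event before t_end
        t += tau
        n -= 1                         # one molecule of A decays
        history.append((t, n))
    return history
```

With several reaction channels one would additionally pick which reaction fires in proportion to its propensity; ML-Space layers compartment dynamics and particle motion on top of this scheme.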

  18. Space charge and magnet error simulations for the SNS accumulator ring

    International Nuclear Information System (INIS)

    Beebe-Wang, J.; Fedotov, A.V.; Wei, J.; Machida, S.

    2000-01-01

    The effects of space charge forces and magnet errors in the beam of the Spallation Neutron Source (SNS) accumulator ring are investigated. In this paper, the focus is on the emittance growth and halo/tail formation in the beam due to space charge with and without magnet errors. The beam properties of different particle distributions resulting from various injection painting schemes are investigated. Different working points in the design of SNS accumulator ring lattice are compared. The simulations in close-to-resonance condition in the presence of space charge and magnet errors are presented. (author)

  19. PATH: a lumped-element beam-transport simulation program with space charge

    International Nuclear Information System (INIS)

    Farrell, J.A.

    1983-01-01

PATH is a group of computer programs for simulating charged-particle beam-transport systems. It was developed for evaluating the effects of some aberrations without a time-consuming integration of trajectories through the system. The beam-transport portion of PATH is derived from the well-known program DECAY TURTLE. PATH contains all features available in DECAY TURTLE (including the input format) plus additional features such as a more flexible random-ray generator, longitudinal phase space, some additional beamline elements, and space-charge routines. One of the programs also provides a simulation of an Alvarez linear accelerator. The programs, originally written for a CDC 7600 computer system, are also available on a VAX-VMS system. All of the programs are interactive, with input prompting for ease of use.

  20. Twenty lectures on thermodynamics

    CERN Document Server

    Buchdahl, H A

    2013-01-01

Twenty Lectures on Thermodynamics is a course of lectures, parts of which the author has given at various times over the last few years. The book gives the reader a bird's-eye view of phenomenological and statistical thermodynamics. It covers many areas in thermodynamics, such as states and transitions; adiabatic isolation; irreversibility; the zeroth, first, second and third laws of thermodynamics; entropy and the entropy law; applications of thermodynamics; pseudo-states; the quantum-statistical canonical and grand canonical ensembles; and semi-classical gaseous systems. The text

  1. Fast Poisson Solvers for Self-Consistent Beam-Beam and Space-Charge Field Computation in Multiparticle Tracking Simulations

    CERN Document Server

    Florio, Adrien; Pieloni, Tatiana; CERN. Geneva. ATS Department

    2015-01-01

We present two different approaches to solving the 2-dimensional electrostatic problem with open boundary conditions for use in fast tracking codes for beam-beam and space charge simulations in high energy accelerators. We compare a fast multipole method with a hybrid Poisson solver based on the fast Fourier transform and finite differences in polar coordinates. We show that the latter outperforms the former in terms of execution time and precision, allowing for a reduction of the noise in the tracking simulation. Furthermore, the new algorithm is shown to scale linearly on parallel architectures with shared memory. We conclude by effectively replacing the HFMM by the new Poisson solver in the COMBI code.
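The open-boundary requirement is what makes these solvers nontrivial: a plain FFT imposes periodicity, but zero padding to a doubled grid and convolving with the free-space Green's function (Hockney's trick) recovers the aperiodic, open-boundary convolution. A Cartesian 2-D sketch of that scheme (the regularized self term at r = 0 is our assumption, and this is not the polar-coordinate COMBI solver described in the report):

```python
import numpy as np

def free_space_poisson_2d(rho, h):
    """Open-boundary 2-D potential as a convolution of the charge
    density with the free-space Green's function -ln(r)/(2*pi),
    evaluated by FFT on a zero-padded, doubled grid (Hockney's trick)
    so no artificial periodicity is imposed. The finite value assigned
    at r = 0 is a regularization assumption."""
    n = rho.shape[0]
    k = np.arange(2 * n)
    idx = np.minimum(k, 2 * n - k)                # wrapped grid distances
    ii, jj = np.meshgrid(idx, idx, indexing="ij")
    r = h * np.hypot(ii, jj)
    g = np.where(r > 0.0,
                 -np.log(np.where(r > 0.0, r, 1.0)) / (2.0 * np.pi), 0.0)
    g[0, 0] = -np.log(h / 2.0) / (2.0 * np.pi)    # regularized self term
    rho_pad = np.zeros((2 * n, 2 * n))
    rho_pad[:n, :n] = rho                         # zero padding: open boundary
    phi = np.fft.ifft2(np.fft.fft2(g) * np.fft.fft2(rho_pad)).real * h * h
    return phi[:n, :n]
```

Because the padded circular convolution is mathematically identical to the direct aperiodic sum, the FFT result matches a brute-force double loop over all source cells to round-off accuracy, at O(N log N) instead of O(N²) cost.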

  2. THE DISPUTE BETWEEN POLITICAL THEOLOGY AND THE POLITICS OF THEOLOGY IN THE TWENTY-FIRST CENTURY ON THE MEANINGS OF THE POSTMODERN GLOBALIZING AND INDIVIDUALISTIC SOCIETY AND THE CHRISTIAN PERSONALIST GLOBALITY

    Directory of Open Access Journals (Sweden)

    Stelian MANOLACHE

    2016-05-01

Full Text Available Upon the dawn of postmodernity, in the twenty-first century, we witness the emergence of a new way of thinking and of new forms of culture and life, under the ideology of globalism, whose dominance is given by the practicality and utility related to civilization, and under globality, which is the cultural aspect of globalization, pertaining to the field of culture. The two dimensions of globalization and globality, civilizational and cultural, will (re)question the principle relationship between Christianity and the new postmodern globalizing utopia, requiring one to (re)consider the sense and presence of Christianity within the world, and the appropriate sociological figure of the Church, within the new reality of global and globalized humanity, in the postmodern public space. This paper deals with this ideology, globalism, and its cultural manifestation, globality, and with the Orthodox answer to the new challenge of individualism and postmodern globalizing (neo)collectivism.

  3. Role of collective effects in dominance of scattering off thermal ions over Langmuir wave decay: Analysis, simulations, and space applications

    International Nuclear Information System (INIS)

    Cairns, Iver H.

    2000-01-01

Langmuir waves driven to high levels by beam instabilities are subject to nonlinear processes, including the closely related processes of scattering off thermal ions (STI) and a decay process in which the ion response is organized into a product ion acoustic wave. Calculations of the nonlinear growth rates predict that the decay process should always dominate STI, creating two paradoxes. The first is that three independent computer simulation studies show STI proceeding, with no evidence for the decay at all. The second is that observations in space of type III solar radio bursts and Earth's foreshock, which the simulations were intended to model, show evidence for the decay proceeding but no evidence for STI. Resolutions to these paradoxes follow from the realization that a nonlinear process cannot proceed when its growth rate exceeds the minimum frequency of the participating waves, since the required collective response cannot be maintained and the waves cannot respond appropriately, and that a significant number of e-foldings and wave periods must be contained in the time available. It is shown that application of these ''collective'' and ''time scale'' constraints to the simulations explains why the decay does not proceed in them, as well as why STI proceeds in specific simulations. This appears to be the first demonstration that collective constraints are important in understanding nonlinear phenomena. Furthermore, applying these constraints to space observations, it is predicted that the decay should proceed (and dominate STI) in type III sources and the high beam speed regions of Earth's foreshock for a specific range of wave levels, with a possible role for STI alone at slightly higher wave levels. Deeper in the foreshock, for slower beams and weaker wave levels, the decay and STI are predicted to become ineffective. Suggestions are given for future testing of the collective constraint and an explanation for why waves in space are usually much weaker than
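The two constraints invoked in this abstract are easy to state quantitatively: the growth rate must stay below the minimum frequency of the participating waves (the collective constraint), and enough e-foldings must fit in the time available (the time-scale constraint). A sketch (the five-e-folding threshold is our illustrative choice, not a number from the paper):

```python
def decay_can_proceed(growth_rate, min_wave_freq, avail_time, min_efolds=5.0):
    """Check the 'collective' constraint (nonlinear growth rate below
    the minimum participating wave frequency) and the 'time scale'
    constraint (enough e-foldings fit in the available time). The
    default of five e-foldings is an illustrative threshold."""
    collective_ok = growth_rate < min_wave_freq
    timescale_ok = growth_rate * avail_time >= min_efolds
    return collective_ok and timescale_ok
```

Applied with the ion acoustic frequency as the minimum frequency, this kind of check reproduces the paper's qualitative argument for when the decay can and cannot proceed.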

  4. Role of collective effects in dominance of scattering off thermal ions over Langmuir wave decay: Analysis, simulations, and space applications

    Energy Technology Data Exchange (ETDEWEB)

    Cairns, Iver H.

    2000-12-01

    Langmuir waves driven to high levels by beam instabilities are subject to nonlinear processes, including the closely related processes of scattering off thermal ions (STI) and a decay process in which the ion response is organized into a product ion acoustic wave. Calculations of the nonlinear growth rates predict that the decay process should always dominate STI, creating two paradoxes. The first is that three independent computer simulation studies show STI proceeding, with no evidence for the decay at all. The second is that observations in space of type III solar radio bursts and Earth's foreshock, which the simulations were intended to model, show evidence for the decay proceeding but no evidence for STI. Resolutions to these paradoxes follow from the realization that a nonlinear process cannot proceed when its growth rate exceeds the minimum frequency of the participating waves, since the required collective response cannot be maintained and the waves cannot respond appropriately, and that a significant number of e-foldings and wave periods must be contained in the time available. It is shown that application of these ''collective'' and ''time scale'' constraints to the simulations explains why the decay does not proceed in them, as well as why STI proceeds in specific simulations. This appears to be the first demonstration that collective constraints are important in understanding nonlinear phenomena. Furthermore, applying these constraints to space observations, it is predicted that the decay should proceed (and dominate STI) in type III sources and the high beam speed regions of Earth's foreshock for a specific range of wave levels, with a possible role for STI alone at slightly higher wave levels. Deeper in the foreshock, for slower beams and weaker wave levels, the decay and STI are predicted to become ineffective. Suggestions are given for future testing of the collective constraint and an explanation

  5. Construction of the Hunveyor-Husar space probe model system for planetary science education and analog studies and simulations in universities and colleges of Hungary.

    Science.gov (United States)

    Bérczi, Sz.; Hegyi, S.; Hudoba, Gy.; Hargitai, H.; Kokiny, A.; Drommer, B.; Gucsik, A.; Pintér, A.; Kovács, Zs.

Several teachers and students had the possibility to visit the International Space Camp in the vicinity of MSFC NASA in Huntsville, Alabama, USA, where they learned of the success of simulators in space science education. To apply these results in universities and colleges in Hungary, we began a unified complex modelling in planetary geology, robotics, electronics and complex environmental analysis by constructing an experimental space probe model system. First a university experimental lander, HUNVEYOR (Hungarian UNiversity surVEYOR), and then a rover named HUSAR (Hungarian University Surface Analyser Rover) have been built. For Hunveyor the idea and example was the historical Surveyor program of NASA in the 1960s; for the Husar it was the Pathfinder's rover Sojourner. The first step was the construction of the lander; a year later the rover followed. The main goals are: (1) to build the lander structure and basic electronics from cheap everyday PC-compatible elements; (2) to construct basic experiments and their instruments; (3) to use the system as a space activity simulator; (4) this simulator contains a lander with an on-board computer for work on a test planetary surface and a terrestrial control computer; (5) to harmonize the assemblage of the electronic system and instruments at various levels of autonomy from the power and communication circuits; (6) to use the complex system in education for in situ understanding of complex planetary environmental problems; (7) to build various planetary environments for application of the

  6. University of Central Florida / Deep Space Industries Asteroid Regolith Simulants

    Science.gov (United States)

    Britt, Daniel; Covey, Steven D.; Schultz, Cody

    2017-10-01

Introduction: The University of Central Florida (UCF), in partnership with Deep Space Industries (DSI), is working under a NASA Phase 2 SBIR contract to develop and produce a family of asteroid regolith simulants for use in research, engineering, and mission operations testing. We base simulant formulas on the mineralogy, particle size, and physical characteristics of CI, CR, CM, C2, CV, and L-chondrite meteorites. The advantage in simulating meteorites is that the vast majority of meteoritic materials are common rock-forming minerals that are available in commercial quantities. While formulas are guided by the meteorites, our approach is one of constrained maximization under the limitations of safety, cost, source materials, and ease of handling. In all cases our goal is to deliver a safe, high fidelity analog at moderate cost. Source Materials, Safety, and Biohazards: A critical factor in any useful simulant is to minimize handling risks for biohazards or toxicity. All the terrestrial materials proposed for these simulants were reviewed for potential toxicity. Of particular interest is the organic component of volatile-rich carbonaceous chondrites, which contain polycyclic aromatic hydrocarbons (PAHs), some of which are known carcinogens and mutagens. Our research suggests that we can maintain rough chemical fidelity by substituting much safer sub-bituminous coal as our organic analog. A second safety consideration is the choice of serpentine group materials. While most serpentine polymorphs are quite safe, we avoid fibrous chrysotile because of its asbestos content. Terrestrial materials identified as inputs for our simulants are common rock-forming minerals that are available in commercial quantities. These include olivine, pyroxene, plagioclase feldspar, smectite, serpentine, saponite, pyrite, and magnetite in amounts that are appropriate for each type. For CIs and CRs, the olivines tend to be Fo100, which is rare on Earth. We have substituted Fo90 olivine

  7. Proceedings of the Fifth Seminar of High Temperature Reactor: The Role and Challenge with HTR Opportunity in the Twenty-first Century

    International Nuclear Information System (INIS)

    As-Natio-Lasman; Zaki-Su'ud; Bambang-Sugiono

    2000-11-01

The seminar on the HTR has been a routine activity held at BATAN since 1994. This seminar is a continuation of the Seminar on Technology and HTR Application held by the Centre for Development of Advanced Reactor System. The theme of the seminar is the Role, Challenge, and Opportunity of the HTR in the Twenty-first Century. Thirteen papers presented in the seminar were collected into these proceedings. The aim of the proceedings is to provide information and references on nuclear technology, mainly on HTR technology. (DII)

  8. Laboratory simulation of erosion by space plasma

    International Nuclear Information System (INIS)

    Kristoferson, L.; Fredga, K.

    1976-04-01

A laboratory experiment has been made where a plasma stream collides with targets made of different materials of cosmic interest. The experiment can be viewed as a process simulation of the solar wind particle interaction with solid surfaces in space, e.g. cometary dust. Special interest is given to sputtering of OH and Na. It is shown that the erosion of solid particles in interplanetary space is most likely dominated by sputtering at large heliocentric distances and by sublimation near the sun. The heliocentric distance of the limit between the two regions is determined mainly by the material properties of the eroded surface, e.g. heat of sublimation and sputtering yield, a typical distance being 0.5 a.u. It is concluded that the observations of Na in comets at large solar distances, and in some cases also near the sun, are most likely to be explained by solar wind sputtering. OH emission in space could also be of importance from 'dry', water-free matter by means of molecule sputtering. The observed OH production rates in comets are, however, too large to be explained in this way and are certainly the result of sublimation and dissociation of H2O from an icy nucleus. (Auth.)

  9. Agriculture in West Africa in the Twenty-First Century: Climate Change and Impacts Scenarios, and Potential for Adaptation

    Science.gov (United States)

    Sultan, Benjamin; Gaetani, Marco

    2016-01-01

    West Africa is known to be particularly vulnerable to climate change due to high climate variability, high reliance on rain-fed agriculture, and limited economic and institutional capacity to respond to climate variability and change. In this context, better knowledge of how climate will change in West Africa and how such changes will impact crop productivity is crucial to inform policies that may counteract the adverse effects. This review paper provides a comprehensive overview of climate change impacts on agriculture in West Africa based on the recent scientific literature. West Africa is nowadays experiencing a rapid climate change, characterized by a widespread warming, a recovery of the monsoonal precipitation, and an increase in the occurrence of climate extremes. The observed climate tendencies are also projected to continue in the twenty-first century under moderate and high emission scenarios, although large uncertainties still affect simulations of the future West African climate, especially regarding the summer precipitation. However, despite diverging future projections of the monsoonal rainfall, which is essential for rain-fed agriculture, a robust evidence of yield loss in West Africa emerges. This yield loss is mainly driven by increased mean temperature while potential wetter or drier conditions as well as elevated CO2 concentrations can modulate this effect. Potential for adaptation is illustrated for major crops in West Africa through a selection of studies based on process-based crop models to adjust cropping systems (change in varieties, sowing dates and density, irrigation, fertilizer management) to future climate. Results of the cited studies are crop and region specific and no clear conclusions can be made regarding the most effective adaptation options. Further efforts are needed to improve modeling of the monsoon system and to better quantify the uncertainty in its changes under a warmer climate, in the response of the crops to such

  10. Toward a first-principles integrated simulation of tokamak edge plasmas

    International Nuclear Information System (INIS)

    Chang, C S; Klasky, Scott A; Cummings, Julian; Samtaney, Ravi; Shoshani, A.; Sugiyama, L.; Keyes, David E; Ku, Seung-Hoe; Park, G.; Parker, Scott; Podhorszki, Norbert; Strauss, H.; Abbasi, H.; Adams, Mark; Barreto, Roselyne D; Bateman, Glenn; Bennett, K.; Chen, Yang; D'Azevedo, Eduardo; Docan, Ciprian; Ethier, Stephane; Feibush, E.; Greengard, Leslie; Hahm, Taik Soo; Hinton, Fred; Jin, Chen; Khan, A.; Kritz, Arnold; Krstic, Predrag S; Lao, T.; Lee, Wei-Li; Lin, Zhihong; Lofstead, J.; Mouallem, P. A.; Nagappan, M.; Pankin, A.; Parashar, Manish; Pindzola, Michael S.; Reinhold, Carlos O; Schultz, David Robert; Schwan, Karsten; Silver, D.; Sim, A.; Stotler, D.

    2008-01-01

    Performance of the ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles

  11. Simulation of DNA Damage in Human Cells from Space Radiation Using a Physical Model of Stochastic Particle Tracks and Chromosomes

    Science.gov (United States)

    Ponomarev, Artem; Plante, Ianik; Hada, Megumi; George, Kerry; Wu, Honglu

    2015-01-01

The formation of double-strand breaks (DSBs) and chromosomal aberrations (CAs) is of great importance in radiation research and, specifically, in space applications. We present a recently developed model in which chromosomes simulated by NASARTI (NASA Radiation Tracks Image) are combined with nanoscopic dose calculations performed with the Monte Carlo simulation RITRACKS (Relativistic Ion Tracks) in a voxelized space. The model produces the number of DSBs as a function of dose for high-energy iron, oxygen, and carbon ions, and for He ions. The combined model calculates yields of radiation-induced CAs and unrejoined chromosome breaks in normal and repair-deficient cells. The merged computational model is calibrated using the relative frequencies and distributions of chromosomal aberrations reported in the literature. The model considers fractionated deposition of energy to approximate dose rates of the space flight environment. The merged model also predicts the yields and sizes of translocations, dicentrics, rings, and more complex-type aberrations formed in the G0/G1 cell cycle phase during the first cell division after irradiation.

  12. Space Geodetic Technique Co-location in Space: Simulation Results for the GRASP Mission

    Science.gov (United States)

    Kuzmicz-Cieslak, M.; Pavlis, E. C.

    2011-12-01

The Global Geodetic Observing System (GGOS) places very stringent requirements on the accuracy and stability of future realizations of the International Terrestrial Reference Frame (ITRF): an origin definition at 1 mm or better at epoch and a temporal stability on the order of 0.1 mm/y, with similar numbers for the scale (0.1 ppb) and orientation components. These goals were derived from the requirements of Earth science problems that are currently the international community's highest priority. None of the geodetic positioning techniques can achieve this goal alone. This is due in part to the non-observability of certain attributes from a single technique. Another limitation is imposed by the extent and uniformity of the tracking network and the schedule of observational availability and number of suitable targets. The final limitation derives from the difficulty of "tying" the reference points of each technique at the same site to an accuracy that will support the GGOS goals. The future GGOS network will address decisively the ground segment and, to a certain extent, the space segment requirements. The JPL-proposed multi-technique mission GRASP (Geodetic Reference Antenna in Space) attempts to resolve the accurate tie between techniques using their co-location in space, onboard a well-designed spacecraft equipped with GNSS receivers, an SLR retroreflector array, a VLBI beacon, and a DORIS system. Using the anticipated system performance for all four techniques at the time the GGOS network is completed (ca. 2020), we generated a number of simulated data sets for the development of a TRF. Our simulation studies examine the degree to which GRASP can improve the inter-technique "tie" issue compared to the classical approach, and the likely modus operandi for such a mission. The success of the examined scenarios is judged by the quality of the origin and scale definition of the resulting TRF.

  13. Facilities Inventory and Utilization Study, Fall of 1987. Twenty-First Edition.

    Science.gov (United States)

    North Carolina Commission on Higher Education Facilities, Chapel Hill.

    The status of space in North Carolina institutions of higher education at the end of the drop-add period of the 1987 fall term at each college is presented. Indications of the uses being made of the space are given, and norms and historical information are presented for the past 5 years to enable institutions to make their own assessments of their…

  14. The "Very Cool" James Webb Space Telescope!

    Science.gov (United States)

    Teague, Peter J. B.

    2018-01-01

For over twenty years, scientists, engineers, technicians, and other personnel have been working on the next generation space telescope. As a partnership between NASA (National Aeronautics and Space Administration), CSA (Canadian Space Agency), and ESA (European Space Agency), the James Webb Space Telescope will complement the previous research performed by the Hubble by utilizing a larger primary mirror, which will also be optimized for infrared wavelengths. This combination will allow JWST to collect data and take images of light having traveled over 13.7 billion light years. This presentation will focus on the mission, as well as the contamination control challenges during the integration and testing in the NASA Goddard Spacecraft Systems Development and Integration Facility (SSDIF), one of the largest cleanrooms in the world. Additional information will be presented regarding space simulation testing down to a cool 20 kelvin (-424 degrees Fahrenheit) that will occur at Johnson Space Center in Houston, TX, and more testing and integration to happen at Northrop Grumman Corp., in Redondo Beach, CA. Launch of the JWST is currently scheduled for the spring of 2019 at the Ariane Spaceport in French Guiana, South America.

  15. Efficient Neural Network Modeling for Flight and Space Dynamics Simulation

    Directory of Open Access Journals (Sweden)

    Ayman Hamdy Kassem

    2011-01-01

Full Text Available This paper presents an efficient technique for neural network modeling of flight and space dynamics simulation. The technique frees the neural network designer from guessing the size and structure of the required neural network model and helps to minimize the number of neurons. For linear flight/space dynamics systems, the technique can find the network weights and biases directly by solving a system of linear equations, without the need for training. Nonlinear flight dynamic systems can be easily modeled by training their linearized models while keeping the same network structure. The training is fast, as it uses the linear-system knowledge to speed up the training process. The technique was tested on different flight/space dynamic models and showed promising results.
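The direct-solve step described above can be sketched numerically. The snippet below is a minimal illustration under assumed conditions (a hypothetical two-state, one-input linear system and a single linear layer fitted by least squares); it is not the paper's actual network or flight models:

```python
import numpy as np

# For a linear system x_{k+1} = A x_k + B u_k, a single linear layer
# y = z W (with z = [x, u]) can be "trained" in one shot by solving a
# linear least-squares problem -- no iterative training is needed.
# Hypothetical 2-state, 1-input system for illustration only.
A = np.array([[0.9, 0.1],
              [0.0, 0.95]])
B = np.array([[0.0],
              [0.1]])

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))          # sampled states
U = rng.standard_normal((500, 1))          # sampled inputs
Y = X @ A.T + U @ B.T                      # next-state targets

Z = np.hstack([X, U])                      # network input [x, u]
W, *_ = np.linalg.lstsq(Z, Y, rcond=None)  # weights via a linear solve

# With noiseless data the recovered weights match the system matrices.
print(np.allclose(W.T, np.hstack([A, B])))  # True
```

A nonlinear model would then be trained starting from such a linearized solution, keeping the same network structure, as the abstract describes.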

  16. Space-Charge Effect

    International Nuclear Information System (INIS)

    Chauvin, N

    2013-01-01

First, this chapter introduces the expressions for the electric and magnetic space-charge internal fields and forces induced by high-intensity beams. Then, the root-mean-square equation with space charge is derived and discussed. In the third section, the one-dimensional Child-Langmuir law, which gives the maximum current density that can be extracted from an ion source, is presented. Space-charge compensation can occur in the low-energy beam transport lines (located after the ion source). This phenomenon, which counteracts the space-charge defocusing effect, is explained and its main parameters are presented. The fifth section presents an overview of the principal methods used to perform beam dynamics numerical simulations. An example of a particle-in-cell code, SolMaxP, which takes into account space-charge compensation, is given. Finally, beam dynamics simulation results obtained with this code for the IFMIF injector are presented. (author)
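The one-dimensional Child-Langmuir law mentioned above, J = (4ε₀/9)·√(2q/m)·V^(3/2)/d², is straightforward to evaluate directly. The numbers below (a proton source with 50 kV across a 5 mm planar gap) are illustrative assumptions, not values from the chapter:

```python
import math

EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19   # elementary charge, C
M_PROTON = 1.67262192e-27    # proton mass, kg

def child_langmuir_j(voltage_v, gap_m, mass_kg, charge_c=E_CHARGE):
    """Space-charge-limited current density (A/m^2) for a planar gap:
    J = (4*eps0/9) * sqrt(2*q/m) * V**1.5 / d**2
    """
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * charge_c / mass_kg) \
        * voltage_v ** 1.5 / gap_m ** 2

# Illustrative: protons, 50 kV extraction voltage, 5 mm gap.
j = child_langmuir_j(50e3, 5e-3, M_PROTON)
print(f"limit ~ {j / 1e4:.1f} A/cm^2")  # -> limit ~ 2.4 A/cm^2
```

Note the characteristic V^(3/2) scaling: doubling the extraction voltage raises the limiting current density by a factor of 2^1.5 ≈ 2.83.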

  17. Space-Charge Effect

    CERN Document Server

    Chauvin, N.

    2013-12-16

First, this chapter introduces the expressions for the electric and magnetic space-charge internal fields and forces induced by high-intensity beams. Then, the root-mean-square equation with space charge is derived and discussed. In the third section, the one-dimensional Child-Langmuir law, which gives the maximum current density that can be extracted from an ion source, is presented. Space-charge compensation can occur in the low-energy beam transport lines (located after the ion source). This phenomenon, which counteracts the space-charge defocusing effect, is explained and its main parameters are presented. The fifth section presents an overview of the principal methods used to perform beam dynamics numerical simulations. An example of a particle-in-cell code, SolMaxP, which takes into account space-charge compensation, is given. Finally, beam dynamics simulation results obtained with this code for the IFMIF injector are presented.

  18. Chinese Woman in New York City: Transcultural Travel and Postsocialist Cosmopolitanism in Twenty-first Century China

    OpenAIRE

    Berg, Daria; Kunze, Rui

    2016-01-01

    This paper explores transcultural travel as the new space of Chinese women and culture in motion in a globalizing postsocialist China. We adopt Lisa Rofel’s concept of ‘postsocialist cosmopolitanism’ to examine how a new generation of Chinese women writers fashions a new female self in their writings about lived experiences in transnational and transcultural environments. According to Rofel, postsocialist cosmopolitanism combines first, a self-conscious transcendence of locality accomplished ...

  19. SIMULATIONS OF THE MAGELLANIC STREAM IN A FIRST INFALL SCENARIO

    International Nuclear Information System (INIS)

    Besla, G.; Hernquist, L.; Keres, D.; Kallivayalil, N.; Van der Marel, R. P.; Cox, T. J.

    2010-01-01

Recent high-precision proper motions from the Hubble Space Telescope suggest that the Large and Small Magellanic Clouds (LMC and SMC, respectively) are either on their first passage or on an eccentric long period (>6 Gyr) orbit about the Milky Way (MW). This differs markedly from the canonical picture in which the Clouds travel on a quasi-periodic orbit about the MW (period of ∼2 Gyr). Without a short-period orbit about the MW, the origin of the Magellanic Stream, a young (1-2 Gyr old) coherent stream of H I gas that trails the Clouds ∼150° across the sky, can no longer be attributed to stripping by MW tides and/or ram pressure stripping by MW halo gas. We propose an alternative formation mechanism in which material is removed by LMC tides acting on the SMC before the system is accreted by the MW. We demonstrate the feasibility and generality of this scenario using an N-body/smoothed particle hydrodynamics simulation with cosmologically motivated initial conditions constrained by the observations. Under these conditions, we demonstrate that it is possible to explain the origin of the Magellanic Stream in a first infall scenario. This picture is generically applicable to any gas-rich dwarf galaxy pair infalling toward a massive host or interacting in isolation.

  20. An FPGA computing demo core for space charge simulation

    International Nuclear Information System (INIS)

    Wu, Jinyuan; Huang, Yifei

    2009-01-01

In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time/resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.

  1. An FPGA computing demo core for space charge simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time/resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
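The look-up-table trick described in the abstract can be emulated in software. The sketch below approximates the (r²)^(-3/2) kernel of the Coulomb force with a table addressed by the ~10 most significant mantissa bits; it is a hypothetical floating-point emulation of the idea, not the actual 16-bit fixed-point FPGA core:

```python
import math

# The pairwise force kernel needs (r^2)^(-3/2). Instead of computing it
# exactly, address a precomputed table with the ~10 most significant
# bits of the operand's mantissa and rescale by the exponent.
TABLE_BITS = 10

def build_table():
    # One entry per 10-bit mantissa slice over [1, 2), sampled mid-slice.
    return [(1.0 + (i + 0.5) / (1 << TABLE_BITS)) ** -1.5
            for i in range(1 << TABLE_BITS)]

TABLE = build_table()

def inv_r3(r2):
    """Approximate r2**(-1.5) via the mantissa-addressed table."""
    m, e = math.frexp(r2)          # r2 = m * 2**e, with m in [0.5, 1)
    m *= 2.0; e -= 1               # renormalize m into [1, 2)
    idx = int((m - 1.0) * (1 << TABLE_BITS))
    return TABLE[idx] * 2.0 ** (-1.5 * e)

exact = 2.75 ** -1.5
approx = inv_r3(2.75)
print(abs(approx - exact) / exact < 1e-3)  # True: relative error ~5e-4
```

With 10 address bits the relative error stays below roughly 10^-3, which matches the spirit of a 16-bit fixed-point datapath.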

  2. First laser measurements to space debris in Poland

    Science.gov (United States)

    Lejba, Paweł; Suchodolski, Tomasz; Michałek, Piotr; Bartoszak, Jacek; Schillak, Stanisław; Zapaśnik, Stanisław

    2018-05-01

The Borowiec Satellite Laser Ranging station (BORL 7811, Borowiec), part of the Space Research Centre of the Polish Academy of Sciences (SRC PAS), went through modernization in 2014-2015. One of the main tasks of the modernization was the installation of a high-energy laser module dedicated to space debris tracking. Surelite III by Continuum is a Nd:YAG pulse laser with a 10 Hz repetition rate, a pulse width of 3-5 ns and a pulse energy of 450 mJ for green (532 nm). This new laser unit was integrated with the SLR system at Borowiec performing standard satellite tracking. In 2016 BORL 7811 participated actively in the observational campaigns related to space debris targets in the LEO region managed by the Space Debris Study Group (SDSG) of the International Laser Ranging Service (ILRS). Currently, the Borowiec station regularly tracks 36 space debris objects in the LEO regime, including typical rocket bodies (Russian/Chinese) and cooperative targets like the inactive TOPEX/Poseidon, ENVISAT, OICETS and others. In this paper the first results of space debris laser measurements obtained by the Borowiec station in the period August 2016 - January 2017 are presented. The results gained by the SRC PAS Borowiec station confirm the rotation of the defunct TOPEX/Poseidon satellite, which spins with a period of approximately 10 s. The novelty of this work is the presentation of sample results for the Chinese CZ-2C R/B target (NORAD catalogue number 31114), which is (probably) equipped with retroreflectors. Laser measurement of space debris will be a highly relevant topic in the coming years, especially in the context of Space Surveillance and Tracking (SST) activity. Some targets are very easy to track, like the defunct ENVISAT or TOPEX/Poseidon. On the other hand, there is a large population of LEO targets with different orbital and physical parameters that are challenging for laser ranging, such as small irregular debris and rocket boosters.

  3. Space Tweetup - from a participant to a Mars Tweetup organizer and a new format of space communication

    Science.gov (United States)

    Haider, O.; Groemer, G.

    2014-01-01

In September 2011, the European Space Agency (ESA) and the German Space Agency (DLR) organized the first European SpaceTweetup during the German Aerospace Day. One of the authors was among the 60 participants at this SpaceTweetup in Cologne and experienced the concept of a Tweetup and the engagement of the participants from the inside. Building upon this experience, the Austrian Space Forum (OeWF) organized the first Austrian MarsTweetup during the "Dachstein Mars analog simulation". Between 27 April and 1 May 2012, a five-day Mars simulation was conducted by the Austrian Space Forum and international research partners at the Giant Ice Cave in the Dachstein region of Austria. During this field test, the Aouda.X spacesuit simulator was deployed and selected geophysical and life-science related experiments were conducted. In this paper we outline the potential and limitations of social media and how to engage the general public to participate and communicate about space projects through their own experience. We show examples of material SpaceTweetup participants produced, e.g. hundreds of tweets during the actual event, blog entries and photo galleries, and how space communication can benefit from it. Our considerations on organizing a SpaceTweetup are complemented with a section on lessons learned.

  4. A Simulation Learning Approach to Training First Responders for Radiological Emergencies

    International Nuclear Information System (INIS)

    Sanders, Robert Lon; Rhodes, Graham S.

    2007-01-01

This paper describes the application of simulation learning technology, popularized by the emerging serious games industry, to training first responders to act properly in the event of a radiological emergency. Using state-of-the-art video game production tools and runtime engines as an enabling technology, simulation learning combines interactive virtual worlds based on validated engineering models with engaging storylines and scenarios that invoke the emotional response, and the corresponding human stress level, that first responders would encounter during a real-world emergency. For the application discussed here, in addition to providing engaging instruction about the fundamentals of radiological environments and the proper usage of radiological equipment, simulation learning prepares first responders to perform effectively under high stress and enables them to practice in teams.

  5. Change and Continuity in Librarianship: Approaching the Twenty-First Century. Proceedings of the 40th Military Librarians Workshop, 20-22 November 1996, Annapolis, Maryland,

    Science.gov (United States)

    1996-11-01

November 1996, Annapolis, Maryland. Change and Continuity in Librarianship: Approaching the Twenty-first… speakers Walt Crawford (Keynote), speaking on "Millennial Librarianship;" Dr. Keith Swigger, Dean of the Graduate School of Library and Information… --Richard Hume Werking; Millennial Librarianship: Maintaining the Mix and Avoiding the Hype --Walt Crawford

  6. Modeling and Simulation of DC Power Electronics Systems Using Harmonic State Space (HSS) Method

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Bak, Claus Leth

    2015-01-01

For the efficiency and simplicity of electric systems, dc-based power electronics systems are widely used in a variety of applications such as electric vehicles, ships, aircraft and also in homes. In these systems, there can be a number of dynamic interactions between loads and other dc-dc… based on state-space averaging and generalized averaging, these also have limitations in reproducing the same results as non-linear time domain simulations. This paper presents a modeling and simulation method for a large dc power electronic system using Harmonic State Space (HSS) modeling… Through this method, the required computation time and CPU memory for large dc power electronics systems can be reduced. Moreover, the achieved results are the same as those of the non-linear time domain simulation, but with a faster simulation time, which is beneficial in a large network.

  7. Towards a Rational Kingdom in Africa: Knowledge, Critical Rationality and Development in a Twenty-First Century African Cultural Context

    Directory of Open Access Journals (Sweden)

    Lawrence Ogbo Ugwuanyi

    2018-03-01

Full Text Available This paper seeks to locate the kind of knowledge that is relevant for African development in the twenty-first-century African cultural context and to propose a paradigm for achieving such knowledge. To do this, it advances the view that the concept of the twenty-first century in an African context must be located within the colonial and post-colonial challenges of the African world and applied to serve the African demand. Anchored on this position, the paper outlines and critiques the wrong assumption on which the modern state project was anchored in post-colonial Africa, and its development dividend, to suggest that this is the outcome of a wrong knowledge design that is foundational to the state project and which the project did not address. It proposes a shift in the knowledge paradigm in Africa and suggests critical self-consciousness as a more desirable knowledge design for Africa. It applies the term 'rational kingdom' (defined as a community of reason marked by critical conceptual self-awareness driven by innovation and constructivism) to suggest this paradigm. 'Innovation' is meant as the application of reason with an enlarged capacity to anticipate and address problems with fresh options, and 'constructivism' is meant as the disposition to sustain innovation by advancing an alternative but more reliable worldview that can meet the exigencies of modernity in an African cultural context. The paper then proceeds to outline the nature of the rational kingdom and its anticipated gains and outcomes. It applies the method of inductive reasoning to advance its position. To do this it invokes selected but crucial areas of African life to locate how the developmental demands of these aspects of life suggest a critical turn in African rationality.

  8. Immediate and six-month space changes after premature loss of a primary maxillary first molar.

    Science.gov (United States)

    Lin, Yai-Tin; Lin, Wen-Hsien; Lin, Yng-Tzer J

    2007-03-01

    Premature loss of primary maxillary first molars has been associated with a number of consequences (such as tipping of the first permanent molar). The aim of the authors' study was to investigate dental-arch space problems arising as a result of premature loss of a primary maxillary first molar. This study was composed of 19 children who experienced unilateral premature loss of a primary maxillary first molar. The authors used each patient's intact contralateral arch segment as a control. The authors obtained maxillary dental study casts two or three days after the tooth was extracted, as well as six months later. The D + E space from the extraction side six months after removal of the tooth (mean +/- standard deviation, 15.62 +/- 1.13 millimeters) was significantly smaller than the space on the control side (16.88 +/- 1.12 mm) and the initial D + E space (16.70 +/- 0.69 mm). The authors found a significantly shorter arch length (25.47 +/- 1.58 mm) and larger intercanine width (31.29 +/- 2.49 mm) six months after the tooth was extracted compared with the initial arch length (25.66 +/- 1.64 mm) and intercanine width (30.42 +/- 2.64 mm). The early space changes to the maxillary arch subsequent to premature loss of a primary maxillary first molar are primarily distal drift of the primary canines toward the extraction space and palatal migration of the maxillary incisors. Although 1 mm of space was lost, which is statistically significant, this is not likely to be of sufficient clinical significance to warrant use of a space maintainer. If palatal movement appears to be needed, the dentist should consider use of a palatal arch rather than a band-and-loop maintainer. The effects of space maintainers need to be re-evaluated in cases of unilateral premature loss of a primary maxillary first molar.

  9. Keeping it real: revisiting a real-space approach to running ensembles of cosmological N-body simulations

    International Nuclear Information System (INIS)

    Orban, Chris

    2013-01-01

In setting up initial conditions for ensembles of cosmological N-body simulations there are, fundamentally, two choices: either maximizing the correspondence of the initial density field to the assumed Fourier-space clustering or, instead, matching to real-space statistics and allowing the DC mode (i.e. overdensity) to vary from box to box as it would in the real universe. As a stringent test of both approaches, I perform ensembles of simulations using power law and "power law times a bump" models, the latter inspired by baryon acoustic oscillations (BAO), exploiting the self-similarity of these initial conditions to quantify the accuracy of the matter-matter two-point correlation results. The real-space method, which was originally proposed by Pen 1997 [1] and implemented by Sirko 2005 [2], performed well in producing the expected self-similar behavior and corroborated the non-linear evolution of the BAO feature observed in conventional simulations, even in the strongly-clustered regime (σ_8 ≳ 1). In revisiting the real-space method championed by [2], it was also noticed that this earlier study overlooked an important integral constraint correction to the correlation function in results from the conventional approach that can be important in ΛCDM simulations with L_box ∼ h⁻¹ Gpc and on scales r ≳ L_box/10. Rectifying this issue shows that the Fourier-space and real-space methods are about equally accurate and efficient for modeling the evolution and growth of the correlation function, contrary to previous claims. An appendix provides a useful independent-of-epoch analytic formula for estimating the importance of the integral constraint bias on correlation function measurements in ΛCDM simulations.

  10. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    Science.gov (United States)

    Wells, J. R.; Kim, J. B.

    2011-12-01

Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not do a full exploration of the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from published literature, and often a parameter value is estimated from a single published value. Further, the parameters are "tuned" using somewhat arbitrary, trial-and-error methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from published literature and, where those were not available, using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
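The simulated-annealing search itself follows the standard propose/accept/cool loop. The sketch below substitutes a stand-in quadratic objective for an actual DGVM run (the target vector, bounds, step size and cooling schedule are all hypothetical), but the bounded-proposal and acceptance logic is the generic algorithm:

```python
import math, random

random.seed(42)

TARGET = [0.3, 0.7, 1.2]       # hypothetical "true" parameter vector
BOUNDS = [(0.0, 2.0)] * 3      # solution-space limits for each parameter

def cost(params):
    # Stand-in for "run the DGVM and score the vegetation map".
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

def anneal(steps=20000, t0=1.0, cooling=0.9995):
    x = [random.uniform(lo, hi) for lo, hi in BOUNDS]
    best, best_cost, temp = x[:], cost(x), t0
    for _ in range(steps):
        # Propose a bounded random perturbation of one parameter.
        i = random.randrange(len(x))
        cand = x[:]
        lo, hi = BOUNDS[i]
        cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 0.05)))
        d = cost(cand) - cost(x)
        # Accept downhill moves always; uphill moves with Boltzmann odds.
        if d < 0 or random.random() < math.exp(-d / temp):
            x = cand
            if cost(x) < best_cost:
                best, best_cost = x[:], cost(x)
        temp *= cooling            # geometric cooling schedule
    return best, best_cost

best, best_cost = anneal()
print(f"best cost {best_cost:.2e}")
```

The early high-temperature phase lets the walk escape local minima; the late low-temperature phase is effectively greedy refinement within the bounds.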

  11. Validated simulator for space debris removal with nets and other flexible tethers applications

    Science.gov (United States)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil

    2016-12-01

In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows flexible threads or wires to be simulated with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; due to the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment for the parabolic flight was a downscaled version of the Envisat capture process. The prepacked net was launched towards the satellite model; it expanded, hit the model and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics to the respective simulations and then to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough that it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind the validation. Results are presented and

  12. HUMAN SPACE FLIGHTS: FACTS AND DREAMS

    Directory of Open Access Journals (Sweden)

    Mariano Bizzarri

    2011-12-01

    Full Text Available Manned space flight has been the great human and technological adventure of the past half-century. By putting people into places and situations unprecedented in history, it has stirred the imagination while expanding and redefining the human experience. However, space exploration obliges men to confront a hostile environment of cosmic radiation, microgravity, isolation and changes in the magnetic field. Any space traveler is therefore submitted to relevant health threats. In the twenty-first century, human space flight will continue, but it will change in the ways that science and technology have changed on Earth: it will become more networked, more global, and more oriented toward primary objectives. A new international human space flight policy can help achieve these objectives by clarifying the rationales, the ethics of acceptable risk, the role of remote presence, and the need for balance between funding and ambition to justify the risk of human lives.

  13. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    Science.gov (United States)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
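The two synchronization points in the sequence above (steps 5 and 9) behave like barriers: no federate proceeds until every federate has arrived. A generic sketch of that coordination pattern, with plain threads standing in for federates (this is not the actual DSES/HLA API):

```python
import threading

N_FEDERATES = 3
# Step 5: "initialize" sync point -- reached after join/publish/create.
initialize_point = threading.Barrier(N_FEDERATES)
# Step 9: "startup" sync point -- reached after initial data exchange.
startup_point = threading.Barrier(N_FEDERATES)

log, log_lock = [], threading.Lock()

def federate(name):
    # Steps 1-4: create/join federation, publish and subscribe,
    # create object instances (omitted in this sketch).
    initialize_point.wait()   # nobody proceeds until all have joined
    # Steps 6-8: update object instances, wait for reflections,
    # set up time management (omitted in this sketch).
    startup_point.wait()      # all federates begin execution together
    with log_lock:
        log.append(name)

threads = [threading.Thread(target=federate, args=(f"fed{i}",))
           for i in range(N_FEDERATES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(log))  # all three federates passed both sync points
```

A real HLA federation would register named synchronization points with the runtime infrastructure rather than use in-process barriers, but the ordering guarantee is the same.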

  14. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis)

    Science.gov (United States)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micro meteoroid impact, as well as the optional definition of particular spacecraft specific influences like tank sloshing, reaction wheel behaviour

  15. A Conservation Ethic and the Collecting of Animals by Institutions of Natural Heritage in the Twenty-First Century: Case Study of the Australian Museum

    Directory of Open Access Journals (Sweden)

    Timothy Ikin

    2011-02-01

Full Text Available Collecting of animals from their habitats for preservation by museums and related bodies is a core operation of such institutions. Conservation of biodiversity in the current era is a priority in the scientific agendas of museums of natural heritage in Australia and the world. Intuitively, taking animals from the wild while engaged in scientific or other practices that are supposed to promote their ongoing survival may appear incompatible. The Australian Museum presents interesting ground on which to consider zoological collecting by museums in the twenty-first century. Anderson and Reeves in 1994 argued that a milieu existed that undervalued native species, and that the role of natural history museums, up to as late as the mid-twentieth century, was only to make a record of the faunal diversity of Australia, which would inevitably become extinct. Despite the latter, conservation of Australia's faunal diversity is a key aspect of research programmes in Australia's institutions of natural heritage in the current era. This paper analyses the collecting of animals, a core task for institutions of natural heritage, and how this interacts with a professed "conservation ethic" in a twenty-first-century Australian setting.

  16. A Motion Simulator Ride Associated With Headache and Subdural Hematoma: First Case Report.

    Science.gov (United States)

    Scranton, Robert A; Evans, Randolph W; Baskin, David S

    2016-02-01

    We report the first case of symptomatic bilateral subdural hematomas (SDH) associated with riding a centrifugal motion simulator ride. A previously healthy 55-year-old male developed new-onset daily headaches 1 week after going on the ride; these were due to symptomatic bilateral SDH requiring operative intervention, with a full recovery. There was no history of other trauma or of other systemic or intracranial abnormality to account for the development of the SDH. We review the headaches and other clinical features associated with chronic SDH. Twelve cases of roller coaster headaches due to SDH associated with riding roller coasters have been reported. The pathophysiology is reviewed, which we believe involves the same mechanism that may be responsible in this case. Although it is possible that this neurovascular injury is truly rare, it is also possible that it is underreported, as patients and physicians may not make the association or physicians have not reported additional cases. The risk of this injury likely increases with age, as the size of the subdural space increases, and may support the maxim that "roller coasters and simulators are for kids." © 2015 American Headache Society.

  17. The horror of stigma: psychosis and mental health care environments in twenty-first-century horror film (part II).

    Science.gov (United States)

    Goodwin, John

    2014-10-01

    This paper highlights the specific manner in which twenty-first-century horror films stigmatize psychosis and mental health care environments (MHCEs). A search on various film forums using the terms "mental/psychiatric patient," "psychosis/psychoses," and "mental/psychiatric hospital" (limited to 2000-2012) revealed 55 films. A literature review revealed criteria for a checklist. Subsequent to viewings, salient recurring criteria were added to the checklist. Films were systematically analyzed under these criteria. Homicidal maniacs are the most common stereotypes. Misinformation is often communicated. Familiar horror tropes are used to stigmatize MHCEs. Practitioners should be aware of the specific manner in which clients are being stigmatized by the media. This paper highlights specific ways in which psychosis and MHCEs are stigmatized, and encourages practitioners to challenge these depictions. © 2013 Wiley Periodicals, Inc.

  18. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study.

    Science.gov (United States)

    Ahmed, Rami; Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-03-16

    OBJECTIVE: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. CONCLUSIONS: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent.

  19. Autonomous Motion Learning for Intra-Vehicular Activity Space Robot

    Science.gov (United States)

    Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo

    Space robots will be needed in future space missions. So far, many types of space robots have been developed; in particular, Intra-Vehicular Activity (IVA) space robots that support human activities should be developed to reduce human risks in space. In this paper, we study a motion learning method for an IVA space robot with a multi-link mechanism. The advantage is that this space robot moves using the reaction forces of the multi-link mechanism and contact forces from the wall, as in the space walking of an astronaut, rather than using propulsion. The control approach is based on reinforcement learning with the actor-critic algorithm. We demonstrate the effectiveness of this approach using a 5-link space robot model in simulation. First, we simulate a space robot learning motion control, including the contact phase, in the two-dimensional case. Next, we simulate a space robot learning motion control while changing base attitude in the three-dimensional case.
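
    The actor-critic scheme named in the abstract can be illustrated with a minimal tabular sketch. This is a generic toy implementation of actor-critic TD learning on a tiny state space, not the authors' robot controller; the environment interface, learning rates and softmax policy are all illustrative assumptions.

```python
import math
import random

def actor_critic(env_step, n_states, n_actions,
                 episodes=200, alpha=0.1, beta=0.1, gamma=0.95):
    """Tabular actor-critic: the critic learns state values V(s) from
    TD errors, and the same TD error reinforces the actor's action
    preferences (softmax policy)."""
    V = [0.0] * n_states
    prefs = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # sample an action from the softmax over preferences
            weights = [math.exp(p) for p in prefs[s]]
            r = random.random() * sum(weights)
            a, acc = 0, weights[0]
            while acc < r:
                a += 1
                acc += weights[a]
            s2, reward, done = env_step(s, a)
            td = reward + (0.0 if done else gamma * V[s2]) - V[s]
            V[s] += alpha * td        # critic update
            prefs[s][a] += beta * td  # actor update
            s = s2
    return V, prefs
```

    For the 5-link robot of the paper, the state would encode joint angles and contact conditions rather than a small discrete index, but the update rule is the same.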

  20. Pre-launch simulation experiment of microwave-ionosphere nonlinear interaction rocket experiment in the space plasma chamber

    Energy Technology Data Exchange (ETDEWEB)

    Kaya, N. (Kobe University, Kobe, Japan); Tsutsui, M. (Kyoto University, Uji, Japan); Matsumoto, H. (Kyoto University, Kyoto, Japan)

    1980-09-01

    A pre-flight test experiment for a microwave-ionosphere nonlinear interaction rocket experiment (MINIX) has been carried out in a space plasma simulation chamber. Though the first rocket experiment ended in failure because of a high-voltage problem, interesting results were observed in the pre-flight experiment. Significant microwave heating of the plasma, with up to a 300% temperature increase, was observed. Strong excitation of plasma waves in the VLF and HF ranges by the transmitted microwaves was observed as well. These microwave effects may have to be taken into account in future solar power satellite projects.

  1. Twelve-month space changes after premature loss of a primary maxillary first molar.

    Science.gov (United States)

    Lin, Yai-Tin; Lin, Wen-Hsien; Lin, Yng-Tzer J

    2011-05-01

    Many early investigations concerning space changes following premature extraction of primary molars had a cross-sectional design, a small sample size, and a somewhat crude methodology, which may have led to misunderstandings. The aim of this study was to use established longitudinal data to investigate ongoing (12-month) dental-arch space problems arising as a result of premature loss of a primary maxillary first molar. Thirteen children (mean ± SD age at time of tooth extraction, 6.0 ± 0.74 years) with unilateral premature loss of a primary maxillary first molar were selected for this study. Maxillary dental study casts were obtained from participants 2 or 3 days after the tooth was removed, as well as at a follow-up appointment 12 months later. Six reference lines were measured on the study cast: D + E space, arch width, arch length, intercanine width, intercanine length, and arch perimeter. For each participant, the D + E space of the contralateral intact primary molar served as a control. A paired t-test was used to compare the cast measurements between initial examination and 12-month follow-up. A t-test was used to compare D + E space changes with those of the control group. The D + E space of the extraction side after 12 months was significantly smaller than that of the control side (P < 0.05). The 12-month space changes in the maxillary dental arch after premature loss of a primary maxillary first molar consist mainly of distal drift of the primary canine toward the extraction site. Mesial movement of permanent molars or tilting of the primary molars did not occur. An increased arch dimension was found, especially in the anterior segment (intercanine width and length). Based on the results of this study, there is no need for space maintainers in cases of premature loss of a primary maxillary first molar. © 2010 The Authors. International Journal of Paediatric Dentistry © 2010 BSPD, IAPD and Blackwell Publishing Ltd.
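
    The paired comparison described above (initial cast versus 12-month follow-up on the same children) can be sketched numerically. The measurement values below are hypothetical, for illustration only; they are not the study's data.

```python
import math
import statistics

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for repeated
    measurements on the same subjects (e.g. cast at extraction
    vs. cast at 12-month follow-up)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    sd = statistics.stdev(diffs)          # sample SD of the differences
    t = statistics.mean(diffs) / (sd / math.sqrt(n))
    return t, n - 1

# Hypothetical D+E space widths in mm (NOT the study's data):
initial  = [9.1, 8.8, 9.4, 9.0, 8.7, 9.2]
followup = [8.2, 8.1, 8.6, 8.3, 8.0, 8.5]
t, df = paired_t(initial, followup)   # large negative t => space loss
```

    A t value beyond the critical value for n − 1 degrees of freedom (about ±2.57 at P = 0.05 for df = 5) indicates a significant 12-month space change.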

  2. Impact of prescribed Arctic sea ice thickness in simulations of the present and future climate

    Energy Technology Data Exchange (ETDEWEB)

    Krinner, Gerhard [Alfred Wegener Institute for Polar and Marine Research, Potsdam (Germany); INSU-CNRS and UJF Grenoble, Laboratoire de Glaciologie et Geophysique de l' Environnement (LGGE), 54 rue Moliere, BP 96, Saint Martin d' Heres Cedex (France); Rinke, Annette; Dethloff, Klaus [Alfred Wegener Institute for Polar and Marine Research, Potsdam (Germany); Gorodetskaya, Irina V. [INSU-CNRS and UJF Grenoble, Laboratoire de Glaciologie et Geophysique de l' Environnement (LGGE), 54 rue Moliere, BP 96, Saint Martin d' Heres Cedex (France)

    2010-09-15

    This paper describes atmospheric general circulation model climate change experiments in which the Arctic sea-ice thickness is either fixed to 3 m or somewhat more realistically parameterized in order to take into account essentially the spatial variability of Arctic sea-ice thickness, which is, to a first approximation, a function of ice type (perennial or seasonal). It is shown that, both at present and at the end of the twenty-first century (under the SRES-A1B greenhouse gas scenario), the impact of a variable sea-ice thickness compared to a uniform value is essentially limited to the cold seasons and the lower troposphere. However, because first-year ice is scarce in the Central Arctic today, but not under SRES-A1B conditions at the end of the twenty-first century, and because the impact of a sea-ice thickness reduction can be masked by changes of the open water fraction, the spatial and temporal patterns of the effect of sea-ice thinning on the atmosphere differ between the two periods considered. As a consequence, not only the climate simulated at a given period, but also the simulated Arctic climate change over the twenty-first century is affected by the way sea-ice thickness is prescribed. (orig.)

  3. The space elevator: a new tool for space studies.

    Science.gov (United States)

    Edwards, Bradley C

    2003-06-01

    The objective has been to develop a viable scenario for the construction, deployment and operation of a space elevator using current or near future technology. This effort has been primarily a paper study with several experimental tests of specific systems. Computer simulations, engineering designs, literature studies and inclusion of existing programs have been utilized to produce a design for the first space elevator. The results from this effort illustrate a viable design using current and near-term technology for the construction of the first space elevator. The timeline for possible construction is within the coming decades and estimated costs are less than $10 B. The initial elevator would have a 5 ton/day capacity and operating costs near $100/lb for payloads going to any Earth orbit or traveling to the Moon, Mars, Venus or the asteroids. An operational space elevator would allow for larger and much longer-term biological space studies at selectable gravity levels. The high-capacity and low operational cost of this system would also allow for inexpensive searches for life throughout our solar system and the first tests of environmental engineering. This work is supported by a grant from the NASA Institute for Advanced Concepts (NIAC).

  4. Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser

    Science.gov (United States)

    Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu

    2018-02-01

    Cleaning space debris with lasers is a hot topic in the field of space security research, and impulse characteristics are the basis of cleaning space debris with lasers. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris irradiated by a multi-pulse laser is established based on the area matrix method. The calculation method for impulse and impulsive moment under multi-pulse irradiation is given, and the calculation process for total impulse under multi-pulse irradiation is analyzed. With a typical non-planar piece of space debris (a cube) as an example, the impulse characteristics of space debris irradiated by a multi-pulse laser are simulated and analyzed. The effects of initial angular velocity, spot size and pulse frequency on the impulse characteristics are investigated.
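
    As a hedged illustration of the per-pulse bookkeeping such a calculation involves (the facet representation, coupling coefficient value and geometry below are assumptions for illustration, not the paper's area matrix method):

```python
def pulse_impulse(facets, beam_dir, fluence, cm=5e-5):
    """Impulse vector from one laser pulse on a faceted body.
    facets: list of (area_m2, outward_unit_normal) tuples.
    beam_dir: unit vector of beam propagation.
    fluence: pulse fluence in J/m^2.
    cm: momentum coupling coefficient in N*s/J (order 1e-5..1e-4
    for laser ablation -- an assumed value, not from the paper).
    Only facets facing the beam contribute; ablation thrust acts
    opposite each illuminated facet's outward normal."""
    jx = jy = jz = 0.0
    for area, n in facets:
        # cosine of incidence angle; > 0 means the facet is lit
        cosang = -(n[0]*beam_dir[0] + n[1]*beam_dir[1] + n[2]*beam_dir[2])
        if cosang > 0.0:
            j = cm * fluence * area * cosang   # impulse magnitude, N*s
            jx -= j * n[0]
            jy -= j * n[1]
            jz -= j * n[2]
    return (jx, jy, jz)
```

    Summing such per-pulse contributions over successive pulses, with the facet normals rotated to the debris' attitude at each pulse time, gives the total impulse; the impulsive moment follows analogously from the facet positions relative to the centre of mass.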

  5. Molecular Electronics: Insight from First-Principles Transport Simulations

    DEFF Research Database (Denmark)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2010-01-01

    Conduction properties of nanoscale contacts can be studied using first-principles simulations. Such calculations give insight into details behind the conductance that is not readily available in experiments. For example, we may learn how the bonding conditions of a molecule to the electrodes affect...

  6. Twenty-five astronomical observations that changed the world and how to make them yourself

    CERN Document Server

    Marett-Crosby, Michael

    2013-01-01

    Human history is also the record of our fascination with the sky, and to look upwards is to follow in the steps of such greats as Galileo and Newton. What they and others once saw in the heavens for the first time, amateur astronomers can discover anew using this guide to twenty-five of the greatest journeys through space.   Starting with our most visible companion the Moon, each chapter offers a step-by-step walk-through of famous astronomical observations from the history of science. Beginning with the easiest targets, sometimes even accessible with the naked eye, the challenges become progressively more difficult. Beginner astronomers and more experienced hobbyists alike can reacquaint themselves with the wonders of our fellow planets and even reach far beyond our own solar system to touch on such incredible phenomena as the birth of new stars in nebula systems and the deceptive nothingness of black holes. The would-be astronaut can spy the International Space Station in orbit with binoculars or the dooms...

  7. Molecular electronics: insight from first-principles transport simulations.

    Science.gov (United States)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2010-01-01

    Conduction properties of nanoscale contacts can be studied using first-principles simulations. Such calculations give insight into details behind the conductance that is not readily available in experiments. For example, we may learn how the bonding conditions of a molecule to the electrodes affect the electronic transport. Here we describe key computational ingredients and discuss these in relation to simulations for scanning tunneling microscopy (STM) experiments with C60 molecules where the experimental geometry is well characterized. We then show how molecular dynamics simulations may be combined with transport calculations to study more irregular situations, such as the evolution of a nanoscale contact with the mechanically controllable break-junction technique. Finally we discuss calculations of inelastic electron tunnelling spectroscopy as a characterization technique that reveals information about the atomic arrangement and transport channels.

  8. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  9. Internal Flow Simulation of Enhanced Performance Solid Rocket Booster for the Space Transportation System

    Science.gov (United States)

    Ahmad, Rashid A.; McCool, Alex (Technical Monitor)

    2001-01-01

    An enhanced performance solid rocket booster concept for the space shuttle system has been proposed. The concept booster will have strong commonality with the existing, proven, reliable four-segment Space Shuttle Reusable Solid Rocket Motors (RSRM) with individual component design (nozzle, insulator, etc.) optimized for a five-segment configuration. Increased performance is desirable to further enhance safety/reliability and/or increase payload capability. Performance increase will be achieved by adding a fifth propellant segment to the current four-segment booster and opening the throat to accommodate the increased mass flow while maintaining current pressure levels. One development concept under consideration is the static test of a "standard" RSRM with a fifth propellant segment inserted and appropriate minimum motor modifications. Feasibility studies are being conducted to assess the potential for any significant departure in component performance/loading from the well-characterized RSRM. An area of concern is the aft motor (submerged nozzle inlet, aft dome, etc.) where the altered internal flow resulting from the performance enhancing features (25% increase in mass flow rate, higher Mach numbers, modified subsonic nozzle contour) may result in increased component erosion and char. To assess this issue and to define the minimum design changes required to successfully static test a fifth segment RSRM engineering test motor, internal flow studies have been initiated. Internal aero-thermal environments were quantified in terms of conventional convective heating and discrete phase alumina particle impact/concentration and accretion calculations via Computational Fluid Dynamics (CFD) simulation. Two sets of comparative CFD simulations of the RSRM and the five-segment (IBM) concept motor were conducted with CFD commercial code FLUENT. The first simulation involved a two-dimensional axi-symmetric model of the full motor, initial grain RSRM. 
The second set of analyses

  10. Performing the comic side of bodily abjection: A study of twenty-first century female stand-up comedy in a multi-cultural and multi-racial Britain

    OpenAIRE

    Blunden, Pamela

    2011-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This thesis is a socio-cultural study of the development of female stand-up comedy in the first decade of the twenty-first century within a multi-racial and multi-cultural Britain. It also engages with the theory and practice of performance and asks the question: ‘In what ways can it be said that female stand-up comics perform the comic side of bodily abjection?’ This question is applied to t...

  11. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an ongoing and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include the cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including, most importantly, time-to-evacuate, to help first responders choose the best course of action.
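
    The multi-agent, area-based evacuation idea can be sketched with a toy grid model. This is not IMPACT's algorithm: the shortest-path field, the one-agent-per-cell rule (a crude stand-in for collision avoidance) and the time-to-evacuate statistic below are illustrative assumptions.

```python
from collections import deque

def evacuate(grid, agents, exit_pos):
    """Toy area-based evacuation on a grid (0 = walkable, 1 = wall).
    Agents step down a BFS distance-to-exit field, at most one agent
    per cell per step; returns time-to-evacuate in steps."""
    rows, cols = len(grid), len(grid[0])
    steps4 = ((1, 0), (-1, 0), (0, 1), (0, -1))
    # BFS distance field from the exit over walkable cells
    dist = {exit_pos: 0}
    q = deque([exit_pos])
    while q:
        r, c = q.popleft()
        for dr, dc in steps4:
            n = (r + dr, c + dc)
            if (0 <= n[0] < rows and 0 <= n[1] < cols
                    and grid[n[0]][n[1]] == 0 and n not in dist):
                dist[n] = dist[(r, c)] + 1
                q.append(n)
    positions, t = set(agents), 0
    while positions:
        t += 1
        nxt = set()
        # agents closest to the exit move first
        for pos in sorted(positions, key=lambda p: dist[p]):
            if pos == exit_pos:
                continue  # this agent leaves the building now
            options = [(pos[0] + dr, pos[1] + dc) for dr, dc in steps4]
            options = [n for n in options
                       if n in dist and dist[n] < dist[pos] and n not in nxt]
            # move downhill if a free cell exists, otherwise wait
            nxt.add(min(options, key=lambda n: dist[n]) if options else pos)
        positions = nxt
    return t
```

    Each call returns the number of simulation steps until the last agent has left, i.e. a time-to-evacuate statistic for one candidate layout, which is the kind of figure a responder would compare across evacuation alternatives.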

  12. Long-term space changes after premature loss of a primary maxillary first molar

    OpenAIRE

    Lin, Yng-Tzer J.; Lin, Yai-Tin

    2016-01-01

    Background/purpose: The consequence of premature loss of primary teeth resulting in the need for space maintainers has been controversial for many years. There is no longitudinal long-term report in literature regarding the premature loss of a primary maxillary first molar. The aim of this study was to continue observing the long-term space changes of 19 cases following premature loss of a primary maxillary first molar during the transition from primary to permanent dentition. Materials an...

  13. The Space-Time Conservative Schemes for Large-Scale, Time-Accurate Flow Simulations with Tetrahedral Meshes

    Science.gov (United States)

    Venkatachari, Balaji Shankar; Streett, Craig L.; Chang, Chau-Lyan; Friedlander, David J.; Wang, Xiao-Yen; Chang, Sin-Chung

    2016-01-01

    Despite decades of development of unstructured mesh methods, high-fidelity time-accurate simulations are still predominantly carried out on structured, or unstructured hexahedral meshes by using high-order finite-difference, weighted essentially non-oscillatory (WENO), or hybrid schemes formed by their combinations. In this work, the space-time conservation element solution element (CESE) method is used to simulate several flow problems including supersonic jet/shock interaction and its impact on launch vehicle acoustics, and direct numerical simulations of turbulent flows using tetrahedral meshes. This paper provides a status report for the continuing development of the space-time conservation element solution element (CESE) numerical and software framework under the Revolutionary Computational Aerosciences (RCA) project. Solution accuracy and large-scale parallel performance of the numerical framework is assessed with the goal of providing a viable paradigm for future high-fidelity flow physics simulations.

  14. First intramuscular administration in the U.S. space program. [of motion sickness drugs

    Science.gov (United States)

    Bagian, James P.

    1991-01-01

    In the past, the only kind of medicines used for symptomatic treatment of space motion sickness (SMS) in space had been oral, transdermal, or suppositories. This paper describes the effect of the first intramuscular (IM) administration of Phenergan (50-mg in single dose) on SMS in one subject who exhibited grade-3 symptoms and signs which persisted unabated throughout the first and the second flight days aboard the Space Shuttle. Thirty minutes after the injection, the subject had completely recovered. His symptoms were gone, his appetite was back, and he had no recurrences for the remainder of the flight. Since that experiment, intramuscular injections have been given nine more times on subsequent flights, with similar results.

  15. Evaluation of SPACE code for simulation of inadvertent opening of spray valve in Shin Kori unit 1

    International Nuclear Information System (INIS)

    Kim, Seyun; Youn, Bumsoo

    2013-01-01

    The SPACE code is expected to be applied to safety analysis for LOCA (Loss of Coolant Accident) and non-LOCA scenarios. SPACE solves two-fluid, three-field governing equations and is programmed in the C++ language using object-oriented concepts. To evaluate its analysis capability for transient phenomena in an actual nuclear power plant, an inadvertent opening of a spray valve during the startup test phase of Shin Kori unit 1 was simulated with the SPACE code.

  16. Design space development for the extraction process of Danhong injection using a Monte Carlo simulation method.

    Directory of Open Access Journals (Sweden)

    Xingchu Gong

    Full Text Available A design space approach was applied to optimize the extraction process of Danhong injection. Dry matter yield and the yields of five active ingredients were selected as process critical quality attributes (CQAs). Extraction number, extraction time, and the mass ratio of water and material (W/M ratio) were selected as critical process parameters (CPPs). Quadratic models between CPPs and CQAs were developed with determination coefficients higher than 0.94. Active ingredient yields and dry matter yield increased as the extraction number increased. Monte Carlo simulation with models established using a stepwise regression method was applied to calculate the probability-based design space. Step length showed little effect on the calculation results. A higher number of simulations led to results with lower dispersion. Data generated in a Monte Carlo simulation following a normal distribution led to a design space with a smaller size. An optimized calculation condition was obtained with 10,000 simulations, a calculation step length of 0.01, a significance level of 0.35 for adding or removing terms in the stepwise regression, and a normal distribution for data generation. The design space with a probability higher than 0.95 of attaining the CQA criteria was calculated and verified successfully. Normal operating ranges of 8.2-10 g/g W/M ratio, 1.25-1.63 h extraction time, and two extractions were recommended. The optimized calculation conditions can conveniently be used in design space development for other pharmaceutical processes.
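
    A probability-based design space boils down to estimating, for each candidate operating point, the probability that the model-predicted CQAs meet their criteria under uncertainty. A minimal Monte Carlo sketch follows; the quadratic model, noise level and criterion are made up for illustration and are not the paper's fitted models.

```python
import random

def attainment_probability(w_m_ratio, time_h, n_extr, model,
                           criterion, n_sim=10000, cv=0.05):
    """Fraction of Monte Carlo draws in which the simulated CQA meets
    its criterion, with normally distributed model error (coefficient
    of variation cv) around the model prediction."""
    nominal = model(w_m_ratio, time_h, n_extr)
    hits = sum(1 for _ in range(n_sim)
               if random.gauss(nominal, cv * nominal) >= criterion)
    return hits / n_sim

# Hypothetical quadratic CPP-to-CQA model (illustrative only)
def toy_yield(w, t, n):
    return 10 + 2.0 * w + 4.0 * t + 6.0 * n - 0.08 * w * w - 1.0 * t * t

random.seed(1)
p = attainment_probability(9.0, 1.4, 2, toy_yield, criterion=35.0)
```

    Scanning such probabilities over a grid of CPP values and keeping the points where the probability exceeds 0.95 yields a probability-based design space of the kind described above.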

  17. Book review of Capital in the Twenty-First Century, by Thomas Piketty. Cambridge, Massachusetts, London, England: The Belknap Press of Harvard Press, 2014, 605 pages

    Directory of Open Access Journals (Sweden)

    Paul Dobrescu

    2015-04-01

    Full Text Available “Every now and then, the field of economics produces an important book; this is one of them” (Cowen, 2014). These are the opening words of Tyler Cowen’s presentation of Thomas Piketty’s work, “Capital in the Twenty-First Century” (Piketty, 2014), in Foreign Affairs. This is a book that is visibly placed in all important bookstores around the world, widely debated, acclaimed, and sold (over 1 million copies so far). It has been favorably reviewed or quoted in all major journals. The assessment of “Capital in the Twenty-First Century” by Paul Krugman, Nobel Economics Prize laureate, as a “magnificent, sweeping meditation on inequality” is highly relevant: “This is a book that will change both the way we think about society and the way we do economics” (Krugman, 2014). Finally, Piketty’s book is included in the lists of the year’s best books of prestigious journals such as The Economist, Financial Times, The Washington Post, Observer, The Independent, and Daily Telegraph; Financial Times and McKinsey have hailed it as the best book of 2014.

  18. CHINA'S FIRST 'KISS' IN SPACE

    Institute of Scientific and Technical Information of China (English)

    YIN PUMIN

    2011-01-01

    At 1:36 a.m. on November 3, nearly two days after it was launched, the unmanned spacecraft Shenzhou 8 docked with the space lab module Tiangong-1, or Heavenly Palace-1. The docking represents another significant milestone for China's space program.

  19. Numerical simulation of electromagnetic waves in Schwarzschild space-time by finite difference time domain method and Green function method

    Science.gov (United States)

    Jia, Shouqing; La, Dongsheng; Ma, Xuelian

    2018-04-01

    The finite difference time domain (FDTD) algorithm and Green function algorithm are implemented into the numerical simulation of electromagnetic waves in Schwarzschild space-time. FDTD method in curved space-time is developed by filling the flat space-time with an equivalent medium. Green function in curved space-time is obtained by solving transport equations. Simulation results validate both the FDTD code and Green function code. The methods developed in this paper offer a tool to solve electromagnetic scattering problems.
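
    The equivalent-medium idea, i.e. running a flat-space FDTD update in which the medium parameters carry the curvature information, can be illustrated in one dimension. The sketch below is an ordinary 1-D Yee update with a spatially varying permittivity standing in for the equivalent medium; the grid size, source and normalized units are illustrative assumptions, not the paper's setup.

```python
import math

def fdtd_1d(nz=200, steps=400, eps=None):
    """Minimal 1-D FDTD (Yee) loop in normalized units with Courant
    number 0.5. A spatially varying relative permittivity eps[] plays
    the role of the 'equivalent medium': curvature effects would enter
    the flat-space update only through the medium parameters."""
    eps = eps or [1.0] * nz       # vacuum by default
    ez = [0.0] * nz
    hy = [0.0] * nz
    src = nz // 4
    for t in range(steps):
        # magnetic-field update
        for k in range(nz - 1):
            hy[k] += 0.5 * (ez[k + 1] - ez[k])
        # electric-field update inside the (equivalent) medium
        for k in range(1, nz):
            ez[k] += 0.5 / eps[k] * (hy[k] - hy[k - 1])
        # soft Gaussian-pulse source
        ez[src] += math.exp(-((t - 30) / 10.0) ** 2)
    return ez
```

    In the paper's setting, the permittivity and permeability profiles would be derived from the Schwarzschild metric rather than chosen by hand.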

  20. LIFE experiment: isolation of cryptoendolithic organisms from Antarctic colonized sandstone exposed to space and simulated Mars conditions on the international space station.

    Science.gov (United States)

    Scalzi, Giuliano; Selbmann, Laura; Zucconi, Laura; Rabbow, Elke; Horneck, Gerda; Albertano, Patrizia; Onofri, Silvano

    2012-06-01

    Desiccated Antarctic rocks colonized by cryptoendolithic communities were exposed on the International Space Station (ISS) to space and simulated Mars conditions (LiFE: Lichens and Fungi Experiment). After 1.5 years in space, samples were retrieved, rehydrated and spread on different culture media. Colonies of a green alga and a pink-coloured fungus developed on Malt-Agar medium; they were isolated from a sample exposed to simulated Mars conditions beneath a 0.1% T Suprasil neutral density filter and from a sample exposed to space vacuum without solar radiation exposure, respectively. None of the other flight samples showed any growth after incubation. The two organisms able to grow were identified at genus level by Small SubUnit (SSU) and Internal Transcribed Spacer (ITS) rDNA sequencing as Stichococcus sp. (a green alga) and Acarospora sp. (a lichenized fungal genus), respectively. The data in the present study provide experimental information on the possibility of eukaryotic life transfer from one planet to another by means of rocks, and of survival in the Mars environment.

  1. Unique migration of a dental needle into the parapharyngeal space: successful removal by an intraoral approach and simulation for tracking visibility in X-ray fluoroscopy.

    Science.gov (United States)

    Okumura, Yuri; Hidaka, Hiroshi; Seiji, Kazumasa; Nomura, Kazuhiro; Takata, Yusuke; Suzuki, Takahiro; Katori, Yukio

    2015-02-01

    The first objective was to describe a novel case of migration of a broken dental needle into the parapharyngeal space. The second was to address the importance of simulation elucidating visualization of such a thin needle under X-ray fluoroscopy. Clinical case records (including computed tomography [CT] and surgical approaches) were reviewed, and a simulation experiment using a head phantom was conducted using the same settings applied intraoperatively. A 36-year-old man was referred after failure to locate a broken 31-G dental needle. Computed tomography revealed migration of the needle into the parapharyngeal space. Intraoperative X-ray fluoroscopy failed to identify the needle, so a steel wire was applied as a reference during X-ray to locate the foreign body. The needle was successfully removed using an intraoral approach with tonsillectomy under surgical microscopy. The simulation showed that the dental needle was able to be identified only after applying an appropriate compensating filter, contrasting with the steel wire. Meticulous preoperative simulation regarding visual identification of dental needle foreign bodies is mandatory. Intraoperative radiography and an intraoral approach with tonsillectomy under surgical microscopy offer benefits for accessing the parapharyngeal space, specifically for cases medial to the great vessels. © The Author(s) 2014.

  2. Simulations of the observation of clouds and aerosols with the Experimental Lidar in Space Equipment system.

    Science.gov (United States)

    Liu, Z; Voelger, P; Sugimoto, N

    2000-06-20

    We carried out a simulation study for the observation of clouds and aerosols with the Japanese Experimental Lidar in Space Equipment (ELISE), which is a two-wavelength backscatter lidar with three detection channels. The National Space Development Agency of Japan plans to launch the ELISE on the Mission Demonstrate Satellite 2 (MDS-2). In the simulations, the lidar return signals for the ELISE are calculated for an artificial, two-dimensional atmospheric model including different types of clouds and aerosols. The signal detection processes are simulated realistically by inclusion of various sources of noise. The lidar signals that are generated are then used as input for simulations of data analysis with inversion algorithms to investigate retrieval of the optical properties of clouds and aerosols. The results demonstrate that the ELISE can provide global data on the structures and optical properties of clouds and aerosols. We also conducted an analysis of the effects of cloud inhomogeneity on retrievals from averaged lidar profiles. We show that the effects are significant for space lidar observations of optically thick broken clouds.
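    The core of such a forward simulation is the single-scattering lidar equation, P(z) ∝ C β(z) exp(−2τ(z))/z², plus detection noise. The sketch below is a generic illustration of that model, not the ELISE instrument model: the system constant, the scattering layer, and the lidar ratio are invented for the example.

```python
import math
import random

def lidar_return(z_km, beta, alpha, c_sys=1e6):
    """Noise-free single-scattering lidar return (arbitrary units):
    P(z) = C * beta(z) * exp(-2 * tau(z)) / z^2, where tau is the
    one-way optical depth accumulated out to range z."""
    dz = z_km[1] - z_km[0]
    tau = 0.0
    power = []
    for z, b, a in zip(z_km, beta, alpha):
        tau += a * dz
        power.append(c_sys * b * math.exp(-2.0 * tau) / z ** 2)
    return power

def add_shot_noise(power, rng=random):
    """Photon-counting (shot) noise in the Gaussian approximation."""
    return [max(0.0, p + rng.gauss(0.0, math.sqrt(p))) for p in power]

# A scattering layer (e.g. an aerosol plume) between 2 and 3 km
# embedded in a clean background atmosphere.
z = [0.1 * i for i in range(1, 101)]                  # ranges 0.1 .. 10 km
beta = [1e-3 if 2.0 <= zi <= 3.0 else 1e-5 for zi in z]
alpha = [50.0 * b for b in beta]                      # lidar ratio of 50 sr
random.seed(0)
clean = lidar_return(z, beta, alpha)
noisy = add_shot_noise(clean)
```

Inversion algorithms of the kind tested in the paper would then try to recover beta and alpha from the noisy profile.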

  3. Average accelerator simulation Truebeam using phase space in IAEA format

    International Nuclear Information System (INIS)

    Santana, Emico Ferreira; Milian, Felix Mas; Paixao, Paulo Oliveira; Costa, Raranna Alves da; Velasco, Fermin Garcia

    2015-01-01

    In this paper, a computational radiation-transport code based on the Monte Carlo technique is used to model a linear accelerator for radiotherapy treatment. This work is the initial step of future proposals aiming to study several radiotherapy treatments of patients through computational modeling, in cooperation with the institutions UESC, IPEN, UFRJ and COI. The chosen simulation code is GATE/Geant4. The modeled accelerator is the Varian TrueBeam. The geometric modeling was based on technical manuals, and the radiation sources on the photon phase-space files provided by the manufacturer in the IAEA (International Atomic Energy Agency) format. The simulations were carried out under the same conditions as the experimental measurements. Photon beams of 6 MV with a 10 × 10 cm field, incident on a water phantom, were studied. For validation, depth-dose curves and lateral profiles at different depths were compared between the simulated results and the experimental data. The final model of this accelerator will be used in future works involving treatments of real patients. (author)
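    Validation of a model like this usually comes down to comparing normalized percent depth-dose (PDD) curves and profiles point by point. A minimal sketch of that comparison, with invented dose samples standing in for the simulated and measured 6 MV curves:

```python
def percent_depth_dose(dose):
    """Normalize a depth-dose curve to its maximum (PDD, in %)."""
    peak = max(dose)
    return [100.0 * d / peak for d in dose]

def max_point_difference(pdd_a, pdd_b):
    """Largest absolute point-by-point difference between two PDDs (%)."""
    return max(abs(a - b) for a, b in zip(pdd_a, pdd_b))

# Invented dose samples at matching depths (not real TrueBeam data).
simulated = [50.0, 95.0, 100.0, 98.0, 90.0, 80.0]
measured = [52.0, 94.0, 100.0, 97.5, 89.0, 81.0]
pdd_sim = percent_depth_dose(simulated)
pdd_meas = percent_depth_dose(measured)
delta = max_point_difference(pdd_sim, pdd_meas)  # worst disagreement, in %
```

Clinical validation protocols typically use stricter criteria (e.g. gamma analysis), but the normalize-and-compare step above is the common core.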

  4. Navigating the Problem Space: The Medium of Simulation Games in the Teaching of History

    Science.gov (United States)

    McCall, Jeremiah

    2012-01-01

    Simulation games can play a critical role in enabling students to navigate the problem spaces of the past while simultaneously critiquing the models designers offer to represent those problem spaces. There is much to be gained through their use. This includes rich opportunities for students to engage the past as independent historians; to consider…

  5. CONFRONTING THREE-DIMENSIONAL TIME-DEPENDENT JET SIMULATIONS WITH HUBBLE SPACE TELESCOPE OBSERVATIONS

    International Nuclear Information System (INIS)

    Staff, Jan E.; Niebergal, Brian P.; Ouyed, Rachid; Pudritz, Ralph E.; Cai, Kai

    2010-01-01

    We perform state-of-the-art, three-dimensional, time-dependent simulations of magnetized disk winds, carried out to simulation scales of 60 AU, in order to confront optical Hubble Space Telescope observations of protostellar jets. We 'observe' the optical forbidden line emission produced by shocks within our simulated jets and compare these with actual observations. Our simulations reproduce the rich structure of time-varying jets, including jet rotation far from the source, an inner (up to 400 km s⁻¹) and outer (less than 100 km s⁻¹) component of the jet, and jet widths of up to 20 AU, in agreement with observed jets. These simulations, when compared with the data, are able to constrain disk wind models. In particular, models featuring a disk magnetic field with a modest radial spatial variation across the disk are favored.

  6. Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century

    Science.gov (United States)

    Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.

    2011-01-01

    Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of the rare and extirpated states each progressively increased but remained below 10%; the summed probability of the vulnerable, rare, and extirpated states rose from 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.
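    The mechanics of such a Bayesian network can be illustrated by propagating stressor probabilities through a conditional probability table and marginalizing over the parent nodes. The toy network below is purely illustrative; its nodes, states and numbers are invented and far simpler than the published walrus model:

```python
from itertools import product

# Purely illustrative network: two stressor nodes feed one outcome node.
p_ice_loss = {"low": 0.3, "high": 0.7}   # P(sea-ice loss severity)
p_harvest = {"low": 0.6, "high": 0.4}    # P(harvest pressure)

# Conditional probability table: P(outcome | ice loss, harvest)
cpt = {
    ("low", "low"):   {"robust": 0.70, "vulnerable": 0.25, "rare": 0.05},
    ("low", "high"):  {"robust": 0.50, "vulnerable": 0.40, "rare": 0.10},
    ("high", "low"):  {"robust": 0.30, "vulnerable": 0.50, "rare": 0.20},
    ("high", "high"): {"robust": 0.10, "vulnerable": 0.55, "rare": 0.35},
}

def marginal_outcome():
    """Marginalize the outcome node over both parent nodes."""
    out = {"robust": 0.0, "vulnerable": 0.0, "rare": 0.0}
    for ice, harvest in product(p_ice_loss, p_harvest):
        weight = p_ice_loss[ice] * p_harvest[harvest]
        for state, p in cpt[(ice, harvest)].items():
            out[state] += weight * p
    return out
```

Running different future periods then amounts to shifting the parent-node probabilities and recomputing the marginal.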

  7. Developing twenty-first century skills: insights from an intensive interdisciplinary workshop Mosaic of Life

    Directory of Open Access Journals (Sweden)

    Tamara Milosevic

    2013-11-01

    The Baltic Sea, one of the world’s largest semi-enclosed seas, with its very low salinity and quasi-isolation from the open oceans, cannot decide whether it is a sea or a large lake. This geologically unique environment supports an even more surprising and delicate marine ecosystem, in which a complex community of fishes, marine mammals and important microscopic organisms creates a magical mosaic of life. Humans have enjoyed the abundance of life in the Baltic Sea for thousands of years, and major Scandinavian and Baltic cities have oriented themselves towards this geo-ecosystem in order to develop and to seek ecological, economical and cultural inspiration and wealth. The ‘Mosaic of Life’ workshop aimed to go beyond the obvious in examining the meaning of the Baltic Sea by gathering together a selection of young, creative minds from different backgrounds ranging from the arts and economics to geology and the life sciences. This intensive workshop was designed as a unique training opportunity to develop essential twenty-first century skills: to introduce and develop creative, critical and interdisciplinary thinking and collaborative teamwork, as well as to foster visual and scientific literacy, using project-based learning and hands-on activities. Our final goal has been to be inspired by the resulting connections, differences and unifying concepts, creating innovative, interdisciplinary projects which would look further than the sea – further than the eye can see and further into the future.

  8. Us, them, and others: reflections on Canadian multiculturalism and national identity at the turn of the twenty-first century.

    Science.gov (United States)

    Winter, Elke

    2014-05-01

    The John Porter Lecture at the annual meeting of the Canadian Sociological Association in Victoria 2013 draws upon my book Us, Them, and Others: Pluralism and National Identity in Diverse Societies. Incorporating the findings from an analysis of Canadian English-language newspaper discourses during the 1990s into a theoretical framework inspired by Weberian sociology, the book argues that pluralism is best understood as a dynamic set of triangular relations where the compromise between unequal groups--"us" and "others"--is rendered meaningful through the confrontation with real or imagined outsiders ("them"). The lecture summarizes the theoretical contribution and explains how multiculturalism became consolidated in dominant Canadian discourses in the late 1990s. The lecture then discusses changes to Canadian multicultural identity at the beginning of the twenty-first century.

  9. First results from the IllustrisTNG simulations: the galaxy colour bimodality

    Science.gov (United States)

    Nelson, Dylan; Pillepich, Annalisa; Springel, Volker; Weinberger, Rainer; Hernquist, Lars; Pakmor, Rüdiger; Genel, Shy; Torrey, Paul; Vogelsberger, Mark; Kauffmann, Guinevere; Marinacci, Federico; Naiman, Jill

    2018-03-01

    We introduce the first two simulations of the IllustrisTNG project, a next generation of cosmological magnetohydrodynamical simulations, focusing on the optical colours of galaxies. We explore TNG100, a rerun of the original Illustris box, and TNG300, which includes 2 × 2500³ resolution elements in a volume 20 times larger. Here, we present first results on the galaxy colour bimodality at low redshift. Accounting for the attenuation of stellar light by dust, we compare the simulated (g − r) colours of galaxies more massive than 10⁹ M⊙ against observations. Massive galaxies (above ~10¹¹ M⊙) redden at z < 1 and acquire most of their z = 0 mass post-reddening; at the same time, ~18 per cent of such massive galaxies acquire half or more of their final stellar mass while on the red sequence.

  10. First results from simulations of supersymmetric lattices

    Science.gov (United States)

    Catterall, Simon

    2009-01-01

    We conduct the first numerical simulations of lattice theories with exact supersymmetry arising from the orbifold constructions of Cohen, Kaplan, Katz and Ünsal. We consider the Q = 4 theory in D = 0,2 dimensions and the Q = 16 theory in D = 0,2,4 dimensions. We show that the U(N) theories do not possess vacua which are stable non-perturbatively, but that this problem can be circumvented after truncation to SU(N). We measure the distribution of scalar field eigenvalues, the spectrum of the fermion operator and the phase of the Pfaffian arising after integration over the fermions. We monitor supersymmetry breaking effects by measuring a simple Ward identity. Our results indicate that simulations of N = 4 super Yang-Mills may be achievable in the near future.

  11. The birth of NASA the work of the Space Task Group, America's first true space pioneers

    CERN Document Server

    von Ehrenfried, Dutch

    2016-01-01

    This is the story of the work of the original NASA space pioneers: men and women who were suddenly organized in 1958 from the then National Advisory Committee for Aeronautics (NACA) into the Space Task Group. A relatively small group, they developed the initial mission concept plans and procedures for the U.S. space program. Then they boldly built hardware and facilities to accomplish those missions. The group existed only three years before they were transferred to the Manned Spacecraft Center in Houston, Texas, in 1962, but their organization left a large mark on what would follow. Von Ehrenfried's personal experience with the STG at Langley uniquely positions him to describe the way the group was structured and how it reacted to the new demands of a post-Sputnik era. He artfully analyzes how the growing space program was managed and what techniques enabled it to develop so quickly from an operations perspective. The result is a fascinating window into history, amply backed up by first person documentation ...

  12. Latvian Security and Defense Policy within the Twenty-First Century Security Environment

    Directory of Open Access Journals (Sweden)

    Rublovskis Raimonds

    2014-12-01

    The aim of this paper is to analyze the fundamental factors which form and profoundly shape the security and defense policy of the Republic of Latvia. One can argue that the historical background, the geographical location, the common institutional history within the former Soviet Union, the Russia factor, the relative smallness of the state's territory and population, the ethnic composition of the population, the low population density, and the rather limited financial and manpower resources available for defense are the key factors influencing the state's security and defense policy. The core principles of the security and defense policy of Latvia are membership in the powerful global military alliance of NATO and the bilateral strategic partnership with the United States. However, security and defense cooperation among the three Baltic States, as well as enhanced cooperation within the Baltic-Nordic framework, is seen as an important supplementary factor for the increased security of the Republic of Latvia. Latvia has developed a sustainable legal and institutional framework to contribute to state security and defense; however, security challenges and significant changes within the global security environment of the twenty-first century will further challenge the ability of the Republic of Latvia to sustain its current legal framework and, more importantly, the current institutional structure of Latvian security and defense architecture. Significant internal and external challenges will impact the fundamental pillars of Latvian security and defense policy, such as the American strategic shift to the Pacific and the lack of political will to increase defense budgets in the European part of NATO.
It has to be clear that the very independence, security and defense of the Republic of Latvia depend on the ability of NATO to remain an effective organization with timely and efficient decision-making, and the ability of the United States to remain

  13. Equilibration and analysis of first-principles molecular dynamics simulations of water

    Science.gov (United States)

    Dawson, William; Gygi, François

    2018-03-01

    First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
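    The Gelman-Rubin diagnostic mentioned above compares between-chain and within-chain variance across independent runs; values near 1 suggest the chains sample the same distribution. A self-contained sketch on synthetic Gaussian 'trajectories' (standing in for MD observables such as the Kohn-Sham energy):

```python
import random
from statistics import mean, variance

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n.
    Values close to 1 indicate the chains sample the same distribution."""
    m, n = len(chains), len(chains[0])
    chain_means = [mean(c) for c in chains]
    grand_mean = mean(chain_means)
    b = n / (m - 1) * sum((cm - grand_mean) ** 2 for cm in chain_means)  # between-chain
    w = mean(variance(c) for c in chains)                                # mean within-chain
    var_hat = (n - 1) / n * w + b / n
    return (var_hat / w) ** 0.5

random.seed(1)
# Four well-mixed chains vs. four chains stuck around different means.
mixed = [[random.gauss(0.0, 1.0) for _ in range(500)] for _ in range(4)]
stuck = [[random.gauss(mu, 1.0) for _ in range(500)] for mu in (0.0, 5.0, 0.0, 5.0)]
r_mixed = gelman_rubin(mixed)
r_stuck = gelman_rubin(stuck)
```

In the paper's setting, "chains" are the 32 independent FPMD runs and the diagnostic flags insufficient equilibration before averages are trusted.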

  14. The Fraunhofer Quantum Computing Portal - www.qc.fraunhofer.de: A web-based simulator of quantum computing processes

    OpenAIRE

    Rosé, H.; Asselmeyer-Maluga, T.; Kolbe, M.; Niehörster, F.; Schramm, A.

    2004-01-01

    Fraunhofer FIRST is developing a computing service and collaborative workspace that provides a convenient tool for the simulation and investigation of quantum algorithms. To push the twenty-qubit limit of workstation-based simulations into the next qubit decade, we provide a dedicated high-memory Linux cluster with a fast Myrinet interconnection network, together with an adapted parallel simulator engine. This simulation service, supplemented by a collaborative workspace, is usable everywhere via web interfa...

  15. Numerical Simulation of Ionospheric Disturbances Generated by the Chelyabinsk and Tunguska Space Body Impacts

    Science.gov (United States)

    Shuvalov, V. V.; Khazins, V. M.

    2018-03-01

    Numerical simulation of atmospheric disturbances during the first hours after the Chelyabinsk and Tunguska space body impacts has been carried out. The results of detailed calculations, including the stages of destruction, evaporation and deceleration of the cosmic body, the generation of atmospheric disturbances and their propagation over distances of thousands of kilometers, have been compared with the results of spherical explosions with energy equal to the kinetic energy of the meteoroids. It has been shown that, in the case of the Chelyabinsk meteorite, the explosive analogy provides acceptable dimensions of the perturbed region and of the perturbation amplitude. For the more powerful Tunguska fall, the resulting atmospheric flow is very different from the explosive one: an atmospheric plume emerges that lifts matter from the meteor trail to an altitude of the order of a thousand kilometers.
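    The explosive analogy amounts to equating the impactor's kinetic energy with an explosion yield. A back-of-the-envelope sketch; the diameter, density, and entry speed below are rough literature-style values for Chelyabinsk, used only for illustration:

```python
import math

KT_TNT_J = 4.184e12  # joules per kiloton of TNT

def impact_energy_kt(diameter_m, density_kg_m3, speed_m_s):
    """Kinetic energy of a spherical impactor, in kilotons of TNT:
    E = 0.5 * m * v^2 with m = rho * (pi/6) * d^3."""
    volume = math.pi / 6.0 * diameter_m ** 3
    mass = density_kg_m3 * volume
    return 0.5 * mass * speed_m_s ** 2 / KT_TNT_J

# Rough, illustrative values: ~19 m stony (chondritic) body at ~19 km/s.
chelyabinsk_kt = impact_energy_kt(19.0, 3300.0, 19000.0)
```

With these inputs the yield comes out in the hundreds of kilotons, the energy scale used for the equivalent spherical explosion in such comparisons.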

  16. Simulation of space protons influence on silicon semiconductor devices using gamma-neutron irradiation

    International Nuclear Information System (INIS)

    Zhukov, Y.N.; Zinchenko, V.F.; Ulimov, V.N.

    1999-01-01

    In this study the authors focus on the problems of simulating the space proton energy spectra in laboratory gamma-neutron radiation tests of semiconductor devices (SDs). A correct simulation of radiation effects requires taking into account and evaluating substantial differences in the processes of primary defect formation in SDs in the space environment and under laboratory testing. These differences concern: 1) displacement defects, 2) ionization defects and 3) radiation intensity. The study shows that: 1) the energy dependence of nonionizing energy loss (NIEL) is sufficiently universal to predict the degradation of SD parameters associated with displacement defects, and 2) MOS devices that are sensitive to ionization defects showed the same variation of parameters under conditions of equal ionization density generated by proton and gamma radiation. (A.C.)
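    NIEL scaling of this kind folds a proton spectrum into a single damage-equivalent fluence by weighting each energy bin with its NIEL relative to a reference particle. In the sketch below the proton NIEL values are placeholders of a plausible order of magnitude, not tabulated data; only the scaling procedure is the point:

```python
# Placeholder proton NIEL values in silicon (MeV*cm^2/g) -- order-of-magnitude
# stand-ins for illustration, not tabulated data.
PROTON_NIEL_SI = {10.0: 4.0e-3, 50.0: 1.3e-3, 100.0: 9.0e-4}
NIEL_1MEV_NEUTRON_SI = 2.04e-3  # commonly quoted silicon reference value

def equivalent_neutron_fluence(proton_spectrum):
    """Fold a proton spectrum {energy_MeV: fluence_cm^-2} into a single
    1-MeV-neutron-equivalent fluence via NIEL scaling."""
    return sum(phi * PROTON_NIEL_SI[e] / NIEL_1MEV_NEUTRON_SI
               for e, phi in proton_spectrum.items())

spectrum = {10.0: 1e10, 50.0: 5e10, 100.0: 2e10}  # invented mission fluences
phi_eq = equivalent_neutron_fluence(spectrum)
```

A ground test with that neutron-equivalent fluence is then expected to reproduce the displacement-damage part of the space degradation.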

  17. Use of Parallel Micro-Platform for the Simulation the Space Exploration

    Science.gov (United States)

    Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen

    The purpose of this work is to create a parallel micro-platform that simulates the virtual movements of a space exploration in 3D. One of the innovations presented in this design is the application of a lever mechanism for the transmission of movement. The development of such a robot is a challenging task, very different from industrial manipulators due to a totally different system of target requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model was developed using the computer-aided design platform Unigraphics, in which the geometric modeling of each component and of the final assembly (CAD) was carried out, the files for the computer-aided manufacture (CAM) of each piece were generated, and the kinematics of the system were simulated under different driving schemes. We used the MATLAB aerospace toolbox and created an adaptive control module to simulate the system.

  18. Simulation of the space debris environment in LEO using a simplified approach

    Science.gov (United States)

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico

    2017-01-01

    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach. So in many different scenarios different objects are fragmented and contribute to a different version of the space debris environment. The results of the single Monte-Carlo runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, in general the average of all MC runs together with its standard deviation is used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS) a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources. The natural decay and post mission disposal measures are the only sink mechanisms. This method reduces the computational costs tremendously. In order to achieve this benefit a few simplifications have been applied. The approach of the model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are also divided into diameter bins. As an extension to a previously presented model the eccentricity has additionally been taken into account with 67 eccentricity bins. While a set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper parameters have been derived so that the model is able to reflect the results of the numerical MC
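    Reduced to a single altitude shell, the source-sink approach described above becomes one ordinary differential equation stepped with the explicit Euler method: launches act as a source, atmospheric decay as a sink, and collisions as a quadratic fragment source. All parameters below are invented for illustration and are not the IRAS model's values:

```python
def evolve_debris(n0, launches, decay_rate, collision_coeff, dt_years, t_end):
    """Single-shell toy source-sink model, explicit Euler integration of
        dN/dt = launches - decay_rate * N + collision_coeff * N^2
    where the quadratic term is a crude collision fragment source."""
    n, t = float(n0), 0.0
    history = [n]
    while t < t_end:
        n += dt_years * (launches - decay_rate * n + collision_coeff * n * n)
        t += dt_years
        history.append(n)
    return history

# Invented parameters: 80 launches/yr into a shell whose objects decay
# on a ~10-year timescale, with a weak collision feedback term.
hist = evolve_debris(n0=1000.0, launches=80.0, decay_rate=0.1,
                     collision_coeff=1e-6, dt_years=0.5, t_end=50.0)
```

The full model couples many such shells, eccentricity bins and diameter bins, but each equation is integrated in the same way, which is why it runs orders of magnitude faster than propagating every object.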

  19. A laser particulate spectrometer for a space simulation facility

    Science.gov (United States)

    Schmitt, R. J.; Boyd, B. A.; Linford, R. M. F.; Richmond, R. G.

    1975-01-01

    A laser particulate spectrometer (LPS) system was developed to measure the size and speed distributions of particulate contaminants. Detection of the particulates is achieved by means of light scattering and extinction effects using a single laser beam to cover a size range of 0.8 to 275 microns diameter and a speed range of 0.2 to 20 meters/second. The LPS system was designed to operate in the high-vacuum environment of a space simulation chamber with cold shroud temperatures ranging from 77 to 300 K.

  20. Astronomers Make First Images With Space Radio Telescope

    Science.gov (United States)

    1997-07-01

    Marking an important new milestone in radio astronomy history, scientists at the National Radio Astronomy Observatory (NRAO) in Socorro, New Mexico, have made the first images using a radio telescope antenna in space. The images, more than a million times more detailed than those produced by the human eye, used the new Japanese HALCA satellite, working in conjunction with the National Science Foundation's (NSF) Very Long Baseline Array (VLBA) and Very Large Array (VLA) ground-based radio telescopes. The landmark images are the result of a long-term NRAO effort supported by the National Aeronautics and Space Administration (NASA). "This success means that our ability to make detailed radio images of objects in the universe is no longer limited by the size of the Earth," said NRAO Director Paul Vanden Bout. "Astronomy's vision has just become much sharper." HALCA, launched on Feb. 11 by Japan's Institute of Space and Astronautical Science (ISAS), is the first satellite designed for radio astronomy imaging. It is part of an international collaboration led by ISAS and backed by NRAO; Japan's National Astronomical Observatory; NASA's Jet Propulsion Laboratory (JPL); the Canadian Space Agency; the Australia Telescope National Facility; the European VLBI Network and the Joint Institute for Very Long Baseline Interferometry in Europe. On May 22, HALCA observed a distant active galaxy called PKS 1519-273, while the VLBA and VLA also observed it. Data from the satellite was received by a tracking station at the NRAO facility in Green Bank, West Virginia. Tape-recorded data from the satellite and from the radio telescopes on the ground were sent to NRAO's Array Operations Center (AOC) in Socorro, NM. In Socorro, astronomers and computer scientists used a special-purpose computer to digitally combine the signals from the satellite and the ground telescopes to make them all work together as a single, giant radio telescope. 
This dedicated machine, the VLBA Correlator, built as

  1. Earth Observations from Space: The First 50 Years of Scientific Achievements

    Science.gov (United States)

    2008-01-01

    Observing Earth from space over the past 50 years has fundamentally transformed the way people view our home planet. The image of the "blue marble" is taken for granted now, but it was revolutionary when taken in 1972 by the crew on Apollo 17. Since then the capability to look at Earth from space has grown increasingly sophisticated and has evolved from simple photographs to quantitative measurements of Earth properties such as temperature, concentrations of atmospheric trace gases, and the exact elevation of land and ocean. Imaging Earth from space has resulted in major scientific accomplishments; these observations have led to new discoveries, transformed the Earth sciences, opened new avenues of research, and provided important societal benefits by improving the predictability of Earth system processes. This report highlights the scientific achievements made possible by the first five decades of Earth satellite observations by space-faring nations. It follows on a recent report from the National Research Council (NRC) entitled Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond, also referred to as the "decadal survey." Recognizing the increasing need for space observations, the decadal survey identifies future directions and priorities for Earth observations from space. This companion report was requested by the National Aeronautics and Space Administration (NASA) to highlight, through selected examples, important past contributions of Earth observations from space to our current understanding of the planet.

  2. Wicked Female Characters in Roddy Doyle’s “The Pram”: Revisiting Celtic and Polish Myths in the Context of Twenty-First Century Ireland

    Directory of Open Access Journals (Sweden)

    Burcu Gülüm Tekin

    2015-07-01

    “The Pram” is the only horror story in Roddy Doyle’s collection The Deportees and Other Stories (2007). It is also unique in terms of its approach to Ireland’s multicultural scene in the twenty-first century. Doyle turns the other side of the coin and introduces a migrant caretaker (Alina), who loses her mind due to her employers’ (the O’Reilly family’s) ill-treatment. As a reaction to their scornful attitude, Alina becomes a murderer. Set in the context of twenty-first century Dublin, “The Pram” contains various references to Celtic and Polish mythological female figures (in particular, the Old Hag of Beara and the Boginka), which strengthen the thrilling, mythical elements in the plot. This paper aims to examine the characters’ negative attitude towards migrants in Ireland in the light of the racist discourse present in the story. I will also focus on the story’s female characters and discuss the handicaps of being a female migrant in Ireland. The parallels between the mythical female figures and the protagonist Alina will be another point of analysis. The argument of this paper is that Doyle does not always portray the positive outcomes of a multicultural society. On the contrary, he conveys the perspective of the incoming migrant. “The Pram” stages the obstacles that a female outsider may experience in Ireland and her subsequent transformation as a result of the racism she encounters there.

  3. A Review of Twenty-First Century Higher Education

    Science.gov (United States)

    Chan, Shirley

    2018-01-01

    This article is predominantly concerned with the global challenges associated with managing an academic workforce in an era characterised by increased demand for higher education. In scrutinising global trends in higher education and academic workforce management, the article will address two research questions. First, what are the global trends…

  4. First-Principles Petascale Simulations for Predicting Deflagration to Detonation Transition in Hydrogen-Oxygen Mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Khokhlov, Alexei [Univ. of Chicago, IL (United States). Dept. of Astronomy and Astrophysics. Enrico Fermi Inst.; Austin, Joanna [Argonne National Lab. (ANL), Argonne, IL (United States). Argonne Leadership Computing Facility; Bacon, C. [Univ. of Illinois, Urbana, IL (United States). Dept. of Aerospace Engineering

    2015-03-02

    Hydrogen has emerged as an important fuel across a range of industries as a means of achieving energy independence and to reduce emissions. DDT and the resulting detonation waves in hydrogen-oxygen can have especially catastrophic consequences in a variety of industrial and energy producing settings related to hydrogen. First-principles numerical simulations of flame acceleration and DDT are required for an in-depth understanding of the phenomena and facilitating design of safe hydrogen systems. The goals of this project were (1) to develop first-principles petascale reactive flow Navier-Stokes simulation code for predicting gaseous high-speed combustion and detonation (HSCD) phenomena and (2) demonstrate feasibility of first-principles simulations of rapid flame acceleration and deflagration-to-detonation transition (DDT) in stoichiometric hydrogen-oxygen mixture (2H2 + O2). The goals of the project have been accomplished. We have developed a novel numerical simulation code, named HSCD, for performing first-principles direct numerical simulations of high-speed hydrogen combustion. We carried out a series of validating numerical simulations of inert and reactive shock reflection experiments in shock tubes. We then performed a pilot numerical simulation of flame acceleration in a long pipe. The simulation showed the transition of the rapidly accelerating flame into a detonation. The DDT simulations were performed using BG/Q Mira at the Argonne National Laboratory, currently the fourth fastest super-computer in the world.

  5. First urology simulation boot camp in the United Kingdom

    Directory of Open Access Journals (Sweden)

    C.S. Biyani

    2017-09-01

    Conclusion: This first UK Urology Simulation Boot Camp has demonstrated feasibility and effectiveness in enhancing trainees’ experience. Given this positive feedback, there is good reason to expect that future courses will improve the overall skills of new urology trainees.

  6. Extended phase-space methods for enhanced sampling in molecular simulations: a review

    Directory of Open Access Journals (Sweden)

    Hiroshi eFujisaki

    2015-09-01

    Molecular dynamics simulations are a powerful approach to studying biomolecular conformational changes as well as protein-ligand, protein-protein and protein-DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system will spend most of its time trapped by high energy barriers in restricted regions of the configuration space. Over the years, several techniques have been designed to overcome this problem and enhance space sampling. Here, we review a class of methods that rely on the idea of extending the set of dynamical variables of the system by adding extra ones associated with functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages presented by this approach and how it allows important regions of the free energy landscape to be sampled quickly via automatic exploration.
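    The extended-variable idea behind TAMD can be sketched in a few lines: a slow, hot auxiliary variable z is harmonically coupled to the physical coordinate x and drags it over free-energy barriers. The following is a minimal one-dimensional illustration only; the double-well potential, coupling constant, temperatures, frictions, and step counts are all invented for this sketch and are not taken from the review.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    K = 50.0  # harmonic coupling spring constant (illustrative)

    def force_x(x, z):
        """Force on the physical coordinate x: double-well V(x) = (x^2 - 1)^2
        plus the harmonic coupling to the extended variable z."""
        return -4.0 * x * (x**2 - 1.0) - K * (x - z)

    def force_z(x, z):
        """Force on z comes only from the coupling spring."""
        return K * (x - z)

    def tamd(steps=200_000, dt=2e-3, gamma_x=1.0, gamma_z=20.0,
             kT_x=0.1, kT_z=5.0):
        """Overdamped Langevin TAMD sketch: x evolves at the physical
        temperature kT_x, the slow extended variable z at the high kT_z."""
        x = z = -1.0          # start in the left well
        xs = np.empty(steps)
        for i in range(steps):
            x += dt / gamma_x * force_x(x, z) \
                 + np.sqrt(2.0 * kT_x * dt / gamma_x) * rng.standard_normal()
            z += dt / gamma_z * force_z(x, z) \
                 + np.sqrt(2.0 * kT_z * dt / gamma_z) * rng.standard_normal()
            xs[i] = x
        return xs

    xs = tamd()
    # At kT_x = 0.1 alone, x would rarely cross the barrier at x = 0;
    # dragged by the hot z, the trajectory samples both wells.
    ```

    The large friction gamma_z relative to gamma_x is what enforces the adiabatic separation between the fast physical motion and the slow, high-temperature exploration by z.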

  7. Low urinary albumin excretion in astronauts during space missions

    DEFF Research Database (Denmark)

    Cirillo, Massimo; De Santo, Natale G; Heer, Martina

    2003-01-01

    BACKGROUND: Physiological changes occur in man during space missions, also at the renal level. Proteinuria has been hypothesized for space missions, but research data are missing. METHODS: Urinary albumin, as an index of proteinuria, and other variables were analyzed in 4 astronauts during space missions ... onboard the MIR station and on the ground (control). Mission duration before first urine collection in the four astronauts was 4, 26, 26, and 106 days, respectively. On the ground, data were collected 2 months before the mission in two astronauts and 6 months after in the other astronauts. A total of twenty-two 24-hour urine collections were obtained in space (n per astronaut = 1-14) and on the ground (n per astronaut = 2-12). Urinary albumin was measured by radioimmunoassay. For each astronaut, the mean of data in space and on the ground was defined as the individual average. RESULTS: The individual averages of 24...

  8. Tetrahedral-Mesh Simulation of Turbulent Flows with the Space-Time Conservative Schemes

    Science.gov (United States)

    Chang, Chau-Lyan; Venkatachari, Balaji; Cheng, Gary C.

    2015-01-01

    Direct numerical simulations of turbulent flows are predominantly carried out using structured, hexahedral meshes despite decades of development in unstructured mesh methods. Tetrahedral meshes offer ease of mesh generation around complex geometries and the potential of an orientation-free grid that would provide unbiased small-scale dissipation and more accurate intermediate-scale solutions. However, due to the lack of consistent multi-dimensional numerical formulations in conventional schemes for triangular and tetrahedral meshes at the cell interfaces, numerical issues exist when flow discontinuities or stagnation regions are present. The space-time conservative conservation element solution element (CESE) method - due to its Riemann-solver-free shock capturing capabilities, non-dissipative baseline schemes, and flux conservation in time as well as space - has the potential to simulate turbulent flows more accurately using unstructured tetrahedral meshes. To pave the way towards accurate simulation of shock/turbulent boundary-layer interaction, a series of wave and shock interaction benchmark problems of increasing complexity are computed in this paper with triangular/tetrahedral meshes. Preliminary computations for the normal shock/turbulence interactions are carried out with a relatively coarse mesh, by direct numerical simulation standards, in order to assess other effects such as boundary conditions and the necessity of a buffer domain. The results indicate that qualitative agreement with previous studies can be obtained for flows where strong shocks coexist with unsteady waves displaying a broad range of scales, using a relatively compact computational domain and less stringent requirements for grid clustering near the shock. With the space-time conservation properties, stable solutions without any spurious wave reflections can be obtained without a need for buffer domains near the outflow/farfield boundaries. Computational results for the

  9. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    Science.gov (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight, and potential upgrades to JWST WFS&C, will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented aperture telescopes. Beyond JWST we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  10. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    Science.gov (United States)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  11. Benchmark of Space Charge Simulations and Comparison with Experimental Results for High Intensity, Low Energy Accelerators

    CERN Document Server

    Cousineau, Sarah M

    2005-01-01

    Space charge effects are a major contributor to beam halo and emittance growth leading to beam loss in high intensity, low energy accelerators. As future accelerators strive towards unprecedented levels of beam intensity and beam loss control, a more comprehensive understanding of space charge effects is required. A wealth of simulation tools have been developed for modeling beams in linacs and rings, and with the growing availability of high-speed computing systems, computationally expensive problems that were inconceivable a decade ago are now being handled with relative ease. This has opened the field for realistic simulations of space charge effects, including detailed benchmarks with experimental data. A great deal of effort is being focused in this direction, and several recent benchmark studies have produced remarkably successful results. This paper reviews the achievements in space charge benchmarking in the last few years, and discusses the challenges that remain.

  12. Impact of dynamic vegetation phenology on the simulated pan-Arctic land surface state

    Science.gov (United States)

    Teufel, Bernardo; Sushama, Laxmi; Arora, Vivek K.; Verseghy, Diana

    2018-03-01

    The pan-Arctic land surface is undergoing rapid changes in a warming climate, with near-surface permafrost projected to degrade significantly during the twenty-first century. Vegetation-related feedbacks have the potential to influence the rate of degradation of permafrost. In this study, the impact of dynamic phenology on the pan-Arctic land surface state, particularly near-surface permafrost, for the 1961-2100 period, is assessed by comparing two simulations of the Canadian Land Surface Scheme (CLASS)—one with dynamic phenology, modelled using the Canadian Terrestrial Ecosystem Model (CTEM), and the other with prescribed phenology. These simulations are forced by atmospheric data from a transient climate change simulation of the 5th generation Canadian Regional Climate Model (CRCM5) for the Representative Concentration Pathway 8.5 (RCP8.5). Comparison of the CLASS coupled to CTEM simulation to available observational estimates of plant area index, spatial distribution of permafrost and active layer thickness suggests that the model captures reasonably well the overall distribution of vegetation and permafrost. It is shown that the most important impact of dynamic phenology on the land surface occurs through albedo and it is demonstrated for the first time that vegetation control on albedo during late spring and early summer has the highest potential to impact the degradation of permafrost. While both simulations show extensive near-surface permafrost degradation by the end of the twenty-first century, the strong projected response of vegetation to climate warming and increasing CO2 concentrations in the coupled simulation results in accelerated permafrost degradation in the northernmost continuous permafrost regions.

  13. Real-time graphics for the Space Station Freedom cupola, developed in the Systems Engineering Simulator

    Science.gov (United States)

    Red, Michael T.; Hess, Philip W.

    1989-01-01

    Among the Lyndon B. Johnson Space Center's responsibilities for Space Station Freedom is the cupola. Attached to the resource node, the cupola is a windowed structure that will serve as the space station's secondary control center. From the cupola, operations involving the mobile service center and orbital maneuvering vehicle will be conducted. The Systems Engineering Simulator (SES), located in building 16, activated a real-time man-in-the-loop cupola simulator in November 1987. The SES cupola is an engineering tool with the flexibility to evolve in both hardware and software as the final cupola design matures. Two workstations are simulated with closed-circuit television monitors, rotational and translational hand controllers, programmable display pushbuttons, and graphics display with trackball and keyboard. The displays and controls of the SES cupola are driven by a Silicon Graphics Integrated Raster Imaging System (IRIS) 4D/70 GT computer. Through the use of an interactive display builder program, SES, cupola display pages consisting of two dimensional and three dimensional graphics are constructed. These display pages interact with the SES via the IRIS real-time graphics interface. The focus is on the real-time graphics interface applications software developed on the IRIS.

  14. Simulation of space-charge effects in an ungated GEM-based TPC

    Energy Technology Data Exchange (ETDEWEB)

    Böhmer, F.V., E-mail: felix.boehmer@tum.de; Ball, M.; Dørheim, S.; Höppner, C.; Ketzer, B.; Konorov, I.; Neubert, S.; Paul, S.; Rauch, J.; Vandenbroucke, M.

    2013-08-11

    A fundamental limit to the application of Time Projection Chambers (TPCs) in high-rate experiments is the accumulation of slowly drifting ions in the active gas volume, which compromises the homogeneity of the drift field and hence the detector resolution. Conventionally, this problem is overcome by the use of ion-gating structures. This method, however, introduces large dead times and restricts trigger rates to a few hundred per second. The ion gate can be eliminated from the setup by the use of Gas Electron Multiplier (GEM) foils for gas amplification, which intrinsically suppress the backflow of ions. This makes the continuous operation of a TPC at high rates feasible. In this work, Monte Carlo simulations of the buildup of ion space charge in a GEM-based TPC and the correction of the resulting drift distortions are discussed, based on realistic numbers for the ion backflow in a triple-GEM amplification stack. A TPC in the future P̄ANDA experiment at FAIR serves as an example for the experimental environment. The simulations show that space charge densities up to 65 fC cm⁻³ are reached, leading to electron drift distortions of up to 10 mm. The application of a laser calibration system to correct these distortions is investigated. Based on full simulations of the detector physics and response, we show that it is possible to correct for the drift distortions and to maintain the good momentum resolution of the GEM-TPC.

  15. Simulation of space-charge effects in an ungated GEM-based TPC

    International Nuclear Information System (INIS)

    Böhmer, F.V.; Ball, M.; Dørheim, S.; Höppner, C.; Ketzer, B.; Konorov, I.; Neubert, S.; Paul, S.; Rauch, J.; Vandenbroucke, M.

    2013-01-01

    A fundamental limit to the application of Time Projection Chambers (TPCs) in high-rate experiments is the accumulation of slowly drifting ions in the active gas volume, which compromises the homogeneity of the drift field and hence the detector resolution. Conventionally, this problem is overcome by the use of ion-gating structures. This method, however, introduces large dead times and restricts trigger rates to a few hundred per second. The ion gate can be eliminated from the setup by the use of Gas Electron Multiplier (GEM) foils for gas amplification, which intrinsically suppress the backflow of ions. This makes the continuous operation of a TPC at high rates feasible. In this work, Monte Carlo simulations of the buildup of ion space charge in a GEM-based TPC and the correction of the resulting drift distortions are discussed, based on realistic numbers for the ion backflow in a triple-GEM amplification stack. A TPC in the future P̄ANDA experiment at FAIR serves as an example for the experimental environment. The simulations show that space charge densities up to 65 fC cm⁻³ are reached, leading to electron drift distortions of up to 10 mm. The application of a laser calibration system to correct these distortions is investigated. Based on full simulations of the detector physics and response, we show that it is possible to correct for the drift distortions and to maintain the good momentum resolution of the GEM-TPC.

  16. Twenty years of diffraction at the Tevatron

    International Nuclear Information System (INIS)

    Goulianos, K.; Rockefeller U.

    2005-01-01

    Results on diffractive particle interactions from the Fermilab Tevatron p̄p collider are placed in perspective through a QCD-inspired phenomenological approach, which exploits scaling and factorization properties observed in data. The results discussed are those obtained by the CDF Collaboration from a comprehensive set of single-, double-, and multigap soft and hard diffraction processes studied during the twenty-year period since 1985, when the CDF diffractive program was proposed and the first Blois Workshop was held.

  17. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    Science.gov (United States)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
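    The core of any discrete event simulation like the one described above is a time-ordered event queue driving a shared state, here a commodity level through tanking, scrub, and resupply events. The sketch below is a minimal, self-contained illustration of that loop; the event list, tank size, and quantities are invented for illustration and are unrelated to the actual Arena model or Kennedy Space Center ground systems.

    ```python
    import heapq

    def simulate(events):
        """Minimal discrete-event loop: repeatedly pop the earliest event
        and apply its effect to the stored commodity level."""
        queue = list(events)      # (time, name, delta) tuples
        heapq.heapify(queue)
        level = 100.0             # hypothetical initial tank level, arbitrary units
        history = []
        while queue:
            t, name, delta = heapq.heappop(queue)
            level += delta        # delta < 0 means consumption
            history.append((t, name, level))
        return history

    # Hypothetical campaign: tanking for attempt 1, a scrub with boil-off
    # loss, a resupply delivery, then tanking for attempt 2.
    timeline = [
        (0.0,  "tank_attempt_1", -40.0),
        (8.0,  "scrub_boiloff",  -10.0),
        (30.0, "resupply",       +35.0),
        (48.0, "tank_attempt_2", -40.0),
    ]
    history = simulate(timeline)
    # level trace: 60.0 -> 50.0 -> 85.0 -> 45.0, never negative
    ```

    A stochastic study such as the one in the paper would draw the event times and quantities from distributions and run many replications; the queue-driven structure stays the same.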

  18. Lower first permanent molars: developing better predictors of spontaneous space closure.

    Science.gov (United States)

    Teo, Terry Kuo-Yih; Ashley, Paul Francis; Derrick, Donald

    2016-02-01

    First permanent molars (FPMs) of poor prognosis are often planned for extraction at an 'ideal time' so that second permanent molars (SPMs) erupt favourably to replace them. However, for lower FPM extractions, timing alone is not an accurate predictor of success. The aim of this study was to identify additional radiographic factors that could better predict the degree of spontaneous space closure of the lower SPM following FPM extraction. Data from a previous study of 127 lower SPMs from 66 patients were re-analysed by incorporating additional radiographic factors. These included the calcification stage of the bifurcation of the SPM, the position of the second premolar, mesial angulation of the SPM in relation to the FPM, and the presence of the third permanent molar. Results were analysed using ordered logistic regression. Only 58 per cent of FPMs extracted at the 'ideal time' (SPM development at Demirjian stage E) had complete space closure. The best outcomes resulted from a combination of SPMs not at Demirjian development stage G, together with the presence of mesial angulation of the SPM and presence of the third permanent molar, where 85 per cent of those cases had complete space closure. Apart from extraction timing of the FPM, consideration must also be given to the presence of the third permanent molar and the angulation of the SPM in order to ensure a reliable degree of spontaneous space closure of the lower SPM. © The Author 2015. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  19. First experiences of high-fidelity simulation training in junior nursing students in Korea.

    Science.gov (United States)

    Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi

    2015-07-01

    This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and questionnaires of simulation effectiveness after simulation training. Descriptive statistics were used to analyze simulation effectiveness and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing in effectiveness of simulation. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  20. Effects of incentives on psychosocial performances in simulated space-dwelling groups

    Science.gov (United States)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Gasior, Eric D.; Spence, Kevin R.; Emurian, Henry H.

    Prior research with individually isolated 3-person crews in a distributed, interactive, planetary exploration simulation examined the effects of communication constraints and crew configuration changes on crew performance and psychosocial self-report measures. The present report extends these findings to a model of performance maintenance that operationalizes conditions under which disruptive affective responses by crew participants might be anticipated to emerge. Experiments evaluated the effects of changes in incentive conditions on crew performance and self-report measures in simulated space-dwelling groups. Crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crew performance effectiveness was unaffected by either positive or negative incentive conditions, while self-report measures were differentially affected—negative incentive conditions produced pronounced increases in negative self-report ratings and decreases in positive self-report ratings, while positive incentive conditions produced increased positive self-report ratings only. Thus, incentive conditions associated with simulated spaceflight missions can significantly affect psychosocial adaptation without compromising task performance effectiveness in trained and experienced crews.

  1. Private ground infrastructures for space exploration missions simulations

    Science.gov (United States)

    Souchier, Alain

    2010-06-01

    The Mars Society, a private non-profit organisation devoted to promoting exploration of the red planet, decided to implement simulated Mars habitats in two locations on Earth: in northern Canada on the rim of a meteoritic crater (2000), and in a Utah desert in the US, the location of a past Jurassic sea (2001). These habitats have been built to closely resemble the habitats actually planned for the first Mars exploration missions. Participation is open to everybody, whether proposing experiments or wishing only to take part as a crew member. Participants come from different organizations: the Mars Society, universities, and experimenters working with NASA or ESA. The general philosophy of the work conducted is not to do innovative scientific work in the field, but to learn how scientific work is affected or modified by the simulation conditions. Outside activities are conducted with simulated spacesuits that limit the experimenter's abilities. Technology and procedure experiments are also conducted, as well as experiments on crew psychology and behaviour.

  2. Validation of Varian TrueBeam electron phase–spaces for Monte Carlo simulation of MLC-shaped fields

    International Nuclear Information System (INIS)

    Lloyd, Samantha A. M.; Gagne, Isabelle M.; Zavgorodni, Sergei; Bazalova-Carter, Magdalena

    2016-01-01

    Purpose: This work evaluates Varian’s electron phase–space sources for Monte Carlo simulation of the TrueBeam for modulated electron radiation therapy (MERT) and combined, modulated photon and electron radiation therapy (MPERT), where fields are shaped by the photon multileaf collimator (MLC) and delivered at 70 cm SSD. Methods: Monte Carlo simulations performed with EGSnrc-based BEAMnrc/DOSXYZnrc and PENELOPE-based PRIMO are compared against diode measurements for 5 × 5, 10 × 10, and 20 × 20 cm² MLC-shaped fields delivered with 6, 12, and 20 MeV electrons at 70 cm SSD (jaws set to 40 × 40 cm²). Depth dose curves and profiles are examined. In addition, EGSnrc-based simulations of relative output as a function of MLC-field size and jaw position are compared against ion chamber measurements for MLC-shaped fields between 3 × 3 and 25 × 25 cm² and jaw positions that range from the MLC-field size to 40 × 40 cm². Results: Percent depth dose curves generated by BEAMnrc/DOSXYZnrc and PRIMO agree with measurement within 2%, 2 mm except for PRIMO’s 12 MeV, 20 × 20 cm² field, where 90% of dose points agree within 2%, 2 mm. Without the distance-to-agreement criterion, differences between measurement and simulation are as large as 7.3%. Characterization of simulated dose parameters such as FWHM, penumbra width, and depths of 90%, 80%, 50%, and 20% dose agree within 2 mm of measurement for all fields except the FWHM of the 6 MeV, 20 × 20 cm² field, which falls within 2 mm distance to agreement. Differences between simulation and measurement exist in the profile shoulders and penumbra tails, in particular for 10 × 10 and 20 × 20 cm² fields of 20 MeV electrons, where both sets of simulated data fall short of measurement by as much as 3.5%. BEAMnrc/DOSXYZnrc simulated outputs agree with measurement within 2.3% except for 6 MeV MLC-shaped fields, where discrepancies are as great as 5.5%. Conclusions: TrueBeam electron phase–spaces available from Varian have been

  3. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  4. A simulation model for reliability evaluation of Space Station power systems

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kumar, Mudit; Wagner, H.

    1988-01-01

    A detailed simulation model for the hybrid Space Station power system is presented which allows photovoltaic and solar dynamic power sources to be mixed in varying proportions. The model considers the dependence of reliability and storage characteristics during the sun and eclipse periods, and makes it possible to model the charging and discharging of the energy storage modules in a relatively accurate manner on a continuous basis.
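    The charge/discharge bookkeeping over alternating sun and eclipse periods that such a model must perform can be illustrated with a toy energy balance. The orbit durations, array power, load, and storage capacity below are hypothetical placeholders for illustration, not Space Station design values, and the sketch is deterministic where a reliability model would sample component failures.

    ```python
    def simulate_orbit_storage(orbits, sun_min=57.0, eclipse_min=35.0,
                               load_kw=75.0, solar_kw=125.0,
                               capacity_kwh=50.0, soc_kwh=50.0):
        """Toy orbital energy balance: charge storage from the solar surplus
        during the sun period, draw the full load from storage during
        eclipse. Returns final state of charge and the number of orbits in
        which storage was exhausted."""
        shortfalls = 0
        for _ in range(orbits):
            # sunlit arc: surplus charges storage, clipped at capacity
            soc_kwh = min(capacity_kwh,
                          soc_kwh + (solar_kw - load_kw) * sun_min / 60.0)
            # eclipse arc: load is supplied entirely from storage
            soc_kwh -= load_kw * eclipse_min / 60.0
            if soc_kwh < 0.0:
                shortfalls += 1
                soc_kwh = 0.0
        return soc_kwh, shortfalls

    soc, shortfalls = simulate_orbit_storage(orbits=16)
    # With these numbers the 47.5 kWh sunlit surplus covers the
    # 43.75 kWh eclipse draw, so no shortfall occurs.
    ```

    A reliability evaluation in the spirit of the paper would wrap this loop in a Monte Carlo layer that randomly fails and repairs power sources and storage modules, then estimates the probability of an eclipse-period shortfall.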

  5. MAGNETIC NULL POINTS IN KINETIC SIMULATIONS OF SPACE PLASMAS

    International Nuclear Information System (INIS)

    Olshevsky, Vyacheslav; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni; Deca, Jan; Divin, Andrey; Peng, Ivy Bo; Markidis, Stefano

    2016-01-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely created in space plasmas: in all our simulations except lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere the number of spiral nulls prevails over the number of radial nulls by a factor of 3–9. We show that often magnetic nulls do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines reminiscent of Z-pinches efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data

  6. Does the Common Agricultural Policy still make sense in the twenty-first century? CAP after 2013 from the perspective of Poland and Hungary

    Directory of Open Access Journals (Sweden)

    Elżbieta Daszkowska

    2009-01-01

    The EU CAP has developed immensely since the 1960s. However, its current determinants are completely different from those which formed the CAP's foundations. This results mainly from the fact that the EU CAP must meet present-day challenges and threats. Moreover, successive EU enlargements have also significantly influenced the performance of this sector of the economy. It is important to determine whether the existence of the CAP in the twenty-first century still makes sense, and to specify in more detail the directions of CAP reform after 2013 from the perspective of Poland and Hungary.

  7. Primary loop simulation of the SP-100 space nuclear reactor

    International Nuclear Information System (INIS)

    Borges, Eduardo M.; Braz Filho, Francisco A.; Guimaraes, Lamartine N.F.

    2011-01-01

    Between 1983 and 1992, the SP-100 space nuclear reactor development project for electric power generation in the range of 100 to 1000 kWe was conducted in the USA. Several configurations were studied to satisfy different mission objectives and power systems. In this reactor the heat is generated in a compact core and removed by liquid lithium; the primary loop flows are controlled by thermoelectric electromagnetic (EMTE) pumps, and thermoelectric converters produce direct-current energy. To define the system operating point at nominal power, it is necessary to simulate the thermal-hydraulic components of the space nuclear reactor. In this paper the BEMTE-3 computer code is used to evaluate the EMTE pump design performance for a thermal-hydraulic primary loop configuration, and to compare the system operating points of the SP-100 reactor at two thermal powers, with satisfactory results. (author)

  8. FIRST experiment: Fragmentation of Ions Relevant for Space and Therapy

    International Nuclear Information System (INIS)

    Agodi, C; Bondì, M; Cavallaro, M; Carbone, D; Cirrone, G A P; Cuttone, G; Abou-Haidar, Z; Alvarez, M A G; Bocci, A; Aumann, T; Durante, M; Balestra, F; Battistoni, G; Bohlen, T T; Boudard, A; Brunetti, A; Carpinelli, M; Cappuzzello, F; Cortes-Giraldo, M A; Napoli, M De

    2013-01-01

    Nuclear fragmentation processes are relevant in different fields of basic research and applied physics and are of particular interest for tumor therapy and for space radiation protection applications. The FIRST (Fragmentation of Ions Relevant for Space and Therapy) experiment at the SIS accelerator of the GSI laboratory in Darmstadt has been designed for the measurement of different ion fragmentation cross sections at different energies between 100 and 1000 MeV/nucleon. The experiment is performed by an international collaboration of institutions from Germany, France, Italy and Spain. The experimental apparatus is partly based on an already existing setup made of the ALADIN magnet, the MUSIC IV TPC, the LAND2 neutron detector and the TOFWALL scintillator TOF system, integrated with newly designed detectors in the Interaction Region (IR) around the removable carbon target: a scintillator Start Counter, a Beam Monitor drift chamber, a silicon Vertex Detector and a Proton Tagger for the detection of light fragments emitted at large angles (KENTROS). The scientific program of the FIRST experiment started in summer 2011 with the study of the fragmentation of a 400 MeV/nucleon 12C beam on a thin (8 mm) carbon target.

  9. FIRST experiment: Fragmentation of Ions Relevant for Space and Therapy

    Science.gov (United States)

    Agodi, C.; Abou-Haidar, Z.; Alvarez, M. A. G.; Aumann, T.; Balestra, F.; Battistoni, G.; Bocci, A.; Bohlen, T. T.; Bondì, M.; Boudard, A.; Brunetti, A.; Carpinelli, M.; Cappuzzello, F.; Cavallaro, M.; Carbone, D.; Cirrone, G. A. P.; Cortes-Giraldo, M. A.; Cuttone, G.; De Napoli, M.; Durante, M.; Fernandez-Garcia, J. P.; Finck, C.; Foti, A.; Gallardo, M. I.; Golosio, B.; Iarocci, E.; Iazzi, F.; Ickert, G.; Introzzi, R.; Juliani, D.; Krimmer, J.; Kurz, N.; Labalme, M.; Lavagno, A.; Leifels, Y.; Le Fevre, A.; Leray, S.; Marchetto, F.; Monaco, V.; Morone, M. C.; Nicolosi, D.; Oliva, P.; Paoloni, A.; Patera, V.; Piersanti, L.; Pleskac, R.; Quesada, J. M.; Randazzo, N.; Romano, F.; Rossi, D.; Rosso, V.; Rousseau, M.; Sacchi, R.; Sala, P.; Sarti, A.; Scheidenberger, C.; Schuy, C.; Sciubba, A.; Sfienti, C.; Simon, H.; Sipala, V.; Spiriti, E.; Stuttge, L.; Tropea, S.; Younis, H.

    2013-03-01

    Nuclear fragmentation processes are relevant in different fields of basic research and applied physics and are of particular interest for tumor therapy and for space radiation protection applications. The FIRST (Fragmentation of Ions Relevant for Space and Therapy) experiment at the SIS accelerator of the GSI laboratory in Darmstadt has been designed for the measurement of different ion fragmentation cross sections at different energies between 100 and 1000 MeV/nucleon. The experiment is performed by an international collaboration of institutions from Germany, France, Italy and Spain. The experimental apparatus is partly based on an already existing setup made of the ALADIN magnet, the MUSIC IV TPC, the LAND2 neutron detector and the TOFWALL scintillator TOF system, integrated with newly designed detectors in the Interaction Region (IR) around the removable carbon target: a scintillator Start Counter, a Beam Monitor drift chamber, a silicon Vertex Detector and a Proton Tagger for the detection of light fragments emitted at large angles (KENTROS). The scientific program of the FIRST experiment started in summer 2011 with the study of the fragmentation of a 400 MeV/nucleon 12C beam on a thin (8 mm) carbon target.

  10. Shelter and indoor air in the twenty-first century--radon, smoking, and lung cancer risks

    International Nuclear Information System (INIS)

    Fabrikant, J.I.

    1990-01-01

    Recognition that radon and its daughter products may accumulate to high levels in homes and in the workplace has led to concern about the potential lung cancer risk resulting from indoor domestic exposure. While such risks can be estimated with current dosimetric and epidemiological models for excess relative risks, it must be recognized that these models are based on data from occupational exposure and from underground miners' mortality experience. Several assumptions are required to apply risk estimates from an occupational setting to the indoor domestic environment. Analyses of the relevant data do not lead to a conclusive description of the interaction between radon daughters and cigarette smoking for the induction of lung cancer. The evidence compels the conclusion that indoor radon daughter exposure in homes represents a potentially life-threatening public health hazard, particularly in males and in cigarette smokers. Resolution of complex societal interactions will require public policy decisions involving the governmental, scientific, financial, and industrial sectors. These decisions impact the home, the workplace, and the marketplace, and they extend beyond the constraints of science. Risk identification, assessment, and management require scientific and engineering approaches to guide policy decisions to protect the public health. Mitigation and control procedures are only beginning to receive attention. Full acceptance for protection against what could prove to be a significant public health hazard in the twenty-first century will certainly involve policy decisions, not by scientists, but rather by men and women of government and law.

  11. A Tale within a Tale: Mise en Abyme Adaptations of the Twenty-first Century

    Directory of Open Access Journals (Sweden)

    Željka Flegar

    2017-12-01

    Full Text Available In accord with the promise made by Henry Jenkins that “old and new media will interact in ever more complex ways” (Convergence Culture 6), this research observes metamodern fairy tale adaptations of the twenty-first century in light of Christina Bacchilega’s construct of the fairy-tale web and Henry Jenkins’ theory of convergence culture and transmedia storytelling. The research will address the growing trend of embedding “wonder tale” collections within the context of a larger narrative as an artefact of significance, power, and material value. Although original tales with known authorship, these fairy tale adaptations are appended to the mythology and culture of the fantastic secondary worlds. Such texts tend to be parodic, subversive, and even carnivalesque (Bakhtin; Stephens), providing a commentary on the culture of their origin, as well as our own. By blending cultures, styles, and formats, mise en abyme wonder tales also result in the empowerment of specifically marginalised groups. Generally defined as spin-offs that are otherwise a part of a complex inter- and hypertextual web, these fairy tale collections constitute a metafictional body of knowledge and wisdom. In the digital era much focus is placed on multimodal, hypertextual, and transmedia narratives with a significant influence of fandom on the production of such literary works. The study will focus on the popular examples of such practice, J.K. Rowling’s Tales of Beedle the Bard (2007/2008) and Ransom Riggs’ Tales of the Peculiar (2016), in order to define mise en abyme fairy tale adaptations, as well as to discuss their cultural significance and function.

  12. Simulating Space Radiation-Induced Breast Tumor Incidence Using Automata.

    Science.gov (United States)

    Heuskin, A C; Osseiran, A I; Tang, J; Costes, S V

    2016-07-01

    Estimating cancer risk from space radiation has been an ongoing challenge for decades, primarily because most of the reported epidemiological data on radiation-induced risks are derived from studies of atomic bomb survivors who were exposed to an acute dose of gamma rays instead of chronic high-LET cosmic radiation. In this study, we introduce a formalism using cellular automata to model the long-term effects of ionizing radiation in human breast for different radiation qualities. We first validated and tuned parameters for an automata-based two-stage clonal expansion model simulating the age dependence of spontaneous breast cancer incidence in an unexposed U.S. population. We then tested the impact of radiation perturbation in the model by modifying parameters to reflect both targeted and nontargeted radiation effects. Targeted effects (TE) reflect the immediate impact of radiation on a cell's DNA, with classic end points being gene mutations and cell death. They are well known and are directly derived from experimental data. In contrast, nontargeted effects (NTE) are persistent and affect both damaged and undamaged cells, are nonlinear with dose, and are not well characterized in the literature. In this study, we introduced TE in our model and compared predictions against epidemiologic data of the atomic bomb survivor cohort. TE alone are not sufficient for inducing enough cancer. NTE independent of dose and lasting ∼100 days postirradiation need to be added to accurately predict the dose dependence of breast cancer induced by gamma rays. Finally, by integrating experimental relative biological effectiveness (RBE) for TE and keeping NTE (i.e., radiation-induced genomic instability) constant with dose and LET, the model predicts that the RBE for breast cancer induced by cosmic radiation would be maximal at 220 keV/μm. This approach lays the groundwork for further investigation into the impact of chronic low-dose exposure, inter-individual variation and more complex space radiation
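
The two-stage clonal expansion structure described in the abstract (normal cells acquire a first mutation, initiated cells expand clonally, and a second hit produces a malignant cell) can be sketched as a toy stochastic simulation. This is an illustrative sketch only: the rates `mu1`, `mu2`, and `growth` are made-up placeholders, not the fitted parameters of the paper's automata model, and the per-cell Bernoulli trials stand in for the spatial automaton rules.

```python
import random

def simulate_tsce(n_cells=1000, n_steps=500, mu1=1e-3, mu2=1e-3,
                  growth=0.01, seed=0):
    """Toy two-stage clonal expansion: normal -> initiated -> malignant.

    Returns the time step at which the first malignant cell appears,
    or None if none arises. All rates are illustrative placeholders.
    """
    rng = random.Random(seed)
    normal, initiated = n_cells, 0
    for t in range(n_steps):
        # first mutation: normal cells become initiated
        new_init = sum(1 for _ in range(normal) if rng.random() < mu1)
        normal -= new_init
        initiated += new_init
        # clonal expansion of the initiated compartment
        initiated += sum(1 for _ in range(initiated) if rng.random() < growth)
        # second mutation: any initiated cell may turn malignant
        if any(rng.random() < mu2 for _ in range(initiated)):
            return t
    return None
```

Setting `mu2=0.0` suppresses malignancy entirely, while raising both mutation rates moves the first malignant event to earlier ages, which is the qualitative behavior a radiation perturbation of the model parameters would modulate.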

  13. International Summer School on Astronomy and Space Science in Chile, first experience.

    Science.gov (United States)

    Stepanova, M.; Arellano-Baeza, A. A.

    The I International Summer School on Astronomy and Space Science took place in the Elqui Valley, Chile, January 15-29, 2005. Eighty 12-17 year old students from Chile, Russia, Venezuela and Bulgaria obtained a valuable experience working together with outstanding scientists from Chile and Russia and with Russian cosmonaut Alexander Balandine. They also had the opportunity to visit the main astronomical observatories and to participate in workshops dedicated to telescope and satellite design and remote sensing. This activity was supported by numerous institutions in Chile, including the Ministry of Education, the European Southern Observatory, the Chilean Space Agency, the Chilean Air Force, the Latin American Association of Space Geophysics, the principal Chilean universities and the First Lady, Mrs. Luisa Duran.

  14. Cosmological observations with a wide field telescope in space: Pixel simulations of EUCLID spectrometer

    International Nuclear Information System (INIS)

    Zoubian, Julien

    2012-01-01

    The observations of supernovae, the cosmic microwave background, and more recently the measurement of baryon acoustic oscillations and weak lensing effects converge on a Lambda CDM model with an accelerating expansion of the present-day Universe. This model needs two dark components to fit the observations: dark matter and dark energy. Two approaches seem particularly promising for measuring both the geometry of the Universe and the growth of dark matter structures: the analysis of the weak distortions of distant galaxies by gravitational lensing and the study of baryon acoustic oscillations. Both methods require very large sky surveys of several thousand square degrees. In the context of the spectroscopic survey of the space mission EUCLID, dedicated to the study of the dark side of the Universe, I developed a pixel simulation tool for analyzing instrumental performance. The proposed method can be summarized in three steps. The first step is to simulate the observables, i.e. mainly the sources on the sky. I worked out a new method, adapted to spectroscopic simulations, which allows an existing galaxy survey to be mocked while ensuring that the distributions of the spectral properties of galaxies are representative of current observations, in particular the distribution of the emission lines. The second step is to simulate the instrument and produce images equivalent to the expected real images. Based on the pixel simulator of the HST, I developed a new tool to compute the images of the spectroscopic channel of EUCLID. The new simulator can simulate PSFs with various energy distributions and detectors with different pixels. The last step is the estimation of the performance of the instrument. Based on existing tools, I set up a pipeline of image processing and performance measurement.
My main results were: 1) to validate the method by simulating an existing survey of galaxies, the WISP survey, 2) to determine the

  15. Kristian Birkeland the first space scientist

    CERN Document Server

    Egeland, Alv

    2005-01-01

    At the beginning of the 20th century Kristian Birkeland (1867-1917), a Norwegian scientist of insatiable curiosity, addressed questions that had vexed European scientists for centuries. Why do the northern lights appear overhead when the Earth’s magnetic field is disturbed? How are magnetic storms connected to disturbances on the Sun? To answer these questions Birkeland interpreted his advanced laboratory simulations and daring campaigns in the Arctic wilderness in the light of Maxwell’s newly discovered laws of electricity and magnetism. Birkeland’s ideas were dismissed for decades, only to be vindicated when satellites could fly above the Earth’s atmosphere. Faced with the depleting stocks of Chilean saltpeter and the consequent prospect of mass starvation, Birkeland showed his practical side, inventing the first industrial scale method to extract nitrogen-based fertilizers from the air. Norsk Hydro, one of modern Norway’s largest industries, stands as a living tribute to his genius. Hoping to demo...

  16. Effects of simulated space environmental parameters on six commercially available composite materials

    International Nuclear Information System (INIS)

    Funk, J.G.; Sykes, G.F. Jr.

    1989-04-01

    The effects of simulated space environmental parameters on microdamage induced by the environment in a series of commercially available graphite-fiber-reinforced composite materials were determined. Composites with both thermoset and thermoplastic resin systems were studied. Low-Earth-Orbit (LEO) exposures were simulated by thermal cycling; geosynchronous-orbit (GEO) exposures were simulated by electron irradiation plus thermal cycling. The thermal cycling temperature range was -250 F to either 200 F or 150 F. The upper limits of the thermal cycles were different to ensure that an individual composite material was not cycled above its glass transition temperature. Material response was characterized through assessment of the induced microcracking and its influence on mechanical property changes at both room temperature and -250 F. Microdamage was induced in both thermoset and thermoplastic advanced composite materials exposed to the simulated LEO environment. However, a 350 F cure single-phase toughened epoxy composite was not damaged during exposure to the LEO environment. The simulated GEO environment produced microdamage in all materials tested.

  17. Optimal design of a composite space shield based on numerical simulations

    International Nuclear Information System (INIS)

    Son, Byung Jin; Yoo, Jeong Hoon; Lee, Min Hyung

    2015-01-01

    In this study, an optimal design of a stuffed Whipple shield is proposed by using numerical simulations and a new penetration criterion. The target model was selected based on the shield model used in the Columbus module of the International Space Station. Because experimental results can be obtained only in the low velocity region below 7 km/s, the ballistic limit curve (BLC) in the high velocity region above 7 km/s must be derived by numerical simulation. AUTODYN-2D, the commercial hydro-code package, was used to simulate the nonlinear transient analysis for the hypervelocity impact. The smoothed particle hydrodynamics (SPH) method was applied to the projectile and bumper modeling to represent the debris cloud generated after the impact. The numerical simulation model and selected material properties were validated through a quantitative comparison between numerical and experimental results. A new criterion to determine whether penetration occurs is proposed from kinetic energy analysis by numerical simulation in the velocity region over 7 km/s. A parameter optimization process was performed to improve the protection ability at a specific condition through the design of experiments (DOE) method and the response surface methodology (RSM). The performance of the proposed optimal design was numerically verified.
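
For context, ballistic limit curves for Whipple shields in the hypervelocity regime (above roughly 7 km/s) are often summarized by design equations of the Christiansen form, which give the critical (just-defeated) projectile diameter as a function of rear wall thickness, standoff, material densities, and impact velocity. The sketch below implements one commonly quoted form of that equation; the constant and exponents are quoted from memory of the published design equation and the sample inputs are illustrative, so treat this as an assumption-laden sketch, not the paper's criterion or a validated shield model.

```python
import math

def whipple_critical_diameter(t_w, S, rho_p=2.7, rho_b=2.7,
                              v=10.0, theta_deg=0.0, sigma=57.0):
    """Critical projectile diameter (cm) for a Whipple shield in the
    hypervelocity regime, Christiansen-style design equation (assumed form).

    t_w: rear wall thickness (cm); S: standoff (cm); densities in g/cm^3;
    v: impact velocity (km/s); theta_deg: impact angle from normal;
    sigma: rear-wall yield stress (ksi). Verify constants before real use.
    """
    v_n = v * math.cos(math.radians(theta_deg))  # normal velocity component
    return (3.918 * t_w ** (2.0 / 3.0) * S ** (1.0 / 3.0)
            * (sigma / 70.0) ** (1.0 / 3.0)
            / (rho_p ** (1.0 / 3.0) * rho_b ** (1.0 / 9.0)
               * v_n ** (2.0 / 3.0)))
```

As the functional form implies, the critical diameter grows with wall thickness and standoff and shrinks with impact velocity, which is the qualitative shape of the BLC that the paper's simulations extend above 7 km/s.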

  18. Experimental identification of a comb-shaped chaotic region in multiple parameter spaces simulated by the Hindmarsh—Rose neuron model

    Science.gov (United States)

    Jia, Bing

    2014-03-01

    A comb-shaped chaotic region has been simulated in multiple two-dimensional parameter spaces using the Hindmarsh—Rose (HR) neuron model in many recent studies, which can interpret almost all of the previously simulated bifurcation processes with chaos in neural firing patterns. In the present paper, a comb-shaped chaotic region in a two-dimensional parameter space was reproduced, which presents different processes of period-adding bifurcations with chaos as one parameter is changed while the other is fixed at different levels. In the biological experiments, different period-adding bifurcation scenarios with chaos induced by decreasing the extra-cellular calcium concentration were observed in some neural pacemakers at different levels of extra-cellular 4-aminopyridine concentration and in other pacemakers at different levels of extra-cellular caesium concentration. By using the nonlinear time series analysis method, the deterministic dynamics of the experimental chaotic firings were investigated. The period-adding bifurcations with chaos observed in the experiments resembled those simulated in the comb-shaped chaotic region using the HR model. The experimental results show that period-adding bifurcations with chaos are preserved in different two-dimensional parameter spaces, which provides evidence of the existence of the comb-shaped chaotic region and a demonstration of the simulation results in different two-dimensional parameter spaces in the HR neuron model. The results also present relationships between different firing patterns in two-dimensional parameter spaces.
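
The Hindmarsh—Rose model behind these parameter-space scans is a three-variable ODE system. A minimal forward-Euler sketch is given below, using the standard textbook parameter values (a=1, b=3, c=1, d=5, s=4, x_R=-1.6) rather than the exact settings of the cited simulations; scanning a parameter such as I or r and counting spikes per burst in the resulting trace is how period-adding sequences are read off.

```python
def hindmarsh_rose(I=3.0, r=0.006, dt=0.01, n_steps=100000):
    """Forward-Euler integration of the Hindmarsh-Rose neuron model.

    Returns the membrane-potential trace x(t). Standard textbook
    parameters; dt and n_steps are arbitrary demo choices.
    """
    a, b, c, d, s, xr = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6
    x, y, z = -1.6, 0.0, 0.0
    trace = []
    for _ in range(n_steps):
        dx = y - a * x**3 + b * x**2 - z + I   # fast membrane potential
        dy = c - d * x**2 - y                  # fast recovery variable
        dz = r * (s * (x - xr) - z)            # slow adaptation current
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace.append(x)
    return trace

def count_spikes(trace, threshold=1.0):
    """Count upward threshold crossings (spikes) in a voltage trace."""
    return sum(1 for u, v in zip(trace, trace[1:]) if u < threshold <= v)
```

With these defaults the model fires bursts of spikes; repeating the run over a grid of two parameters and recording the spike count per burst is the kind of two-dimensional scan in which the comb-shaped chaotic region appears.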

  19. Experimental identification of a comb-shaped chaotic region in multiple parameter spaces simulated by the Hindmarsh—Rose neuron model

    International Nuclear Information System (INIS)

    Jia Bing

    2014-01-01

    A comb-shaped chaotic region has been simulated in multiple two-dimensional parameter spaces using the Hindmarsh—Rose (HR) neuron model in many recent studies, which can interpret almost all of the previously simulated bifurcation processes with chaos in neural firing patterns. In the present paper, a comb-shaped chaotic region in a two-dimensional parameter space was reproduced, which presents different processes of period-adding bifurcations with chaos as one parameter is changed while the other is fixed at different levels. In the biological experiments, different period-adding bifurcation scenarios with chaos induced by decreasing the extra-cellular calcium concentration were observed in some neural pacemakers at different levels of extra-cellular 4-aminopyridine concentration and in other pacemakers at different levels of extra-cellular caesium concentration. By using the nonlinear time series analysis method, the deterministic dynamics of the experimental chaotic firings were investigated. The period-adding bifurcations with chaos observed in the experiments resembled those simulated in the comb-shaped chaotic region using the HR model. The experimental results show that period-adding bifurcations with chaos are preserved in different two-dimensional parameter spaces, which provides evidence of the existence of the comb-shaped chaotic region and a demonstration of the simulation results in different two-dimensional parameter spaces in the HR neuron model. The results also present relationships between different firing patterns in two-dimensional parameter spaces.

  20. Space-Charge Simulation of Integrable Rapid Cycling Synchrotron

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Jeffery [Fermilab]; Valishev, Alexander [Fermilab]

    2017-05-01

    Integrable optics is an innovation in particle accelerator design that enables strong nonlinear focusing without generating parametric resonances. We use a Synergia space-charge simulation to investigate the application of integrable optics to a high-intensity hadron ring that could replace the Fermilab Booster. We find that incorporating integrability into the design suppresses the beam halo generated by a mismatched KV beam. Our integrable rapid cycling synchrotron (iRCS) design includes other features of modern ring design such as low momentum compaction factor and harmonically canceling sextupoles. Experimental tests of high-intensity beams in integrable lattices will take place over the next several years at the Fermilab Integrable Optics Test Accelerator (IOTA) and the University of Maryland Electron Ring (UMER).

  1. Transition From NASA Space Communication Systems to Commerical Communication Products

    Science.gov (United States)

    Ghazvinian, Farzad; Lindsey, William C.

    1994-01-01

    Transitioning from twenty-five years of space communication system architecting, engineering and development to creating and marketing commercial communication system hardware and software products is no simple task for small, high-tech system engineering companies whose major source of revenue has been the U.S. Government. Yet, many small businesses are faced with this onerous and perplexing task. The purpose of this talk/paper is to present one small business's (LinCom's) approach to taking advantage of the systems engineering expertise and knowledge captured in physical neural networks and simulation software through support of numerous National Aeronautics and Space Administration (NASA) and Department of Defense (DoD) projects, e.g., Space Shuttle, TDRSS, Space Station, DSCS, Milstar, etc. The innovative ingredients needed for a systems house to transition to a wireless communication system products house that supports personal communication services and networks (PCS and PCN) development in a global economy will be discussed. Efficient methods for using past government-sponsored space system research and development to transition to VLSI communication chip set products will be presented, along with notions of how synergy between government and industry can be maintained to benefit both parties.

  2. Continuing Professional Development in the Twenty-First Century.

    Science.gov (United States)

    Sachdeva, Ajit K

    2016-01-01

    The critical role of continuing professional development (CPD) in supporting delivery of patient care of the highest quality and safety is receiving significant attention in the current era of monumental change. CPD is essential in efforts to ensure effectiveness of new models of health care delivery, improve outcomes and value in health care, address external regulations, and foster patient engagement. The unique features of CPD; the use of special mastery-based teaching, learning, and assessment methods, and other special interventions to promote excellence; and direct involvement of a variety of key stakeholders differentiate CPD from undergraduate medical education and graduate medical education. The needs of procedural specialties relating to CPD are different from those of primary care disciplines and require special attention for the greatest impact. Simulation-based education and training can be very useful in CPD aimed at improving outcomes and promoting patient safety. Preceptoring, proctoring, mentoring, and coaching should be used routinely to address specific needs in CPD. Distinct CPD strategies are necessary for retraining, reentry, and remediation. Participation in CPD programs can be encouraged by leveraging the joy of learning, which should drive physicians and surgeons to strive continually to be the best in their professional work.

  3. Twenty-First Century Educational Theory and the Challenges of Modern Education: Appealing to the Heritage of the General Teaching Theory of the Secondary Educational Curriculum and the Learning Process

    Science.gov (United States)

    Klarin, Mikhail V.

    2016-01-01

    The article presents an analysis of educational theory in light of the challenges confronting education in the twenty-first century. The author examines how our ideas about the methods for managing the transmission of culture, the subject of education, and the consequences of these changes for the theory of education have changed. The author…

  4. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a flyby mission to Mars or an extended stay at the International Space Station, affects the human body - in particular, the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9 day mission.

  5. USSR Space Life Sciences Digest, issue 21

    Science.gov (United States)

    Hooke, Lydia Razran; Donaldson, P. Lynn; Garshnek, Victoria; Rowe, Joseph

    1989-01-01

    This is the twenty-first issue of NASA's USSR Space Life Sciences Digest. It contains abstracts of 37 papers published in Russian language periodicals or books or presented at conferences and of a Soviet monograph on animal ontogeny in weightlessness. Selected abstracts are illustrated with figures and tables from the original. A book review of a work on adaptation to stress is also included. The abstracts in this issue have been identified as relevant to 25 areas of space biology and medicine. These areas are: adaptation, biological rhythms, body fluids, botany, cardiovascular and respiratory systems, cytology, developmental biology, endocrinology, enzymology, equipment and instrumentation, exobiology, gravitational biology, habitability and environmental effects, hematology, human performance, life support systems, mathematical modeling, metabolism, microbiology, musculoskeletal system, neurophysiology, operational medicine, perception, psychology, and reproductive system.

  6. Virtual Reality Simulation of the International Space Welding Experiment

    Science.gov (United States)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.

  7. Real-space grids and the Octopus code as tools for the development of new simulation approaches for electronic systems

    Science.gov (United States)

    Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel

    Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage when implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
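
As a toy illustration of the real-space philosophy this abstract describes (not Octopus itself): discretize the 1D Schrödinger equation for the harmonic oscillator on a uniform grid, build the Hamiltonian with a 3-point finite-difference Laplacian, and diagonalize it. The grid size and box length below are arbitrary demo choices.

```python
import numpy as np

def ho_levels(n_grid=1001, box=16.0, k=3):
    """Lowest k eigenvalues of the 1D harmonic oscillator
    H = p^2/2 + x^2/2, discretized on a uniform real-space grid
    (atomic units). Exact spectrum: 0.5, 1.5, 2.5, ...
    """
    x = np.linspace(-box / 2, box / 2, n_grid)
    h = x[1] - x[0]
    # kinetic term -(1/2) d^2/dx^2 via the 3-point stencil, plus V(x)
    diag = 1.0 / h**2 + 0.5 * x**2          # on-site: -(1/2)(-2/h^2) + V
    off = np.full(n_grid - 1, -0.5 / h**2)  # nearest-neighbor coupling
    H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[:k]
```

The appeal the abstract points to is visible even here: refining the discretization is just changing `n_grid` and `box`, and a different physical model is just a different potential added to `diag`, with no basis-set machinery in the way.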

  8. Twenty years of society of medical informatics of b&h and the journal acta informatica medica.

    Science.gov (United States)

    Masic, Izet

    2012-03-01

    In 2012, the Health/Medical informatics profession celebrates five jubilees in Bosnia and Herzegovina: a) thirty-five years from the introduction of the first automatic manipulation of data; b) twenty-five years from establishing the Society for Medical Informatics of BiH; c) twenty years from establishing the scientific and professional journal of the Society for Medical Informatics of Bosnia and Herzegovina, "Acta Informatica Medica"; d) twenty years from establishing the first Cathedra for Medical Informatics at biomedical faculties in Bosnia and Herzegovina; and e) ten years from the introduction of "Distance learning" in the medical curriculum. All five of the mentioned activities in the area of medical informatics had special importance and made an appropriate contribution to the development of health/medical informatics in Bosnia and Herzegovina.

  9. Cyber Attacks and Terrorism: A Twenty-First Century Conundrum.

    Science.gov (United States)

    Albahar, Marwan

    2017-01-05

    In recent years, an alarming rise in the incidence of cyber attacks has made cyber security a major concern for nations across the globe. Given the current volatile socio-political environment and the massive increase in the incidence of terrorism, it is imperative that government agencies rapidly recognize the possibility of cyber space exploitation by terrorist organizations and state players to disrupt the normal way of life. The threat level of cyber terrorism has never been as high as it is today, and this has created a great deal of insecurity and fear. This study focuses on different aspects of cyber attacks and explores the reasons behind their increasing popularity among terrorist organizations and state players. It proposes an empirical model that can be used to estimate the risk levels associated with different types of cyber attacks and thereby provide a road map for conceptualizing and formulating highly effective countermeasures and cyber security policies.

  10. Projected impact of climate change in the hydroclimatology of Senegal with a focus over the Lake of Guiers for the twenty-first century

    Science.gov (United States)

    Tall, Moustapha; Sylla, Mouhamadou Bamba; Diallo, Ismaïla; Pal, Jeremy S.; Faye, Aïssatou; Mbaye, Mamadou Lamine; Gaye, Amadou Thierno

    2017-07-01

    This study analyzes the impact of anthropogenic climate change on the hydroclimatology of Senegal, with a focus on the Lake of Guiers basin, for the middle (2041-2060) and late twenty-first century (2080-2099). To this end, a high-resolution multimodel ensemble based on regional climate model experiments considering two Representative Concentration Pathways (RCP4.5 and RCP8.5) is used. The results indicate that an elevated warming, leading to a substantial increase of atmospheric water demand, is projected over the whole of Senegal. In the Lake basin, the increases in potential evapotranspiration (PE) range between 10 and 25 % in the near future under RCP4.5, while for the far future under RCP8.5 they exceed 50 %. In addition, mean precipitation shows contrasting changes, with wetter (10 to 25 % more) conditions by the middle of the century and drier conditions (more than 50 % less) during the late twenty-first century. Such changes cause more/less evapotranspiration and soil moisture, respectively, during the two future periods. Furthermore, surface runoff shows a tendency to increase in most areas, although a few locations, including the Lake basin, show substantial reductions. Finally, it is found that while semi-arid climates develop in the RCP4.5 scenario, generalized arid conditions prevail over the whole of Senegal under RCP8.5. These future climate conditions thus substantially threaten freshwater availability for the country and irrigated cropping over the Lake basin. Strong governmental policies are therefore needed to help design response options to cope with the challenges posed by projected climate change.

  11. Validation of Varian TrueBeam electron phase–spaces for Monte Carlo simulation of MLC-shaped fields

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, Samantha A. M. [Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8P 3P6 5C2 (Canada); Gagne, Isabelle M., E-mail: imgagne@bccancer.bc.ca; Zavgorodni, Sergei [Department of Medical Physics, BC Cancer Agency–Vancouver Island Centre, Victoria, British Columbia V8R 6V5, Canada and Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6 5C2 (Canada); Bazalova-Carter, Magdalena [Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6 5C2 (Canada)

    2016-06-15

    Purpose: This work evaluates Varian’s electron phase–space sources for Monte Carlo simulation of the TrueBeam for modulated electron radiation therapy (MERT) and combined, modulated photon and electron radiation therapy (MPERT) where fields are shaped by the photon multileaf collimator (MLC) and delivered at 70 cm SSD. Methods: Monte Carlo simulations performed with EGSnrc-based BEAMnrc/DOSXYZnrc and PENELOPE-based PRIMO are compared against diode measurements for 5 × 5, 10 × 10, and 20 × 20 cm{sup 2} MLC-shaped fields delivered with 6, 12, and 20 MeV electrons at 70 cm SSD (jaws set to 40 × 40 cm{sup 2}). Depth dose curves and profiles are examined. In addition, EGSnrc-based simulations of relative output as a function of MLC-field size and jaw-position are compared against ion chamber measurements for MLC-shaped fields between 3 × 3 and 25 × 25 cm{sup 2} and jaw positions that range from the MLC-field size to 40 × 40 cm{sup 2}. Results: Percent depth dose curves generated by BEAMnrc/DOSXYZnrc and PRIMO agree with measurement within 2%, 2 mm except for PRIMO’s 12 MeV, 20 × 20 cm{sup 2} field where 90% of dose points agree within 2%, 2 mm. Without the distance to agreement, differences between measurement and simulation are as large as 7.3%. Characterization of simulated dose parameters such as FWHM, penumbra width and depths of 90%, 80%, 50%, and 20% dose agree within 2 mm of measurement for all fields except for the FWHM of the 6 MeV, 20 × 20 cm{sup 2} field which falls within 2 mm distance to agreement. Differences between simulation and measurement exist in the profile shoulders and penumbra tails, in particular for 10 × 10 and 20 × 20 cm{sup 2} fields of 20 MeV electrons, where both sets of simulated data fall short of measurement by as much as 3.5%. BEAMnrc/DOSXYZnrc simulated outputs agree with measurement within 2.3% except for 6 MeV MLC-shaped fields. Discrepancies here are as great as 5.5%. Conclusions: TrueBeam electron phase–spaces

  12. Book review of Capital in the Twenty-First Century, by Thomas Piketty. Cambridge, Massachusetts, London, England: The Belknap Press of Harvard Press, 2014, 605 pages

    OpenAIRE

    Paul Dobrescu; Mălina Ciocea

    2015-01-01

    “Every now and then, the field of economics produces an important book; this is one of them” (Cowen, 2014). These are the opening words of Tyler Cowen’s presentation of Thomas Piketty’s work, “Capital in the Twenty-First Century” (Piketty, 2014), in Foreign Affairs. This is a book that is visibly placed in all important bookstores around the world, widely debated, acclaimed, sold (over 1 million copies have been sold so far). It has been favorably reviewed or quoted in all major journals. The...

  13. First-Year Residents Outperform Third-Year Residents after Simulation-Based Education in Critical Care Medicine

    Science.gov (United States)

    Singer, Benjamin D.; Corbridge, Thomas C.; Schroedl, Clara J.; Wilcox, Jane E.; Cohen, Elaine R.; McGaghie, William C.; Wayne, Diane B.

    2012-01-01

    Introduction: Prior research shows that gaps exist in internal medicine residents' critical care knowledge and skills. The purpose of this study was to compare the bedside critical care competency of first-year residents who received a simulation-based educational intervention plus clinical training to that of third-year residents who received clinical training alone. Methods: During their first three months of residency, a group of first-year residents completed a simulation-based educational intervention. A group of traditionally-trained third-year residents who did not receive simulation-based training served as a comparison group. Both groups were evaluated using a 20-item clinical skills assessment at the bedside of a patient receiving mechanical ventilation at the end of their medical intensive care unit rotation. Scores on the skills assessment were compared between groups. Results: Simulator-trained first-year residents (n=40) scored significantly higher than traditionally-trained third-year residents (n=27) on the bedside assessment: 91.3% (95% CI 88.2% to 94.3%) vs. 80.9% (95% CI 76.8% to 85.0%). Conclusions: First-year residents who completed a simulation-based educational intervention demonstrated higher clinical competency than third-year residents who did not undergo simulation training. Critical care competency cannot be assumed after clinical ICU rotations; simulation-based curricula can help ensure residents are proficient in the care of critically ill patients. PMID:23222546

  14. Research on the method of measuring space information network capacity in communication service

    Directory of Open Access Journals (Sweden)

    Zhu Shichao

    2017-02-01

    Because of the large scale of space information networks in both space and time and their increasing complexity, existing methods of measuring information transmission capacity have been unable to measure existing and future space information networks effectively. In this study, we first established a complex model of a space information network, and measured the capacity of the whole network by analyzing the data access capability to the network and the data transmission capability within the network. Finally, we verified the rationality of the proposed measuring method by using the STK and Matlab software for collaborative simulation.

  15. OECD/NEA Main Steam Line Break Benchmark Problem Exercise I Simulation Using the SPACE Code with the Point Kinetics Model

    International Nuclear Information System (INIS)

    Kim, Yohan; Kim, Seyun; Ha, Sangjun

    2014-01-01

    The Safety and Performance Analysis Code for Nuclear Power Plants (SPACE) has been developed in recent years by Korea Hydro & Nuclear Power Co. (KHNP) through collaborative work with other Korean nuclear industries. SPACE is a best-estimate two-phase, three-field thermal-hydraulic analysis code used to analyze the safety and performance of pressurized water reactors (PWRs). The code has sufficient features to replace outdated vendor-supplied codes and to be used for the safety analysis of operating PWRs and the design of advanced reactors. As a result of the second phase of development, version 2.14 of the code was released after successive V and V works. Topical reports on the code and the related safety analysis methodologies have been prepared for licensing. In this study, the OECD/NEA Main Steam Line Break (MSLB) Benchmark Problem Exercise I was simulated as a V and V work, and the results were compared with those of the participants in the benchmark project. Through the simulation, it was concluded that the SPACE code can effectively simulate PWR MSLB accidents.

  16. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for the capture of an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. The developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  17. Deformation quantization: Twenty years after

    International Nuclear Information System (INIS)

    Sternheimer, Daniel

    1998-01-01

    We first review the historical developments, both in physics and in mathematics, that preceded (and in some sense provided the background of) deformation quantization. Then we describe the birth of the latter theory and its evolution in the past twenty years, insisting on the main conceptual developments and staying here as much as possible on the physical side. For the physical part the accent is put on its relations to, and relevance for, 'conventional' physics. For the mathematical part we concentrate on the questions of existence and equivalence, including the most recent developments for general Poisson manifolds; we also touch on noncommutative geometry and index theorems, and relations with group theory, including quantum groups. An extensive (though very incomplete) bibliography is appended and includes background mathematical literature.

  18. CfDS attends the first meeting of the All-Party Parliamentary Astronomy and Space Environment Group

    Science.gov (United States)

    Mizon, B.

    1999-06-01

    This group first met on March 11th, 1999, as 'a forum for discussion to further parliamentary interest in astronomy and the space environment affecting terrestrial life and its climate; and to increase awareness of the social, political and philosophical implications of present and future space technologies connected with exploring and understanding the cosmos'. CfDS coordinator Bob Mizon attended the first meeting of the group.

  19. Space for the river, space for diversity?

    NARCIS (Netherlands)

    Warner, J.F.

    2012-01-01

    During the first decade of the twenty-first century, water availability and distribution have become increasingly important for sustainable development and biodiversity conservation. Issues of water scarcity, quality, and accessibility affect the livelihood of many communities across the globe, as

  20. Program NAJOCSC and space charge effect simulation in C01

    International Nuclear Information System (INIS)

    Tang, J.Y.; Chabert, A.; Baron, E.

    1999-01-01

    During the beam tests of the THI project at GANIL, it proved difficult to increase the beam power above 2 kW at CSS2 extraction. The space charge effect (abbreviated as S.C. effect) in cyclotrons is suspected to play some role in this phenomenon, especially the longitudinal S.C. effect and the coupling between longitudinal and radial motions. The injector cyclotron C01 is studied, and the role played by the S.C. effect in this cyclotron in the THI case is investigated by a simulation method. (K.A.)

  1. Simulations of space charge neutralization in a magnetized electron cooler

    Energy Technology Data Exchange (ETDEWEB)

    Gerity, James [Texas A&M; McIntyre, Peter M. [Texas A&M; Bruhwiler, David Leslie [RadiaSoft, Boulder; Hall, Christopher [RadiaSoft, Boulder; Moens, Vince Jan [Ecole Polytechnique, Lausanne; Park, Chong Shik [Fermilab; Stancari, Giulio [Fermilab

    2017-02-02

    Magnetized electron cooling at relativistic energies and Ampere scale current is essential to achieve the proposed ion luminosities in a future electron-ion collider (EIC). Neutralization of the space charge in such a cooler can significantly increase the magnetized dynamic friction and, hence, the cooling rate. The Warp framework is being used to simulate magnetized electron beam dynamics during and after the build-up of neutralizing ions, via ionization of residual gas in the cooler. The design follows previous experiments at Fermilab as a verification case. We also discuss the relevance to EIC designs.

  2. Contamination Control Assessment of the World's Largest Space Environment Simulation Chamber

    Science.gov (United States)

    Snyder, Aaron; Henry, Michael W.; Grisnik, Stanley P.; Sinclair, Stephen M.

    2012-01-01

    The Space Power Facility's thermal vacuum test chamber is the largest chamber in the world capable of providing an environment for space simulation. To improve performance and meet the stringent requirements of a wide customer base, significant modifications were made to the vacuum chamber. These include major changes to the vacuum system and numerous enhancements to the chamber's unique polar crane, with a goal of providing high cleanliness levels. The significance of these changes and modifications is discussed in this paper. In addition, the composition and arrangement of the pumping system and its impact on molecular back-streaming are discussed in detail. Molecular contamination measurements obtained with a TQCM and witness wafers during two recent integrated system tests of the chamber are presented and discussed. Finally, concluding remarks are presented.

  3. First-principles real-space tight-binding LMTO calculation of electronic structures for atomic clusters

    International Nuclear Information System (INIS)

    Xie, Z.L.; Dy, K.S.; Wu, S.Y.

    1997-01-01

    A real-space scheme has been developed for a first-principles calculation of electronic structures and total energies of atomic clusters. The scheme is based on the combination of the tight-binding linear muffin-tin orbital (TBLMTO) method and the method of the real-space Green's function. With this approach, the local electronic density of states can be conveniently determined from the real-space Green's function. Furthermore, the full electron density of a cluster can be directly calculated in real space. The scheme has been shown to be very efficient due to the incorporation of the real-space Green's function method and Delley's method of evaluating multicenter integrals. copyright 1996 The American Physical Society
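
    The real-space Green's-function route to the local density of states can be sketched schematically (this is a toy tight-binding chain, not the TBLMTO scheme of the paper): the LDOS at site i is read off the diagonal of G(E) = [(E + i*eta)I - H]^-1. The chain length, hopping, and broadening below are invented for illustration.

```python
import numpy as np

# Toy model: LDOS at the middle site of a tight-binding chain from the
# real-space Green's function, rho_i(E) = -(1/pi) Im G_ii(E + i*eta).
# Chain length N, hopping t, and broadening eta are illustrative only.
N, t, eta = 100, 1.0, 0.05
H = np.zeros((N, N))
for i in range(N - 1):
    H[i, i + 1] = H[i + 1, i] = -t          # nearest-neighbour hopping

energies = np.linspace(-3.0, 3.0, 301)
mid = N // 2
ldos = []
for E in energies:
    G = np.linalg.inv((E + 1j * eta) * np.eye(N) - H)
    ldos.append(-G[mid, mid].imag / np.pi)

# The LDOS is non-negative and integrates to ~1 state per site
de = energies[1] - energies[0]
norm = float(np.sum(ldos) * de)
print(norm)
```

    In practice one avoids the full matrix inversion (e.g. via recursion or continued-fraction methods), but the quantity computed is the same.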

  4. Space time problems and applications

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    models, cubic spline models and structural time series models. The development of state space theory has interacted with the development of other statistical disciplines.   In the first part of the Thesis, we present the theory of state space models, including Gaussian state space models, approximative...... analysis of non-Gaussian models, simulation based techniques and model diagnostics.   The second part of the Thesis considers Markov random field models. These are spatial models applicable in e.g. disease mapping and in agricultural experiments. Recently, the Gaussian Markov random field models were...... techniques with importance sampling.   The third part of the Thesis contains applications of the theory. First, a univariate time series of count data is analysed. Then, a spatial model is used to compare wheat yields. Weed count data in connection with a project in precision farming is analysed using...
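
    The Gaussian state space models referred to above can be made concrete with the simplest member of the family, the local level model, estimated with the standard Kalman filter recursions. The variances and series length below are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Minimal Gaussian state space sketch (illustrative): the local level model
#   y_t = mu_t + eps_t,   mu_t = mu_{t-1} + xi_t,
# filtered with the standard Kalman recursions.
rng = np.random.default_rng(0)
T, sig_eps, sig_xi = 200, 1.0, 0.1                # assumed sizes/variances
mu = np.cumsum(rng.normal(0.0, sig_xi, T))        # latent random-walk level
y = mu + rng.normal(0.0, sig_eps, T)              # noisy observations

a, P = 0.0, 1e6                                   # diffuse initial state
filtered = []
for t in range(T):
    P = P + sig_xi**2                             # predict (transition = 1)
    K = P / (P + sig_eps**2)                      # Kalman gain
    a = a + K * (y[t] - a)                        # update with the innovation
    P = (1.0 - K) * P
    filtered.append(a)

rmse_raw = float(np.sqrt(np.mean((y - mu) ** 2)))
rmse_filt = float(np.sqrt(np.mean((np.array(filtered) - mu) ** 2)))
print(rmse_filt < rmse_raw)                       # filtering beats raw data
```

    Smoothing, model diagnostics, and the simulation-based treatment of non-Gaussian models build on these same recursions.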

  5. The next twenty years - IAEA's role

    International Nuclear Information System (INIS)

    Tape, G.F.

    1977-01-01

    The twentieth anniversary of an institution is an appropriate time to look back and to ask what has been achieved. It is also an appropriate time to look ahead and to ask what should be the mission for the future. How can the strengths of the International Atomic Energy Agency (IAEA) be best utilized, what new opportunities should be seized upon, and what challenges should the IAEA be prepared to meet in the next twenty years? Forward planning is a very necessary activity in today's world. There are so many demands on national or institutional resources that careful analysis of options is necessary to establish priorities and ultimately to provide for implementation. But such planning must be done carefully with full appreciation for the validity and sensitivity of the input assumptions and data. Furthermore, today's plan, while setting goals and directions, cannot be so inflexible that it cannot be responsive to ever-changing political, economic and technical constraints or opportunities. Thus in looking ahead, the plan must contain provisions for flexibility to provide for further modifications in the light of ever-changing knowledge, attitudes, and world conditions. The experience of the past five years in the energy field, and especially in nuclear energy, underscores this need. In looking ahead for the next twenty years, we are attempting to describe the International Atomic Energy Agency and its role through the twentieth century. In doing so, we are automatically laying the base for the Agency's work going into the twenty-first century. In short, we are trying to visualize a programme that can serve the coming generation and, in doing so, creating a base from which the needs of the succeeding generation can be met. This is a large order and the crystal ball is less than clear. (author)

  6. The First Soviet Cosmonaut Team Their Lives, Legacy, and Historical Impact

    CERN Document Server

    Burgess, Colin

    2009-01-01

    The First Soviet Cosmonaut Team will relate who these men were and offer far more extensive background stories, in addition to those of the more familiar names of early Soviet space explorers from that group. Many previously-unpublished photographs of these “missing” candidates will also be included for the first time in this book. It will be a detailed, but highly readable and balanced account of the history, training and experiences of the first group of twenty cosmonauts of the USSR. A covert recruitment and selection process was set in motion throughout the Soviet military in August 1959, just prior to the naming of America’s Mercury astronauts. Those selected were ordered to report for training at a special camp outside of Moscow in the spring of 1960. Just a year later, Senior Lieutenant Yuri Gagarin of the Soviet Air Force (promoted in flight to the rank of major) was launched aboard a Vostok spacecraft and became the first person ever to achieve space flight and orbit the Earth.

  7. Analysis of Waves in Space Plasma (WISP) near field simulation and experiment

    Science.gov (United States)

    Richie, James E.

    1992-01-01

    The WISP payload, scheduled for a 1995 Space Transportation System (shuttle) flight, will include a large power transmitter on board operating at a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to insure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results. An investigation of the accuracy of the modeling approach is also included. The report begins with a description of the WISP experiment. A description of the model used to simulate the cargo bay follows. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. A discussion of the methods used to verify the accuracy of the model illustrates appropriate methods for obtaining this information. Finally, suggestions for future work are provided.

  8. Long-term space changes after premature loss of a primary maxillary first molar

    Directory of Open Access Journals (Sweden)

    Yng-Tzer J. Lin

    2017-03-01

    Conclusion: The anterior and posterior arch dimensions significantly increased 81 months after premature loss of a primary maxillary first molar, which suggested that space maintainers were not needed in these cases.

  9. Virtual Environment User Interfaces to Support RLV and Space Station Simulations in the ANVIL Virtual Reality Lab

    Science.gov (United States)

    Dumas, Joseph D., II

    1998-01-01

    Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.

  10. Robotic Design Choice Overview using Co-simulation and Design Space Exploration

    DEFF Research Database (Denmark)

    Christiansen, Martin Peter; Larsen, Peter Gorm; Nyholm Jørgensen, Rasmus

    2015-01-01

    Rapid robotic system development has created a demand for multi-disciplinary methods and tools to explore and compare design alternatives. In this paper, we present a collaborative modelling technique that combines discrete-event models of controller software with continuous-time models of physical robot components. The proposed co-modelling method utilises the Vienna Development Method (VDM) and Matlab for discrete-event modelling and 20-sim for continuous-time modelling. The model-based development of a mobile robot mink feeding system is used to illustrate the collaborative modelling method. Simulations are used to evaluate the robot model output response in relation to operational demands. An example of a load carrying challenge in relation to the feeding robot is presented and a design space is defined with candidate solutions in both the mechanical and software domains. Simulation results...

  11. Deep Space Storm Shelter Simulation Study

    Science.gov (United States)

    Dugan, Kathryn; Phojanamongkolkij, Nipa; Cerro, Jeffrey; Simon, Matthew

    2015-01-01

    Missions outside of Earth's magnetic field are impeded by the presence of radiation from galactic cosmic rays and solar particle events. To overcome this issue, NASA's Advanced Exploration Systems Radiation Works Storm Shelter (RadWorks) has been studying different radiation protective habitats to shield against the onset of solar particle event radiation. These habitats have the capability of protecting occupants by utilizing available materials such as food, water, brine, human waste, trash, and non-consumables to build short-term shelters. Protection comes from building a barrier with the materials that dampens the impact of the radiation on astronauts. The goal of this study is to develop a discrete event simulation, modeling a solar particle event and the building of a protective shelter. The main hallway location within a larger habitat similar to the International Space Station (ISS) is analyzed. The outputs from this model are: 1) the total area covered on the shelter by the different materials, 2) the amount of radiation the crew members receive, and 3) the amount of time for setting up the habitat during specific points in a mission given an event occurs.
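
    The kind of discrete event simulation described can be sketched with a plain event queue: crew members alternate "fetch" and "place" tasks until the shelter barrier is complete. The task durations, crew size, and number of material units below are invented numbers, not values from the RadWorks study.

```python
import heapq

# Toy discrete-event sketch (not NASA's model): crew members retrieve
# stowed material and place it on the shelter; events are (time, crew,
# action) tuples popped from a priority queue in time order.
FETCH_MIN, PLACE_MIN = 4.0, 2.0      # minutes per task (assumed)
UNITS_NEEDED = 30                    # material units to cover the shelter
CREW = 3

events = [(0.0, c, "fetch") for c in range(CREW)]
heapq.heapify(events)
placed, finish_time = 0, 0.0
while placed < UNITS_NEEDED:
    t, crew_id, action = heapq.heappop(events)
    if action == "fetch":
        # material retrieved; schedule placing it on the barrier
        heapq.heappush(events, (t + FETCH_MIN, crew_id, "place"))
    else:
        # one unit added to the barrier; crew returns for the next unit
        placed += 1
        finish_time = t + PLACE_MIN
        heapq.heappush(events, (t + PLACE_MIN, crew_id, "fetch"))

print(f"shelter complete after {finish_time:.0f} minutes")
```

    A full model would add stochastic task times, material availability, and radiation dose accumulated while the barrier is incomplete; the event-queue skeleton stays the same.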

  12. Simulation-based training for thoracoscopic lobectomy

    DEFF Research Database (Denmark)

    Jensen, Katrine; Ringsted, Charlotte; Hansen, Henrik Jessen

    2014-01-01

    overcome the first part of the learning curve, but no virtual-reality simulators for thoracoscopy are commercially available. This study aimed to investigate whether training on a laparoscopic simulator enables trainees to perform a thoracoscopic lobectomy. METHODS: Twenty-eight surgical residents were randomized to either virtual-reality training on a nephrectomy module or traditional black-box simulator training. After a retention period they performed a thoracoscopic lobectomy on a porcine model and their performance was scored using a previously validated assessment tool. RESULTS: The groups did not differ in age or gender. All participants were able to complete the lobectomy. The performance of the black-box group was significantly faster during the test scenario than the virtual-reality group: 26.6 min (SD 6.7 min) versus 32.7 min (SD 7.5 min). No difference existed between the two groups when...

  13. Understanding the microscopic moisture migration in pore space using DEM simulation

    Directory of Open Access Journals (Sweden)

    Yuan Guo

    2015-04-01

    The deformation of the soil skeleton and the migration of pore fluid are the major factors relevant to the triggering of, and damage caused by, liquefaction. The influence of pore fluid migration during earthquakes has been demonstrated in recent model experiments and field case studies. Most current liquefaction assessment models are based on testing of isotropic liquefiable materials. However, the recent New Zealand earthquake showed much more severe damage than that predicted by existing models. A fundamental cause has been attributed to the embedded layers of low-permeability silts. The existence of these silt layers inhibits water migration under seismic loads, which accelerated liquefaction and caused a much larger settlement than that predicted by existing theories. This study intends to understand the process of moisture migration in the pore space of sand using discrete element method (DEM) simulation. Simulations were conducted on consolidated undrained triaxial testing of sand, where a cylindrical sample of sand was built and subjected to a constant confining pressure and axial loading. The porosity distribution was monitored during the axial loading process. The spatial distribution of porosity change was determined, which has a direct relationship with the distribution of excess pore water pressure. The non-uniform distribution of excess pore water pressure causes moisture migration. From this, the migration of pore water during the loading process can be estimated. The results of the DEM simulation show a few important observations: (1) external forces are mainly carried and transmitted by the particle chains of the soil sample; (2) the porosity distribution during loading is not uniform due to non-homogeneous soil fabric (i.e. the initial particle arrangement and existence of particle chains); (3) excess pore water pressure develops differently at different loading stages. At the early stage of loading, zones with a high initial porosity feature higher

  14. Comparative proteomic analysis of rice after seed ground simulated radiation and spaceflight explains the radiation effects of space environment

    Science.gov (United States)

    Wang, Wei; Shi, Jinming; Liang, Shujian; Lei, Huang; Shenyi, Zhang; Sun, Yeqing

    In previous work, we compared the proteomic profiles of rice plants grown after seed spaceflights with ground controls by two-dimensional difference gel electrophoresis (2-D DIGE) and found that the protein expression profiles were changed after seed exposure to the space environment. Spaceflight represents a complex environmental condition in which several interacting factors such as cosmic radiation, microgravity and space magnetic fields are involved. The rice seed is in the dormant stage of plant development, showing high resistance to stresses, so the highly ionizing radiation (HZE) in space is considered the main factor causing biological effects in seeds. To further investigate the radiation effects of the space environment, we performed on-ground simulated HZE particle radiation and compared the proteomes of seed-irradiated plants and seed-spaceflight (20th recoverable satellite) plants from the same rice variety. Space ionization involves low-dose but high-energy particle effects. To search for particle effects, ground radiations with the same low dose (2 mGy) but different linear energy transfer (LET) values (13.3 KeV/µm-C, 30 KeV/µm-C, 31 KeV/µm-Ne, 62.2 KeV/µm-C, 500 KeV/µm-Fe) were performed; using 2-D DIGE coupled with clustering and principal component analysis (PCA) for data processing and comparison, we found that the holistic protein expression patterns of plants irradiated by LET-62.2 KeV/µm carbon particles were most similar to spaceflight. In addition, although the space environment presents a low-dose radiation (0.177 mGy/day on the satellite), the equivalent simulated radiation dose effects should still be evaluated: radiations of LET-62.2 KeV/µm carbon particles with different cumulative doses (2 mGy, 20 mGy, 200 mGy, 2000 mGy) were further carried out and showed that the 2 mGy radiation still shared the most similar proteomic profiles with spaceflight, confirming the low-dose effects of space radiation. Therefore, in the protein expression level

  15. Modal survey testing of the Lidar In-space Technology Experiment (LITE) - A Space Shuttle payload

    Science.gov (United States)

    Anderson, J. B.; Coleman, A. D.; Driskill, T. C.; Lindell, M. C.

    This paper presents the results of the modal survey test of the Lidar In-space Technology Experiment (LITE), a Space Shuttle payload mounted on a single Spacelab flight pallet. The test was performed by the Dynamics Test Branch at Marshall Space Flight Center, AL, and run in two phases. In the first phase, an unloaded orthogrid connected to the pallet with 52 tension struts was tested. This test included 73 measurement points in three directions. In the second phase, the pallet was integrated with mass simulators mounted on the flight support structure to represent the dynamics (weight and center of gravity) of the various components comprising the LITE experiment, and instrumented at 213 points in three directions. The test article was suspended by an air bag system to simulate a free-free boundary condition. This paper presents the results obtained from the testing and analytical model correlation efforts. The effect of the suspension system on the test article is also discussed.

  16. The ionic structure of liquid sodium obtained by numerical simulation from 'first principles' and ab initio 'norm-conserving' pseudopotentials

    International Nuclear Information System (INIS)

    Harchaoui, N; Hellal, S; Grosdidier, B; Gasser, J G

    2008-01-01

    The physical properties of disordered matter depend on the 'atomic structure', i.e. the arrangement of the atoms. This arrangement is described by the structure factor S(q) in reciprocal space and by the pair correlation function g(r) in real space. The structure factor is obtained experimentally, while numerical simulation gives access to the pair correlation function. Liquid sodium is one of the most studied elements, and one may ask whether a new scientific contribution is appropriate. The majority of theoretical calculations are compared with the experiment of Waseda. However, two later measurements have been published and give different results, in particular with regard to the height of the first peak of the structure factor. Three pseudopotential models are considered to describe the electron-ion interaction. The first is a local pseudopotential, in the 'individual' variant of the model suggested by Fiolhais et al. The second model considered is that of Bachelet et al.; this ab initio, 'norm-conserving' model is non-local. The last model is that proposed by Shaw, known as 'first principles' and 'energy dependent'. Various static dielectric functions characteristic of the effects of exchange and correlation, used and developed by Hellal et al., have been employed. We calculated the form factors (pseudopotential in reciprocal space) and deduced the normalized energy-wave-number characteristic FN(q) and the interatomic pair potential Veff(r), then the pair correlation function g(r) by molecular dynamics. The structure factor S(q) is obtained by Fourier transform and is compared with experiment. Our calculations with the Bachelet and Shaw pseudopotentials are close to the later experiments of Greenfield et al. and of Huijben et al. Our results are discussed.
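For an isotropic liquid, the g(r)-to-S(q) transform mentioned above has the standard form S(q) = 1 + (4πρ/q) ∫ r [g(r) − 1] sin(qr) dr. A minimal numerical sketch follows; the grids and the density value are placeholders, not the paper's data, and an ideal-gas g(r) = 1 is used as a sanity check.

```python
import numpy as np

def structure_factor(r, g, rho, q):
    """S(q) = 1 + (4*pi*rho/q) * integral of r*(g(r)-1)*sin(q*r) dr.

    r, g : radial grid and pair correlation function on it
    rho  : number density; q : array of wavenumbers (consistent units)
    """
    dr = r[1] - r[0]
    integrand = r * (g - 1.0) * np.sin(np.outer(q, r))   # shape (nq, nr)
    return 1.0 + (4.0 * np.pi * rho / q) * integrand.sum(axis=1) * dr

# Sanity check: an ideal gas has g(r) = 1 everywhere, hence S(q) = 1
r = np.linspace(1e-6, 20.0, 2000)
g = np.ones_like(r)
q = np.linspace(0.5, 10.0, 20)
S = structure_factor(r, g, rho=0.024, q=q)   # rho is a placeholder value
```

A simulated g(r) on the same grid would drop straight into `structure_factor` in place of the ideal-gas profile.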

  17. Outlook: The Next Twenty Years

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, Hitoshi

    2003-12-07

    I present an outlook for the next twenty years in particle physics. I start with the big questions in our field, broken down into four categories: horizontal, vertical, heaven, and hell. Then I discuss how we attack the big questions in each category during the next twenty years. I argue for a synergy between many different approaches taken in our field.

  18. Numerical Simulation and Optimization of Hole Spacing for Cement Grouting in Rocks

    Directory of Open Access Journals (Sweden)

    Ping Fu

    2013-01-01

    The fine fissures of V-diabase were the main stratigraphic factor affecting the effectiveness of the foundation grout curtain at the Dagang Mountain Hydropower Station. Thus, specialized in situ grouting tests were conducted to determine reasonable hole spacing and other parameters. Considering the time variation of the rheological parameters of the grout, the variation of the grouting pressure gradient, and the evolution law of the fracture opening, numerical simulations were performed on the diffusion process of cement grouting in the fissures of the rock mass. The distribution of permeability after grouting was obtained on the basis of the analysis results, and the grouting hole spacing was discussed based on the reliability analysis. Based on that analysis, an optimization precision of 0.1 m could be adopted, finer than the 0.5 m accuracy that is commonly used. The results could provide a useful reference for choosing reasonable grouting hole spacing in similar projects.

  19. Simulation of the Plasma Meniscus with and without Space Charge using Triode Extraction System

    International Nuclear Information System (INIS)

    Abdel Rahman, M.M.; EI-Khabeary, H.

    2007-01-01

    In this work, simulation of the singly charged argon ion trajectories for a variable plasma meniscus is studied with and without space charge for the triode extraction system by using the SIMION 3D (Simulation of Ion Optics in Three Dimensions) version 7 personal computer program. The influence of the acceleration voltage applied to the acceleration electrode of the triode extraction system on the shape of the plasma meniscus has been determined. The plasma electrode is set at +5000 volt and the acceleration voltage applied to the acceleration electrode is varied from -5000 volt to +5000 volt. In most of the concave and convex plasma shapes, ion beam emittance can be calculated by using the separate standard deviations of positions and elevation angles. Ion beam emittance as a function of the curvature of the plasma meniscus for different plasma shapes (flat, concave and convex) without space charge, at acceleration voltages varied from -5000 volt to +5000 volt applied to the acceleration electrode of the triode extraction system, has been investigated. The influence of the extraction gap on ion beam emittance for a concave plasma shape of 3.75 mm without space charge at an acceleration voltage V acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been determined. Also, the influence of space charge on ion beam emittance for a variable plasma meniscus at an acceleration voltage V acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been studied.

  20. Simulation of the plasma meniscus with and without space charge using triode extraction system

    International Nuclear Information System (INIS)

    Rahman, M.M.Abdel; El-Khabeary, H.

    2009-01-01

    In this work, simulation of the singly charged argon ion trajectories for a variable plasma meniscus is studied with and without space charge for the triode extraction system by using the SIMION 3D (Simulation of Ion Optics in Three Dimensions) version 7 personal computer program. The influence of the acceleration voltage applied to the acceleration electrode of the triode extraction system on the shape of the plasma meniscus has been determined. The plasma electrode is set at +5000 volt and the acceleration voltage applied to the acceleration electrode is varied from -5000 volt to +5000 volt. In most of the concave and convex plasma shapes, ion beam emittance can be calculated by using the separate standard deviations of positions and elevation angles. Ion beam emittance as a function of the curvature of the plasma meniscus for different plasma shapes (flat, concave and convex) without space charge, at acceleration voltages varied from -5000 volt to +5000 volt applied to the acceleration electrode of the triode extraction system, has been investigated. The influence of the extraction gap on ion beam emittance for a concave plasma shape of 3.75 mm without space charge at an acceleration voltage V acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been determined. Also, the influence of space charge on ion beam emittance for a variable plasma meniscus at an acceleration voltage V acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been studied. (author)
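The emittance estimate from separate standard deviations used in both abstracts generalises to the usual RMS definition, eps = sqrt(&lt;x²&gt;&lt;x′²&gt; − &lt;xx′&gt;²). The sketch below runs it on synthetic uncorrelated Gaussian coordinates (units of mm and mrad assumed for illustration), not on SIMION output.

```python
import numpy as np

def rms_emittance(x, xp):
    """RMS emittance from positions x and divergence angles xp.

    eps = sqrt(<x^2><x'^2> - <x*x'>^2), moments taken about the beam
    centroid; for an uncorrelated beam this reduces to the product of
    the two standard deviations used in the abstracts.
    """
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

# Synthetic uncorrelated beam: expect eps ~ sigma_x * sigma_xp = 0.5
rng = np.random.default_rng(1)
x = rng.normal(scale=1.0, size=100_000)    # positions, mm (assumed)
xp = rng.normal(scale=0.5, size=100_000)   # angles, mrad (assumed)
eps = rms_emittance(x, xp)
```

The correlation term matters for converging or diverging beams, where the naive product of standard deviations overestimates the emittance.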

  1. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data

  2. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  3. Loss of space and dental arch length after the loss of the lower first primary molar: a longitudinal study.

    Science.gov (United States)

    Cuoghi, O A; Bertoz, F A; de Mendonca, M R; Santos, E C

    1998-01-01

    The premature loss of primary teeth may harm normal occlusal development, although there is debate about the necessity of using space maintainer appliances. The aim of this study is to evaluate the changes in dental arch perimeter and the space reduction after the premature loss of the lower first primary molar in the mixed dentition stage. The sample consists of 4 lower arch plaster models from each of 31 patients, covering the pre-extraction period and 6, 12 and 18 months after the lower first primary molar extraction. A reduction of space was noted, with displacement of the cuspid and movement of the permanent incisors toward the extraction site. It was concluded that the premature loss of the lower first primary molar during the mixed dentition indicates the immediate placement of a space maintainer.

  4. ALADIN: the first european lidar in space

    Science.gov (United States)

    Morançais, Didier; Fabre, Frédéric; Schillinger, Marc; Barthès, Jean-Claude; Endemann, Martin; Culoma, Alain; Durand, Yannig

    2017-11-01

    The Atmospheric LAser Doppler INstrument (ALADIN) is the payload of ESA's ADM-Aeolus mission, which aims at measuring wind profiles as required by climatology and meteorology users. ALADIN belongs to a new class of Earth observation payloads and will be the first European lidar in space. The instrument comprises a diode-pumped high-energy Nd:YAG laser and a direct-detection receiver operating on aerosol and molecular backscatter signals in parallel. In addition to the Proto-Flight Model (PFM), two instrument models are developed: a Pre-development Model (PDM) and an Opto-Structure-Thermal Model (OSTM). The flight instrument design and the industrial team have been finalised, and the major equipment is now under development. This paper describes the instrument design and performance as well as the development and verification approach. The main results obtained during the PDM programme are also reported. The ALADIN instrument is developed under the prime contractorship of EADS Astrium SAS with a consortium of thirty European companies.

  5. First observations of iodine oxide from space

    Science.gov (United States)

    Saiz-Lopez, Alfonso; Chance, Kelly; Liu, Xiong; Kurosu, Thomas P.; Sander, Stanley P.

    2007-06-01

    We present retrievals of IO total columns from the Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) satellite instrument. We analyze data for October 2005 in the polar regions to demonstrate for the first time the capability to measure IO column abundances from space. During the period of analysis (i.e. Southern Hemisphere springtime), enhanced IO vertical columns over 3 × 10^13 molecules cm^-2 are observed around coastal Antarctica; by contrast, during that time in the Arctic region, IO is consistently below the calculated instrumental detection limit for individual radiance spectra (2-4 × 10^12 molecules cm^-2 for slant columns). The levels reported here are in reasonably good agreement with previous ground-based measurements at coastal Antarctica. These results also demonstrate that IO is widespread over sea-ice covered areas in the Southern Ocean. The occurrence of elevated IO and its hitherto unrecognized spatial distribution suggest an efficient iodine activation mechanism at a synoptic scale over coastal Antarctica.

  6. Pre-simulation orientation for medical trainees: An approach to decrease anxiety and improve confidence and performance.

    Science.gov (United States)

    Bommer, Cassidy; Sullivan, Sarah; Campbell, Krystle; Ahola, Zachary; Agarwal, Suresh; O'Rourke, Ann; Jung, Hee Soo; Gibson, Angela; Leverson, Glen; Liepert, Amy E

    2018-02-01

    We assessed the effect of a basic orientation to the simulation environment on anxiety, confidence, and clinical decision making. Twenty-four graduating medical students participated in a two-week surgery preparatory curriculum, including three simulations. Baseline anxiety was assessed pre-course. Scenarios were completed on day 2 and day 9. Prior to the first simulation, participants were randomly divided into two groups. Only one group received a pre-simulation orientation. Before the second simulation, all students received the same orientation. Learner anxiety was reported immediately preceding and following each simulation. Confidence was assessed post-simulation. Performance was evaluated by surgical faculty. The oriented group experienced decreased anxiety following the first simulation (p = 0.003); the control group did not. Compared to the control group, the oriented group reported less anxiety and greater confidence and received higher performance scores following all three simulations (all statistically significant). Pre-simulation orientation reduces anxiety while increasing confidence and improving performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century

    Science.gov (United States)

    Ganusov, Vitaly V.

    2016-01-01

    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to follow logically from a set of assumptions. Only when this tool is applied appropriately, as a microscope is used to look at small items, can it help us understand the importance of specific mechanisms/assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows an inconsistency between the model (defined by a specific set of assumptions) and the data. Following the principle of strong inference for the experimental sciences proposed by Platt (1964), I suggest “strong inference in mathematical modeling” as an effective and robust way of using mathematical modeling to understand the mechanisms driving the dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine the reasons why rejected models failed to explain the data; and (4) to suggest experiments that would allow discrimination between the remaining alternative models. The use of strong inference is likely to provide better robustness of the predictions of mathematical models, and it should be strongly encouraged in mathematical modeling-based publications in the Twenty-First century. PMID:27499750

  8. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century.

    Science.gov (United States)

    Ganusov, Vitaly V

    2016-01-01

    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to follow logically from a set of assumptions. Only when this tool is applied appropriately, as a microscope is used to look at small items, can it help us understand the importance of specific mechanisms/assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows an inconsistency between the model (defined by a specific set of assumptions) and the data. Following the principle of strong inference for the experimental sciences proposed by Platt (1964), I suggest "strong inference in mathematical modeling" as an effective and robust way of using mathematical modeling to understand the mechanisms driving the dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine the reasons why rejected models failed to explain the data; and (4) to suggest experiments that would allow discrimination between the remaining alternative models. The use of strong inference is likely to provide better robustness of the predictions of mathematical models, and it should be strongly encouraged in mathematical modeling-based publications in the Twenty-First century.
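The four steps can be made concrete with a toy model-rejection exercise: fit two alternative models to the same data and discard the one the data rejects, here scored with the Akaike information criterion. All data and model choices below are illustrative assumptions, not content of the paper.

```python
import numpy as np

def aic(n, rss, k):
    """Akaike information criterion for a least-squares fit with k parameters."""
    return n * np.log(rss / n) + 2 * k

# Synthetic data generated by an exponential decay plus noise
rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 40)
y = 3.0 * np.exp(-0.8 * t) + rng.normal(scale=0.05, size=t.size)

# Alternative model A: straight line y = a*t + b, fit by least squares
A = np.vstack([t, np.ones_like(t)]).T
_, rss_a, *_ = np.linalg.lstsq(A, y, rcond=None)
rss_a = float(rss_a[0])

# Alternative model B: y = a*exp(-r*t); for each trial rate r the best
# amplitude has the closed form a = sum(y*e) / sum(e*e) with e = exp(-r*t)
def exp_rss(r):
    e = np.exp(-r * t)
    a = np.dot(y, e) / np.dot(e, e)
    return np.sum((y - a * e) ** 2)

rss_b = min(exp_rss(r) for r in np.linspace(0.1, 2.0, 96))

aic_a = aic(t.size, rss_a, k=2)
aic_b = aic(t.size, rss_b, k=2)
# Step 2 of strong inference: model A is inconsistent with the data
# (much higher AIC) and is rejected; model B survives for further tests.
```

In a real analysis the surviving model would then be probed with new experiments designed to discriminate it from the remaining alternatives (step 4).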

  9. Twenty Practices of an Entrepreneurial University

    DEFF Research Database (Denmark)

    Gjerding, Allan Næs; Wilderom, Celeste P.M.; Cameron, Shona P.B.

    2006-01-01

    studies twenty organisational practices against which a University's entrepreneurship can be measured. These twenty practices or factors in effect formed the basis for an entrepreneurship audit. During a series of interviews, the extent to which the universities are seen as entrepreneurial...

  10. Computer graphics testbed to simulate and test vision systems for space applications

    Science.gov (United States)

    Cheatham, John B.

    1991-01-01

    Artificial intelligence concepts are applied to robotics. Artificial neural networks, expert systems, and laser imaging techniques for autonomous space robots are being studied. A computer graphics laser range finder simulator developed by Wu has been used by Weiland and Norwood to study the use of artificial neural networks for path planning and obstacle avoidance. Interest is expressed in applications of CLIPS, NETS, and Fuzzy Control, which are applied to robot navigation.

  11. Surgical Space Suits Increase Particle and Microbiological Emission Rates in a Simulated Surgical Environment.

    Science.gov (United States)

    Vijaysegaran, Praveen; Knibbs, Luke D; Morawska, Lidia; Crawford, Ross W

    2018-05-01

    The role of space suits in the prevention of orthopedic prosthetic joint infection remains unclear. Recent evidence suggests that space suits may in fact contribute to increased infection rates, with bioaerosol emissions from space suits identified as a potential cause. This study aimed to compare the particle and microbiological emission rates (PER and MER) of space suits and standard surgical clothing. A comparison of emission rates between space suits and standard surgical clothing was performed in a simulated surgical environment during 5 separate experiments. Particle counts were analyzed with 2 separate particle counters capable of detecting particles between 0.1 and 20 μm. An Andersen impactor was used to sample bacteria, with culture counts performed at 24 and 48 hours. Four experiments consistently showed statistically significant increases in both PER and MER when space suits are used compared with standard surgical clothing. One experiment showed inconsistent results, with a trend toward increases in both PER and MER when space suits are used compared with standard surgical clothing. Space suits cause increased PER and MER compared with standard surgical clothing. This finding provides mechanistic evidence to support the increased prosthetic joint infection rates observed in clinical studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Global Vlasov simulation on magnetospheres of astronomical objects

    International Nuclear Information System (INIS)

    Umeda, Takayuki; Ito, Yosuke; Fukazawa, Keiichiro

    2013-01-01

    Space plasma is a collisionless, multi-scale, and highly nonlinear medium. There are various types of self-consistent computer simulations that treat space plasma according to various approximations. We develop numerical schemes for solving the Vlasov (collisionless Boltzmann) equation, which is the first-principle kinetic equation for collisionless plasma. The weak-scaling benchmark test shows that our parallel Vlasov code achieves a high performance and a high scalability. Currently, we use more than 1000 cores for parallel computations and apply the present parallel Vlasov code to various cross-scale processes in space plasma, such as a global simulation on the interaction between solar/stellar wind and magnetospheres of astronomical objects
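A grid-based Vlasov code of the kind described advances f(x, v, t) directly in phase space. A common building block is the semi-Lagrangian free-streaming substep, sketched below for one periodic spatial dimension; this is a generic textbook scheme offered for illustration, not the authors' parallel code.

```python
import numpy as np

def free_stream(f, x, v, dt):
    """Semi-Lagrangian x-advection substep of a 1D-1V Vlasov solver.

    f[i, j] ~ f(x_i, v_j); each velocity column is traced back along its
    characteristic x - v*dt and re-sampled with periodic linear interpolation.
    """
    nx = x.size
    dx = x[1] - x[0]
    cells = np.arange(nx)
    out = np.empty_like(f)
    for j, vj in enumerate(v):
        s = vj * dt / dx              # departure-point shift, in cells
        i0 = int(np.floor(s))
        w = s - i0                    # fractional part in [0, 1)
        idx = (cells - i0) % nx       # integer part of the departure index
        out[:, j] = (1.0 - w) * f[idx, j] + w * f[(idx - 1) % nx, j]
    return out

# Demo: a Gaussian blob in phase space drifts; linear interpolation on a
# periodic grid conserves the total mass (sum over the grid) exactly.
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
v = np.linspace(-2.0, 2.0, 16)
X, V = np.meshgrid(x, v, indexing="ij")
f = np.exp(-((X - np.pi) ** 2) - V ** 2)
f1 = free_stream(f, x, v, dt=0.1)
```

A full solver alternates this substep with an acceleration substep in v using the self-consistent electric field, which is the splitting such codes typically employ.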

  13. Space Debris Elimination (SpaDE)

    Data.gov (United States)

    National Aeronautics and Space Administration — The amount of debris in low Earth orbit (LEO) has increased rapidly over the last twenty years. This prevalence of debris increases the likelihood of cascading...

  14. Framatome's ''SAF'' engineering simulator: a first step toward defining the engineer's simulation tool of the year 2000

    International Nuclear Information System (INIS)

    Constantieux, T.

    1986-01-01

    Among the techniques available to engineers today, computerized simulation is taking on an ever-growing importance. The ''SAF'' simulator, designed by Framatome for the use of its own engineers, has been in service since 1985. The SAF simulator provides continuous assistance to the engineer, from the preliminary design stage to the precise definition of operating procedures, including safety analysis and sizing computations. For the engineer of the year 2000, who will be used to dialoguing with the computer from a very young age, the SAF represents a first step toward a comprehensive simulation tool. Interactive and thus ''alive'', the SAF combines both extensive programming and data processing capabilities. Its simulation domain can still be considerably extended. Highly modular and equipped with easy-to-use compilers, the SAF can be readily modified and reconfigured by the user, to enable testing new models or new systems, in the complex and detailed environment of the nuclear unit being analysed. Employing the advanced computer programs used in project design, the SAF simulator is a particularly high-performance tool for simulating and analysing complex accident scenarios, including multiple equipment failures and possible operator errors, which may extend to complete draining of the reactor vessel and the release of radioactive fission products within the containment structure

  15. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  16. First order simulations on time measurements using inorganic scintillators for PET applications

    International Nuclear Information System (INIS)

    Joly, B.; Montarou, G.; Pauna, N.

    2008-01-01

    Time measurements based on scintillating crystals are used in many different experimental set-ups in high energy physics, nuclear physics and medical imaging (e.g. PET). Time-of-flight (TOF) positron emission tomography (PET) is based on the measurement of the difference between the detection times of the two gammas arising from positron decays. The fundamental improvement of TOF is an increase in signal-to-noise ratio, which translates into a sensitivity improvement. The conventional method for time measurements is based on the detection of the first photoelectrons. Recently, in LHC experiments and more particularly for electromagnetic calorimeters, a fully digital method based on optimal filtering that considers samples of the entire signal was successfully applied. Since such a method ultimately allows time resolutions of about a few tens of picoseconds, for this report first order simulations were performed using a simplified model of a detection block made of a PMT coupled to a LYSO or LaBr3 crystal. These simulations were carried out to estimate time resolutions with the conventional method (first photoelectron detection with a CFD) or with optimal filtering. A hybrid method, to be applied with fast-running front-end electronics, is also tested. These simulations will be the basis for future experimental studies. (authors)

  17. First order simulations on time measurements using inorganic scintillators for PET applications

    Energy Technology Data Exchange (ETDEWEB)

    Joly, B.; Montarou, G.; Pauna, N

    2008-07-01

    Time measurements based on scintillating crystals are used in many different experimental set-ups in high energy physics, nuclear physics and medical imaging (e.g. PET). Time-of-flight (TOF) positron emission tomography (PET) is based on the measurement of the difference between the detection times of the two gammas arising from positron decays. The fundamental improvement of TOF is an increase in signal-to-noise ratio, which translates into a sensitivity improvement. The conventional method for time measurements is based on the detection of the first photoelectrons. Recently, in LHC experiments and more particularly for electromagnetic calorimeters, a fully digital method based on optimal filtering that considers samples of the entire signal was successfully applied. Since such a method ultimately allows time resolutions of about a few tens of picoseconds, for this report first order simulations were performed using a simplified model of a detection block made of a PMT coupled to a LYSO or LaBr3 crystal. These simulations were carried out to estimate time resolutions with the conventional method (first photoelectron detection with a CFD) or with optimal filtering. A hybrid method, to be applied with fast-running front-end electronics, is also tested. These simulations will be the basis for future experimental studies. (authors)
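Optimal filtering of the whole digitised waveform can be reduced to a two-parameter linear fit: for a small time shift tau, samples ≈ A·g − A·tau·g′, so solving for A and A·tau and taking their ratio recovers tau. The sketch below is a generic illustration with an assumed bi-exponential pulse shape and 1 ns sampling, not the authors' detector model.

```python
import numpy as np

def of_time(samples, g, gdot):
    """Pulse timing by the optimal-filtering linearisation.

    samples ~ A*g - A*tau*gdot for a small shift tau, so a linear least
    squares over the whole waveform yields A and A*tau; their ratio is
    the time offset in the same units as the sample grid.
    """
    M = np.column_stack([g, -gdot])
    (a, atau), *_ = np.linalg.lstsq(M, samples, rcond=None)
    return atau / a

# Assumed normalised scintillator pulse (3 ns rise, 40 ns decay, 1 ns sampling)
t = np.arange(0.0, 40.0, 1.0)

def shape(t0):
    u = np.clip(t - t0, 0.0, None)
    return np.exp(-u / 40.0) - np.exp(-u / 3.0)

def shape_dot(t0):
    u = np.clip(t - t0, 0.0, None)
    return np.where(t > t0, -np.exp(-u / 40.0) / 40.0 + np.exp(-u / 3.0) / 3.0, 0.0)

g = shape(10.5)            # reference template and its derivative
gdot = shape_dot(10.5)
samples = shape(10.7)      # noiseless pulse shifted by tau = 0.2 ns
tau = of_time(samples, g, gdot)
```

With noise added to `samples`, repeating the fit over many waveforms gives the time resolution of the method, which is the quantity the report estimates.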

  18. An Optical Lightning Simulator in an Electrified Cloud-Resolving Model to Prepare the Future Space Lightning Missions

    Science.gov (United States)

    Bovalo, Christophe; Defer, Eric; Pinty, Jean-Pierre

    2016-04-01

    The future decade will see the launch of several space missions designed to monitor the total lightning activity. Among these missions, the American (Geostationary Lightning Mapper - GLM) and European (Lightning Imager - LI) optical detectors will be onboard geostationary satellites (GOES-R and MTG, respectively). For the first time, the total lightning activity will be monitored over the full Earth disk and at a very high temporal resolution (2 and 1 ms, respectively). Missions like the French Tool for the Analysis of Radiation from lightNIng and Sprites (TARANIS) and ISS-LIS will bring complementary information in order to better understand lightning physics and to improve weather prediction (nowcasting and forecasting). Such missions will generate a huge volume of new and original observations for which the scientific community and weather prediction centers have to be prepared. Moreover, before the launch of these missions, fundamental questions regarding the interpretation of the optical signal properties and their relation to cloud optical thickness and lightning discharge processes need to be further investigated. An innovative approach proposed here is to use the synergy existing in the French MesoNH cloud-resolving model (CRM). Indeed, MesoNH is one of the only CRMs able to simulate the lifecycle of electrical charges generated within clouds through the non-inductive charging process (dependent on the 1-moment microphysical scheme). The lightning flash geometry is based on a fractal law, while the electric field is diagnosed thanks to Gauss' law. The lightning optical simulator is linked to the electrical scheme, as the lightning radiance at 777.4 nm is a function of the lightning current, approximated by the charges neutralized along the lightning path. Another important part is the scattering of this signal by hydrometeors (mainly ice particles), which is taken into account. Simulations at 1-km resolution are done over the Langmuir Laboratory (New

  19. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    Directory of Open Access Journals (Sweden)

    Xueli Chen

    2010-01-01

    Full Text Available During the past decade, the Monte Carlo method has found wide application in optical imaging for simulating the photon transport process inside tissues. However, the method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, consisting of the simulation of photon transport both in tissues and in free space. Specifically, a simplified lens-system model is used to represent the camera lens of the optical imaging system, and the Monte Carlo method is employed to describe the energy transfer from the tissue surface to the CCD camera. The focusing effect of the camera lens is also considered, to establish the correspondence between points on the tissue surface and on the CCD camera. Furthermore, a parallel version of the framework is implemented, making the simulation much more convenient and efficient. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom against real experimental results.
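    The free-space stage of such a framework can be illustrated with a minimal Monte Carlo sketch: photons leave surface points with cosine-weighted (Lambertian) directions, those reaching an ideal thin-lens aperture are focused onto the image of their source point via the thin-lens conjugate relation, and hits are binned on a CCD pixel grid. All geometry, parameter values, and the function name below are illustrative assumptions, not the paper's actual model.

```python
import math
import random

def free_space_mc(sources, n_photons=20000, z_obj=60.0, f=40.0,
                  aperture=8.0, pixels=16, ccd_half=6.0, seed=42):
    """Hypothetical sketch of free-space photon transport to a CCD.

    sources: list of (x, y, weight) points on the tissue surface.
    Returns a pixels-by-pixels list of photon counts."""
    rng = random.Random(seed)
    z_img = 1.0 / (1.0 / f - 1.0 / z_obj)   # thin lens: 1/z_obj + 1/z_img = 1/f
    mag = z_img / z_obj                     # lateral magnification
    image = [[0] * pixels for _ in range(pixels)]
    total_w = sum(w for _, _, w in sources)
    for sx, sy, w in sources:
        for _ in range(int(n_photons * w / total_w)):
            theta = math.asin(math.sqrt(rng.random()))  # Lambertian polar angle
            phi = 2.0 * math.pi * rng.random()
            # position where the ray crosses the lens plane at distance z_obj
            lx = sx + math.tan(theta) * math.cos(phi) * z_obj
            ly = sy + math.tan(theta) * math.sin(phi) * z_obj
            if lx * lx + ly * ly > aperture * aperture:
                continue                    # photon vignetted by the aperture stop
            # ideal focusing: the photon lands on the image of its source point
            ix, iy = -mag * sx, -mag * sy
            px = int((ix + ccd_half) / (2.0 * ccd_half) * pixels)
            py = int((iy + ccd_half) / (2.0 * ccd_half) * pixels)
            if 0 <= px < pixels and 0 <= py < pixels:
                image[py][px] += 1
    return image
```

    Because the lens here is ideal, every accepted photon from one surface point lands in the same pixel; a real model would add the point spread function and detector effects on top of this geometry.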

  20. A Painter's View of the Cosmos In the Twenty-first Century

    Science.gov (United States)

    Cro-Ken, K.

    2016-01-01

    I am an ecosystem artist who uses paint to bring nature's “invisible forces” into view. My eco-sensitive palette recreates the push-pull forces that shape and mold all things. As a result, I create microscopic and telescopic views of earth and places scattered throughout our universe. Self-similarity led me to realize that if I want my mind to wander into the far reaches of the universe, I must draw closer to nature. I show how space looks and, more importantly, how it moves. My speed element palette is a portal through which I peer into the universe at scales great and small using paint as my lens. Microscopes, telescopes, the Internet, and even eyeglasses are portals through which technology affords us the ability to see that which is unseen to the unaided eye. Rather than see the world and then paint, the opposite is true for me. My work is revelatory, not representational and, as such, seeks similar occurrences in nature. Just as a planet's surface is a visual record of past events, so too do speed element experiments reveal traces of the past. It would be more accurate to call a painting that comes to rest a “painted.” It is video that captures images that eluded capture by the canvas and could more accurately be called a “painting.” Simply put, I manipulate space, time, and matter—and the matter is never just paint.

  1. Isobaric-isothermal Monte Carlo simulations from first principles: Application to liquid water at ambient conditions

    Energy Technology Data Exchange (ETDEWEB)

    McGrath, M; Siepmann, J I; Kuo, I W; Mundy, C J; VandeVondele, J; Hutter, J; Mohamed, F; Krack, M

    2004-12-02

    A series of first principles Monte Carlo simulations in the isobaric-isothermal ensemble was carried out for liquid water at ambient conditions (T = 298 K and p = 1 atm). The Becke-Lee-Yang-Parr (BLYP) exchange and correlation energy functionals and norm-conserving Goedecker-Teter-Hutter (GTH) pseudopotentials were employed with the CP2K simulation package to examine systems consisting of 64 water molecules. The fluctuations in the system volume encountered in simulations in the isobaric-isothermal ensemble require a reconsideration of the suitability of the typical charge density cutoff and the regular grid generation method previously used for the computation of the electrostatic energy in first principles simulations in the microcanonical or canonical ensembles. In particular, it is noted that a much higher cutoff is needed and that the most computationally efficient method of creating grids can result in poor simulations. Analysis of the simulation trajectories using a very large charge density cutoff of 1200 Ry and four different grid generation methods points to a substantially underestimated liquid density of about 0.85 g/cm³, resulting in a somewhat understructured liquid (with a value of about 2.7 for the height of the first peak in the oxygen/oxygen radial distribution function) for BLYP-GTH water at ambient conditions.
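    The volume fluctuations central to this work come from the standard Metropolis volume move of isobaric-isothermal (NPT) Monte Carlo. The acceptance rule can be sketched for the trivial case of an ideal gas, where the potential energy change vanishes and only the p·ΔV work term and the V^N phase-space factor remain, so the exact answer ⟨V⟩ = (N+1)kT/p is known. All parameter values and the function name are illustrative, not from the paper.

```python
import math
import random

def npt_ideal_gas_mean_volume(n_part=20, pressure=1.0, beta=1.0,
                              n_steps=200000, dv_max=2.0, seed=7):
    """Toy NPT Monte Carlo: volume moves only, for an ideal gas (dU = 0).

    Acceptance for a uniform-in-V proposal:
        acc = min(1, (Vn/Vo)^N * exp(-beta * (dU + p * (Vn - Vo)))).
    Returns the sampled mean volume, which should approach (N+1)/(beta*p)."""
    rng = random.Random(seed)
    vol = float(n_part)                       # arbitrary starting volume
    total, samples = 0.0, 0
    for step in range(n_steps):
        new_vol = vol + (2.0 * rng.random() - 1.0) * dv_max
        if new_vol > 0.0:
            arg = (n_part * math.log(new_vol / vol)
                   - beta * pressure * (new_vol - vol))   # dU = 0 for ideal gas
            if rng.random() < math.exp(min(0.0, arg)):
                vol = new_vol
        if step >= n_steps // 4:              # discard equilibration phase
            total += vol
            samples += 1
    return total / samples
```

    In the first-principles case the ΔU term is the difference of DFT total energies between the old and new volumes, which is exactly where the charge-density-cutoff and grid-generation issues discussed in the abstract enter.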

  2. Creating the Deep Space Environment for Testing the James Webb Space Telescope at NASA Johnson Space Center's Chamber A

    Science.gov (United States)

    Homan, Jonathan L.; Cerimele, Mary P.; Montz, Michael E.; Bachtel, Russell; Speed, John; O'Rear, Patrick

    2013-01-01

    Chamber A is the largest thermal vacuum chamber at the Johnson Space Center and is one of the largest space environment chambers in the world. The chamber is 19.8 m (65 ft) in diameter and 36.6 m (120 ft) tall and is equipped with cryogenic liquid nitrogen panels (shrouds) and gaseous helium shrouds to create a simulated space environment. It was originally designed and built in the mid-1960s to test the Apollo Command and Service Module, and several manned tests were conducted on that spacecraft, contributing to the success of the program. The chamber has been used since that time to test spacecraft active thermal control systems, Shuttle DTO, DOD, and ESA hardware in simulated Low Earth Orbit (LEO) conditions. NASA is now moving from LEO towards exploration of locations with environments approaching those of deep space. Therefore, Chamber A has undergone major modifications to enable it to simulate these deeper space environments. Environmental requirements were driven, and modifications were funded, by the James Webb Space Telescope program, and this telescope, which will orbit Sun/Earth L2, will be the first test article to benefit from the chamber's new capabilities. To accommodate JWST, the Chamber A high vacuum system has been modernized, additional LN2 shrouds have been installed, the liquid nitrogen system has been modified to minimize dependency on electrical power and increase its reliability, a new helium shroud/refrigeration system has been installed to create a colder, more stable, and uniform heat sink, and the controls have been updated to increase the level of automation and improve operator interfaces. Testing of these major modifications was conducted in August of 2012, and this initial test was very successful, with all major systems exceeding their performance requirements. This paper will outline the changes in overall environmental requirements, discuss the technical design data that was used in the decisions leading to the extensive

  3. Creating the Deep Space Environment for Testing the James Webb Space Telescope at the Johnson Space Center's Chamber A

    Science.gov (United States)

    Homan, Jonathan L.; Cerimele, Mary P.; Montz, Michael E.

    2012-01-01

    Chamber A is the largest thermal vacuum chamber at the Johnson Space Center and is one of the largest space environment chambers in the world. The chamber is 19.8 m (65 ft) in diameter and 36.6 m (120 ft) tall and is equipped with cryogenic liquid nitrogen panels (shrouds) and gaseous helium shrouds to create a simulated space environment. It was originally designed and built in the mid-1960s to test the Apollo Command and Service Module, and several manned tests were conducted on that spacecraft, contributing to the success of the program. The chamber has been used since that time to test spacecraft active thermal control systems, Shuttle DTO, DOD, and ESA hardware in simulated Low Earth Orbit (LEO) conditions. NASA is now moving from LEO towards exploration of locations with environments approaching those of deep space. Therefore, Chamber A has undergone major modifications to enable it to simulate these deeper space environments. Environmental requirements were driven, and the modifications were funded, by the James Webb Space Telescope program, and this telescope, which will orbit Sun/Earth L2, will be the first test article to benefit from the chamber's new capabilities. To accommodate JWST, the Chamber A high vacuum system has been modernized, additional LN2 shrouds have been installed, the liquid nitrogen system has been modified to remove dependency on electrical power and increase its reliability, a new helium shroud/refrigeration system has been installed to create a colder, more stable, and uniform heat sink, and the controls have been updated to increase the level of automation and improve operator interfaces. Testing of these major modifications was conducted in August 2012, and this initial test was very successful, with all major systems exceeding their performance requirements. This paper will outline the changes in the overall environmental requirements, discuss the technical design data that was used in the decisions leading to the extensive

  4. Fast, Accurate Memory Architecture Simulation Technique Using Memory Access Characteristics

    OpenAIRE

    小野, 貴継; 井上, 弘士; 村上, 和彰

    2007-01-01

    This paper proposes a fast and accurate memory architecture simulation technique. Designing memory architecture commonly begins with trace-driven simulation; however, expanding the design space increases the evaluation time. Fast simulation can be achieved by reducing the trace size, but this reduces simulation accuracy. Our approach reduces the simulation time while maintaining the accuracy of the simulation results. In order to evaluate validity of proposed techniq...
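    The core of trace-driven simulation can be illustrated with a minimal sketch (not the authors' simulator): a model of a direct-mapped cache consumes a trace of memory addresses and reports hits and misses. Trace-size reduction techniques shorten the address list this loop consumes, which is what trades evaluation time against accuracy. All parameters are illustrative.

```python
def simulate_cache(trace, n_sets=256, line_bytes=64):
    """Trace-driven simulation of a direct-mapped cache (illustrative sketch).

    trace: iterable of byte addresses. Returns (hits, misses)."""
    tags = [None] * n_sets       # one tag per set; None means the line is empty
    hits = 0
    for addr in trace:
        line = addr // line_bytes        # cache-line number of this access
        set_idx = line % n_sets          # which set the line maps to
        tag = line // n_sets             # tag identifying the line within the set
        if tags[set_idx] == tag:
            hits += 1                    # hit: the line is already resident
        else:
            tags[set_idx] = tag          # miss: fill (evicting any old line)
    return hits, len(trace) - hits
```

    For example, the trace `[0, 0, 64, 0]` yields 2 hits and 2 misses: the first access to each line misses, repeats of a resident line hit.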

  5. Toward multi-scale simulation of reconnection phenomena in space plasma

    Science.gov (United States)

    Den, M.; Horiuchi, R.; Usami, S.; Tanaka, T.; Ogawa, T.; Ohtani, H.

    2013-12-01

    Magnetic reconnection is considered to play an important role in space phenomena such as substorms in the Earth's magnetosphere. It is well known that magnetic reconnection is controlled by microscopic kinetic mechanisms. The frozen-in condition is broken due to particle kinetic effects, and collisionless reconnection is triggered when the current sheet is compressed to ion kinetic scales under the influence of an external driving flow. On the other hand, the configuration of the magnetic field leading to the formation of the diffusion region is determined on macroscopic scales, and the topological change after reconnection is also expressed on macroscopic scales. Thus magnetic reconnection is a typical multi-scale phenomenon in which microscopic and macroscopic physics are strongly coupled. Recently Horiuchi et al. developed an effective resistivity model based on particle-in-cell (PIC) simulation results obtained in a study of collisionless driven reconnection and applied it to a global magnetohydrodynamics (MHD) simulation of a substorm in the Earth's magnetosphere. They reproduced global substorm behavior, such as dipolarization and flux rope formation, with a global three-dimensional MHD simulation. Usami et al. developed a multi-hierarchy simulation model in which macroscopic and microscopic physics are solved self-consistently and simultaneously. Based on the domain decomposition method, this model consists of three parts: an MHD algorithm for macroscopic global dynamics, a PIC algorithm for microscopic kinetic physics, and an interface algorithm to interlock the macro and micro hierarchies. They verified the interface algorithm by simulating plasma injection flow. In their latest work, this model was applied to collisionless reconnection in an open system and magnetic reconnection was successfully reproduced. In this paper, we describe our approach to clarifying multi-scale phenomena and report the current status. Our recent study on extension of the MHD domain to the global system is presented.

  6. Simulations of VLBI observations of a geodetic satellite providing co-location in space

    Science.gov (United States)

    Anderson, James M.; Beyerle, Georg; Glaser, Susanne; Liu, Li; Männel, Benjamin; Nilsson, Tobias; Heinkelmann, Robert; Schuh, Harald

    2018-02-01

    We performed Monte Carlo simulations of very-long-baseline interferometry (VLBI) observations of Earth-orbiting satellites incorporating co-located space-geodetic instruments in order to study how well the VLBI frame and the spacecraft frame can be tied using such measurements. We simulated observations of spacecraft by VLBI observations, time-of-flight (TOF) measurements using a time-encoded signal in the spacecraft transmission, similar in concept to precise point positioning, and differential VLBI (D-VLBI) observations using angularly nearby quasar calibrators to compare their relative performance. We used the proposed European Geodetic Reference Antenna in Space (E-GRASP) mission as an initial test case for our software. We found that the standard VLBI technique is limited, in part, by the present lack of knowledge of the absolute offset of VLBI time to Coordinated Universal Time at the level of microseconds. TOF measurements are better able to overcome this problem and provide frame ties with uncertainties in translation and scale nearly a factor of three smaller than those yielded from VLBI measurements. If the absolute time offset issue can be resolved by external means, the VLBI results can be significantly improved and can come close to providing 1 mm accuracy in the frame tie parameters. D-VLBI observations with optimum performance assumptions provide roughly a factor of two higher uncertainties for the E-GRASP orbit. We additionally simulated how station and spacecraft position offsets affect the frame tie performance.

  7. Space Environmental Viewing and Analysis Network (SEVAN) – characteristics and first operation results

    International Nuclear Information System (INIS)

    Chilingarian, Ashot; Arakelyan, Karen; Avakyan, Karen; Bostanjyan, Nikolaj; Chilingaryan, Suren; Pokhsraryan, D; Sargsyan, D; Reymers, A

    2013-01-01

    The Space Environmental Viewing and Analysis Network (SEVAN) is a worldwide network of identical particle detectors located at middle and low latitudes, aimed at improving fundamental research on space weather conditions and at providing short- and long-term forecasts of the dangerous consequences of space storms. SEVAN detects changing fluxes of different species of secondary cosmic rays at different altitudes and latitudes, making it a powerful integrated device for exploring solar modulation effects. To date, SEVAN modules are installed at the Aragats Space Environmental Centre in Armenia (3 units at altitudes of 800, 2000 and 3200 m a.s.l.), in Bulgaria (Moussala), Croatia, and India (New Delhi, JNU), and are now under installation in Slovakia (Lomnický štít). Recently, SEVAN detectors were used for research on new high-energy phenomena originating in the terrestrial atmosphere – Thunderstorm Ground Enhancements (TGEs). In 2011 the first joint measurements of solar modulation effects were made by the SEVAN network and are now under analysis.

  8. Space inhomogeneity and detuning effects in a laser with a saturable absorber: a first-order approximation

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Fernandez, P.; Velarde, M.G.

    1988-05-01

    To a first approximation the effects of detuning and/or space inhomogeneity on the stability domain of a model for a laser with a saturable absorber are presented. It appears that the space dependence increases the domain of the emissionless state, thus delaying the laser action.

  9. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA-funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only in place since the end of May 2000, the study is still in an early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end product will be a Monte Carlo-based code which will complement the existing analytic codes, such as BRYNTRN/HZETRN, presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  10. Building Interdisciplinary Leadership Skills among Health Practitioners in the Twenty-First Century: An Innovative Training Model.

    Science.gov (United States)

    Negandhi, Preeti; Negandhi, Himanshu; Tiwari, Ritika; Sharma, Kavya; Zodpey, Sanjay P; Quazi, Zahiruddin; Gaidhane, Abhay; Jayalakshmi N; Gijare, Meenakshi; Yeravdekar, Rajiv

    2015-01-01

    Transformational learning is the focus of twenty-first century global educational reforms. In India, there is a need to amalgamate the skills and knowledge of medical, nursing, and public health practitioners and to develop robust leadership competencies among them. This initiative proposed to identify interdisciplinary leadership competencies among Indian health practitioners and to develop a training program for interdisciplinary leadership skills through an Innovation Collaborative. Medical, nursing, and public health institutions partnered in this endeavor. An exhaustive literature search was undertaken to identify leadership competencies in these three professions. Published evidence was utilized in searching for the need for interdisciplinary training of health practitioners, including current scenarios in interprofessional health education and the key competencies required. The interdisciplinary leadership competencies identified were self-awareness, vision, self-regulation, motivation, decisiveness, integrity, interpersonal communication skills, strategic planning, team building, innovation, and being an effective change agent. Subsequently, a training program was developed, and three training sessions were piloted with 66 participants. Each cohort comprised a mix of participants from different disciplines. The pilot training guided the development of a training model for building interdisciplinary leadership skills and organizing interdisciplinary leadership workshops. The need for interdisciplinary leadership competencies is recognized. The long-term objective of the training model is integration into the regular medical, nursing, and public health curricula, with the aim of developing interdisciplinary leadership skills among them. Although challenging, formal incorporation of leadership skills into health professional education is possible within the interdisciplinary classroom setting using principles of transformative learning.

  11. Application of the Intervention Mapping Framework to Develop an Integrated Twenty-first Century Core Curriculum-Part Two: Translation of MPH Core Competencies into an Integrated Theory-Based Core Curriculum.

    Science.gov (United States)

    Corvin, Jaime A; DeBate, Rita; Wolfe-Quintero, Kate; Petersen, Donna J

    2017-01-01

    In the twenty-first century, the dynamics of health and health care are changing, necessitating a commitment to revising traditional public health curricula to better meet present-day challenges. This article describes how the College of Public Health at the University of South Florida utilized the Intervention Mapping framework to translate revised core competencies into an integrated, theory-driven core curriculum to meet the training needs of the twenty-first century public health scholar and practitioner. This process resulted in the development of four sequenced courses: History and Systems of Public Health and Population Assessment I, delivered in the first semester, and Population Assessment II and Translation to Practice, delivered in the second semester. While the transformation process, moving from traditional public health core content to an integrated and innovative curriculum, is a challenging and daunting task, Intervention Mapping provides an ideal framework for guiding it. Intervention Mapping walks the curriculum developers from broad goals and objectives to the fine details of a lesson plan. Throughout this process, critical lessons were learned, including the importance of being open to new ideologies and frameworks and the critical need to involve key stakeholders in every step of the decision-making process to ensure the sustainability of the resulting integrated, theory-based curriculum. Ultimately, as a stronger curriculum emerged, the developers and instructors themselves were changed, fostering a stronger public health workforce from within.

  12. A multiprocessor computer simulation model employing a feedback scheduler/allocator for memory space and bandwidth matching and TMR processing

    Science.gov (United States)

    Bradley, D. B.; Irwin, J. D.

    1974-01-01

    A computer simulation model for a multiprocessor computer is developed that is useful for studying the problem of matching a multiprocessor's memory space, memory bandwidth, and numbers and speeds of processors with aggregate job-set characteristics. The model assumes an input workload consisting of a set of recurrent jobs. The model includes a feedback scheduler/allocator which attempts to improve system performance through higher memory bandwidth utilization by matching individual job requirements for space and bandwidth with space availability and estimates of bandwidth availability at the times of memory allocation. The simulation model includes provisions for specifying precedence relations among the jobs in a job set, and provisions for specifying precedence execution of TMR (Triple Modular Redundant) and SIMPLEX (non-redundant) jobs.

  13. The OTTI space experiments

    International Nuclear Information System (INIS)

    Brewer, D.A.; Clifton, K.S.; Pearson, S.D.; Barth, J.L.; LaBel, K.; Ritter, J.C.; Peden, J.; Campbell, A.; Liang, R.

    1999-01-01

    The Orbiting Technology Testbed Initiative (OTTI) provides a concept for a series of space experiment platforms to be flown at 2-year intervals over the next ten years. The long-term purpose of this program is to provide convenient test-beds that simulate high-radiation environments. The purpose of the first platform is to evaluate the on-orbit performance of novel, emerging, breakthrough technologies and advanced state-of-the-art devices in high-radiation orbits and to provide correlations between the natural space radiation environment and the device response in the flight test-bed. This short article presents the concept of the OTTI program.

  14. Simulation of Cascaded Longitudinal-Space-Charge Amplifier at the Fermilab Accelerator Science & Technology (Fast) Facility

    Energy Technology Data Exchange (ETDEWEB)

    Halavanau, A. [Northern Illinois U.; Piot, P. [Northern Illinois U.

    2015-12-01

    Cascaded Longitudinal Space Charge Amplifiers (LSCAs) have been proposed as a mechanism for generating density modulation over a broad spectral range. The scheme has recently been demonstrated in the optical regime and has confirmed the production of broadband optical radiation. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at the Fermilab Accelerator Science & Technology (FAST) facility for producing broadband ultraviolet radiation. Our studies are carried out using elegant with its included tree-based, gridless space-charge algorithm.

  15. Non-International Armed Conflict in the Twenty-first Century

    Science.gov (United States)

    2012-01-01

    that "this extension has not taken place in the form of a full and mechanical transplant of those rules to internal conflict; rather, the general...firmly implanted in the international legal mind-set. NIAC jus in bello governs armed conflicts above either the first or the second threshold...hand, and those of the Federal Republic of Yugoslavia (Serbia and Montenegro), on the other. Yet, the majority of the Chamber (Judges Stephen and

  16. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    Science.gov (United States)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization
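    The radiometric modeling described above evaluates a reflectance (BRDF) term per surface facet. As a hedged illustration only, the following sketches a Phong-style term with a diffuse lobe and a specular lobe about the mirror direction; the coefficients, normalization, and function name are illustrative assumptions, not actual PROXOR™ parameters.

```python
import math

def phong_brdf(n, l, v, kd=0.3, ks=0.2, shininess=10.0):
    """Illustrative Phong-style reflectance for one facet.

    n, l, v: unit surface normal, light (sun) direction, and view (sensor)
    direction as 3-tuples, each pointing away from the surface."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    ndl = max(0.0, dot(n, l))                 # incident cosine factor
    # mirror reflection of the light direction about the normal
    r = tuple(2.0 * ndl * nc - lc for nc, lc in zip(n, l))
    spec = max(0.0, dot(r, v)) ** shininess   # specular lobe about r
    return kd / math.pi * ndl + ks * spec * ndl
```

    Summing such a term over a satellite's facets, weighted by facet area and visibility, gives the radiance the sensor model then convolves with the PSF and degrades with detector noise.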

  17. Talent management for the twenty-first century.

    Science.gov (United States)

    Cappelli, Peter

    2008-03-01

    Most firms have no formal programs for anticipating and fulfilling talent needs, relying on an increasingly expensive pool of outside candidates that has been shrinking since it was created from the white-collar layoffs of the 1980s. But the advice these companies are getting to solve the problem--institute large-scale internal development programs--is equally ineffective. Internal development was the norm back in the 1950s, and every management-development practice that seems novel today was routine in those years--from executive coaching to 360-degree feedback to job rotation to high-potential programs. However, the stable business environment and captive talent pipelines in which such practices were born no longer exist. It's time for a fundamentally new approach to talent management. Fortunately, companies already have such a model, one that has been well honed over decades to anticipate and meet demand in uncertain environments: supply chain management. Cappelli, a professor at the Wharton School, focuses on four practices in particular. First, companies should balance make-versus-buy decisions by using internal development programs to produce most--but not all--of the needed talent, filling in with outside hiring. Second, firms can reduce the risks in forecasting the demand for talent by sending smaller batches of candidates through more modularized training systems in much the same way manufacturers now employ components in just-in-time production lines. Third, companies can improve their returns on investment in development efforts by adopting novel cost-sharing programs. Fourth, they should seek to protect their investments by generating internal opportunities to encourage newly trained managers to stick with the firm. Taken together, these principles form the foundation for a new paradigm in talent management: a talent-on-demand system.

  18. Creating the Deep Space Environment for Testing the James Webb Space Telescope (JWST) at NASA Johnson Space Center's Chamber A

    Science.gov (United States)

    Homan, Jonathan L.; Cerimele, Mary P.; Montz, Michael E.; Bachtel, Russell; Speed, John; O'Rear, Patrick

    2013-01-01

    Chamber A is the largest thermal vacuum chamber at the Johnson Space Center and is one of the largest space environment chambers in the world. The chamber is 19.8 m (65 ft) in diameter and 36.6 m (120 ft) tall and is equipped with cryogenic liquid nitrogen panels (shrouds) and gaseous helium shrouds to create a simulated space environment. It was originally designed and built in the mid-1960s to test the Apollo Command and Service Module, and several manned tests were conducted on that spacecraft, contributing to the success of the program. The chamber has been used since that time to test spacecraft active thermal control systems, Shuttle DTO, DOD, and ESA hardware in simulated Low Earth Orbit (LEO) conditions. NASA is now moving from LEO towards exploration of locations with environments approaching those of deep space. Therefore, Chamber A has undergone major modifications to enable it to simulate these deeper space environments. Environmental requirements were driven, and modifications were funded, by the James Webb Space Telescope program, and this telescope, which will orbit Sun/Earth L2, will be the first test article to benefit from the chamber's new capabilities. To accommodate JWST, the Chamber A high vacuum system has been modernized, additional LN2 shrouds have been installed, the liquid nitrogen system has been modified to remove dependency on electrical power and increase its reliability, a new helium shroud/refrigeration system has been installed to create a colder, more stable, and uniform heat sink, and the controls have been updated to increase the level of automation and improve operator interfaces. Testing of these major modifications was conducted in August of 2012, and this initial test was very successful, with all major systems exceeding their performance requirements.
This paper will outline the changes in overall environmental requirements, discuss the technical design data that was used in the decisions leading to the extensive modifications

  19. Emergency First Response to a Crisis Event: A Multi-Agent Simulation Approach

    National Research Council Canada - National Science Library

    Roginski, Jonathan W

    2006-01-01

    ...This process led to the development of a multi-agent simulation methodology for emergency first response, specifically applied to analyze a notional vehicle bomb attack during a festival in the Baltimore Inner Harbor...

  20. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  1. Solar concentrator panel and gore testing in the JPL 25-foot space simulator

    Science.gov (United States)

    Dennison, E. W.; Argoud, M. J.

    1981-01-01

    The optical imaging characteristics of parabolic solar concentrator panels (or gores) have been measured using the optical beam of the JPL 25-foot space simulator. The simulator optical beam has been characterized, and the virtual source position and size have been determined. These data were used to define the optical test geometry. The point source image size and focal length have been determined for several panels. A flux distribution of a typical solar concentrator has been estimated from these data. Aperture photographs of the panels were used to determine the magnitude and characteristics of the reflecting surface errors. This measurement technique has proven to be highly successful at determining the optical characteristics of solar concentrator panels.

  2. Comparison of distal soft-tissue procedures combined with a distal chevron osteotomy for moderate to severe hallux valgus: first web-space versus transarticular approach.

    Science.gov (United States)

    Park, Yu-Bok; Lee, Keun-Bae; Kim, Sung-Kyu; Seon, Jong-Keun; Lee, Jun-Young

    2013-11-06

    There are two surgical approaches for distal soft-tissue procedures for the correction of hallux valgus: the dorsal first web-space approach and the medial transarticular approach. The purpose of this study was to compare the outcomes achieved after use of either of these approaches combined with a distal chevron osteotomy in patients with moderate to severe hallux valgus. One hundred and twenty-two female patients (122 feet) who underwent a distal chevron osteotomy as part of a distal soft-tissue procedure for the treatment of symptomatic unilateral moderate to severe hallux valgus constituted the study cohort. The 122 feet were randomly divided into two groups: namely, a dorsal first web-space approach (group D; sixty feet) and a medial transarticular approach (group M; sixty-two feet). The clinical and radiographic results of the two groups were compared at a mean follow-up time of thirty-eight months. The American Orthopaedic Foot & Ankle Society (AOFAS) hindfoot scale hallux metatarsophalangeal-interphalangeal scores improved from a mean and standard deviation of 55.5 ± 12.8 points preoperatively to 93.5 ± 6.3 points at the final follow-up in group D and from 54.9 ± 12.6 points preoperatively to 93.6 ± 6.2 points at the final follow-up in group M. The mean hallux valgus angle in groups D and M was reduced from 32.2° ± 6.3° and 33.1° ± 8.4° preoperatively to 10.5° ± 5.5° and 9.9° ± 5.5°, respectively, at the time of final follow-up. The mean first intermetatarsal angle in groups D and M was reduced from 15.0° ± 2.8° and 15.3° ± 2.7° preoperatively to 6.5° ± 2.2° and 6.3° ± 2.4°, respectively, at the final follow-up. The clinical and radiographic outcomes were not significantly different between the two groups. The final clinical and radiographic outcomes between the two approaches for distal soft-tissue procedures were comparable and equally successful. Accordingly, the results of this study suggest that the medial transarticular

  3. Thomas Piketty: Capital in the Twenty-First Century (Le Capital au XXIe siècle; English translation by Arthur Goldhammer)

    Directory of Open Access Journals (Sweden)

    Gylfi Magnússon

    2014-06-01

    Full Text Available In the reviewer's assessment, the following points are made, among others: The book is not intended to be the last word on the subject but rather a foundation for further discussion and research. In that it has succeeded. Capital in the Twenty-First Century is a work that has already sparked extensive debate and will doubtless be discussed for years to come. Indeed, it is close to required reading for anyone intending to write about macroeconomics and the role of the public sector, however much they may agree or disagree with the author.

  4. The Polar WRF Downscaled Historical and Projected Twenty-First Century Climate for the Coast and Foothills of Arctic Alaska

    Directory of Open Access Journals (Sweden)

    Lei Cai

    2018-01-01

    Full Text Available Climate change is most pronounced in the northern high-latitude region. Yet climate observations are unable to fully capture regional-scale dynamics due to the sparse weather station coverage, which limits our ability to make reliable climate-based assessments. A set of simulated data products was therefore developed for the North Slope of Alaska through a dynamical downscaling approach. The polar-optimized Weather Research and Forecasting (Polar WRF) model was forced by three sources: the ERA-Interim reanalysis data (for 1979–2014), the Community Earth System Model 1.0 (CESM1.0) historical simulation (for 1950–2005), and the CESM1.0 projected simulations (for 2006–2100) under two Representative Concentration Pathway (RCP4.5 and RCP8.5) scenarios. Climatic variables were produced at a 10-km grid spacing and a 3-h interval. The ERA-Interim-forced WRF (ERA-WRF) proves the value of dynamical downscaling, yielding more realistic topographically induced precipitation and air temperature as well as correcting underestimations in observed precipitation. In summary, ERA-WRF presents dry and cold biases to the north of the Brooks Range, while the CESM-forced WRF (CESM-WRF) holds wet and warm biases in its historical period. A linear scaling method allowed for an adjustment of the biases while keeping the majority of the variability and extreme values of modeled precipitation and air temperature. CESM-WRF under the RCP4.5 scenario projects a smaller increase in precipitation and air temperature than observed in the historical CESM-WRF product, while CESM-WRF under the RCP8.5 scenario shows larger changes. The fine spatial and temporal resolution, long temporal coverage, and multi-scenario projections jointly make the dataset appropriate for addressing a myriad of physical and biological changes occurring on the North Slope of Alaska.
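    The linear scaling step mentioned above, removing the mean bias of the historical run relative to the reference while preserving variability and extremes, can be sketched as follows. This is a generic illustration, not the authors' code; the additive form for temperature and multiplicative form for precipitation are the conventional choices assumed here.

```python
import numpy as np

def linear_scaling_additive(model_hist, model_target, obs_ref):
    """Additive linear scaling (conventional for air temperature):
    shift the target series by the mean bias of the historical run
    against the reference, leaving its variability untouched."""
    bias = np.mean(obs_ref) - np.mean(model_hist)
    return np.asarray(model_target) + bias

def linear_scaling_multiplicative(model_hist, model_target, obs_ref):
    """Multiplicative linear scaling (conventional for precipitation):
    rescale so the historical mean matches the reference mean, which
    avoids producing negative precipitation."""
    factor = np.mean(obs_ref) / np.mean(model_hist)
    return np.asarray(model_target) * factor
```

    In practice the correction is usually computed per calendar month, so that the seasonal cycle of the bias is removed rather than a single annual offset.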

  5. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  6. Observation and simulation of AGW in Space

    Science.gov (United States)

    Kunitsyn, Vyacheslav; Kholodov, Alexander; Andreeva, Elena; Nesterov, Ivan; Padokhin, Artem; Vorontsov, Artem

    2014-05-01

    Examples are presented of satellite observations and imaging of AGWs and related phenomena in space, namely travelling ionospheric disturbances (TIDs). The structure of AGW perturbations was reconstructed by satellite radio tomography (RT) based on the signals of Global Navigation Satellite Systems (GNSS). The experiments use different GNSS, both low-orbiting (Russian Tsikada and American Transit) and high-orbiting (GPS, GLONASS, Galileo, Beidou). Examples of RT imaging of TIDs and AGWs from anthropogenic sources such as ground explosions, rocket launches, and heating of the ionosphere by high-power radio waves are presented. In the latter case, the corresponding AGWs and TIDs were generated in response to the modulation of the power of the heating wave. Natural AGW-like wave disturbances are frequently observed in the atmosphere and ionosphere in the form of variations in density and electron concentration. These phenomena are caused by the influence of the near-space environment, the atmosphere, and surface phenomena including long-period vibrations of the Earth's surface, earthquakes, explosions, temperature heating, seiches, tsunami waves, etc. Examples of experimental RT reconstructions of wave disturbances associated with earthquakes and tsunami waves are presented, and RT images of TIDs caused by variations in corpuscular ionization are demonstrated. The results of numerical modeling of AGW generation by some surface and volume sources are discussed. The millihertz AGWs generated by these sources induce perturbations with a typical scale of a few hundred kilometers at the heights of the middle atmosphere and ionosphere. The numerical modeling is based on the solution of the equations of geophysical hydrodynamics. The results of the numerical simulations agree with the observations. The authors acknowledge the support of the Russian Foundation for Basic Research (grants 14-05-00855 and 13-05-01122), grant of the President of Russian Federation MK-2670

  7. Virtual Libraries and Education in Virtual Worlds: Twenty-First Century Library Services

    Science.gov (United States)

    Bell, Lori; Lindbloom, Mary-Carol; Peters, Tom; Pope, Kitty

    2008-01-01

    As the use of the Internet and time spent on the Internet by individuals grows, and the use of virtual worlds like Active Worlds and Second Life increases, the library needs to have an interactive place and role in these worlds as well as a bricks and mortar space. This article provides an overview of what some libraries are doing in these worlds,…

  8. GMC COLLISIONS AS TRIGGERS OF STAR FORMATION. I. PARAMETER SPACE EXPLORATION WITH 2D SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Benjamin [Department of Physics, University of Florida, Gainesville, FL 32611 (United States); Loo, Sven Van [School of Physics and Astronomy, University of Leeds, Leeds LS2 9JT (United Kingdom); Tan, Jonathan C. [Departments of Astronomy and Physics, University of Florida, Gainesville, FL 32611 (United States); Bruderer, Simon, E-mail: benwu@phys.ufl.edu [Max Planck Institute for Extraterrestrial Physics, Giessenbachstrasse 1, D-85748 Garching (Germany)

    2015-09-20

    We utilize magnetohydrodynamic (MHD) simulations to develop a numerical model for giant molecular cloud (GMC)–GMC collisions between nearly magnetically critical clouds. The goal is to determine if, and under what circumstances, cloud collisions can cause pre-existing magnetically subcritical clumps to become supercritical and undergo gravitational collapse. We first develop and implement new photodissociation region based heating and cooling functions that span the atomic to molecular transition, creating a multiphase ISM and allowing modeling of non-equilibrium temperature structures. Then in 2D and with ideal MHD, we explore a wide parameter space of magnetic field strength, magnetic field geometry, collision velocity, and impact parameter and compare isolated versus colliding clouds. We find factors of ∼2–3 increase in mean clump density from typical collisions, with strong dependence on collision velocity and magnetic field strength, but ultimately limited by flux-freezing in 2D geometries. For geometries enabling flow along magnetic field lines, greater degrees of collapse are seen. We discuss observational diagnostics of cloud collisions, focussing on ¹³CO(J = 2–1), ¹³CO(J = 3–2), and ¹²CO(J = 8–7) integrated intensity maps and spectra, which we synthesize from our simulation outputs. We find that the ratio of J = 8–7 to lower-J emission is a powerful diagnostic probe of GMC collisions.

  9. THE FIRST GALAXIES: ASSEMBLY UNDER RADIATIVE FEEDBACK FROM THE FIRST STARS

    International Nuclear Information System (INIS)

    Pawlik, Andreas H.; Milosavljević, Miloš; Bromm, Volker

    2013-01-01

    driven by radiative feedback from sources both internal and external to these galaxies. Our estimates of the observability of the first galaxies show that dwarf galaxies such as simulated here will be among the faintest galaxies the upcoming James Webb Space Telescope will detect. Our conclusions regarding the structure and observability of the first galaxies are subject to our neglect of feedback from supernovae and chemical enrichment as well as to statistical uncertainties implied by the limited number of galaxies in our simulations.

  10. RESULTS OF THE FIRST RUN OF THE NASA SPACE RADIATION LABORATORY AT BNL

    International Nuclear Information System (INIS)

    BROWN, K.A.; AHRENS, L.; BRENNAN, J.M.

    2004-01-01

    The NASA Space Radiation Laboratory (NSRL) was constructed in collaboration with NASA for the purpose of performing radiation effect studies for the NASA space program. The results of commissioning of this new facility were reported in [1]. In this report we will describe the results of the first run. The NSRL is capable of making use of heavy ions in the range of 0.05 to 3 GeV/n slow-extracted from BNL's AGS Booster. Many modes of operation were explored during the first run, demonstrating all the capabilities designed into the system. Heavy ion intensities from 100 particles per pulse up to 12 × 10⁹ particles per pulse were delivered to a large variety of experiments, providing a dose range up to 70 Gy/min over a 5 × 5 cm² area. Results presented will include those related to the production of beams that are highly uniform in both the transverse and longitudinal planes of motion [2

  11. Simulated non-contact atomic force microscopy for GaAs surfaces based on real-space pseudopotentials

    International Nuclear Information System (INIS)

    Kim, Minjung; Chelikowsky, James R.

    2014-01-01

    We simulate non-contact atomic force microscopy (AFM) of a GaAs(1 1 0) surface using a real-space ab initio pseudopotential method. While most ab initio simulations include an explicit model for the AFM tip, our method does not introduce the tip modeling step. This approach results in a considerable reduction of computational work and also provides complete AFM images, which can be directly compared to experiment. By analyzing tip-surface interaction forces in both our results and previous ab initio simulations, we find that our method provides a force profile very similar to the pure Si tip results. We conclude that our method works well for systems in which the tip is not chemically active.

  12. Simulation of the space-time evolution of color-flux tubes (guidelines to the TERMITE program)

    International Nuclear Information System (INIS)

    Dyrek, A.

    1990-08-01

    We give the description of the computer program which simulates boost-invariant evolution of color-flux tubes in high-energy processes. The program provides a graphic demonstration of space-time trajectories of created particles and can also be used as Monte-Carlo generator of events. (author)

  13. The not-so-harmless maxillary primary first molar extraction.

    Science.gov (United States)

    Northway, W M

    2000-12-01

    Premature loss of primary molars has been associated with space loss and eruptive difficulties, especially when the loss occurs to the primary second molars and when it occurs early. This has not been thought to be the case for primary first molars. The author revisited 13 cases from an earlier study on the effects of premature loss of maxillary primary molars. These longitudinal cases were scrutinized, using serial panoramic radiographs, to explain the irregular response in terms of dental migration. The author presents two case reports. In the earlier study, the author used digitized study casts and the concept of D + E space--the space occupied by the primary first and second molars--to describe the dental migration that occurred after premature tooth loss. Using analysis of variance on data generated using an instrument capable of measuring in tenths of millimeters, the author produced findings regarding the amount of space loss, rate of space loss, effect of age at loss, amount of space regained at the time of replacement by the permanent tooth and effect on Angle's classification. Finally, the author created a simulation describing directional change; this revealed that the maxillary primary first molar loss resulted in a mesial displacement of the permanent canine during eruption. When the maxillary primary first molar is lost prematurely, the first premolar erupts in a more mesial direction than normal, as a result of the mesial incline of the primary second molar, and consumes the space of the permanent canine, which becomes blocked out. Rather than use a space maintainer after the premature loss of the maxillary primary first molar, the author suggests, clinicians can choose from a number of other options for preventing the first premolar from erupting too far in a mesial direction.

  14. Amateur Radio on the International Space Station (ARISS) - the First Educational Outreach Program on ISS

    Science.gov (United States)

    Conley, C. L.; Bauer, F. H.; Brown, D.; White, R.

    2002-01-01

    More than 40 missions over five years will be required to assemble the International Space Station in orbit. The astronauts and cosmonauts will work hard on these missions, but they plan to take some time off for educational activities with schools. Amateur Radio on the International Space Station represents the first Educational Outreach program that is flying on ISS. NASA's Division of Education is a major supporter and sponsor of this student outreach activity on the International Space Station. This meets NASA's educational mission objective: "To inspire the next generation of explorers...as only NASA can." As the International Space Station takes its place in the heavens, the amateur radio community is doing its part by helping to enrich the experience of those visiting and living on the station as well as the students on Earth. Through ARISS (Amateur Radio on the International Space Station), students on Earth have a once in a lifetime opportunity--to talk to the crew on-board ISS. Using amateur radio equipment set up in their classroom, students get a first-hand feel of what it is like to live and work in space. Each school gets a 10 minute question and answer interview with the on-orbit crew using a ground station located in their classroom or through a remote ground station. The ARISS opportunity has proven itself as a tremendous educational boon to teachers and students. Through ARISS, students learn about orbit dynamics, Doppler shift, radio communications, and working with the press. Since its first flight in 1983, amateur radio has flown on more than two-dozen space shuttle missions. Dozens of astronauts have used the predecessor program called SAREX (The Space Shuttle Amateur Radio Experiment) to talk to thousands of kids in school and to their families on Earth while they were in orbit. 
The primary goals of the ARISS program are fourfold: 1) educational outreach through crew contacts with schools, 2) random contacts with the amateur radio public, 3

  15. State-space relay modeling and simulation using the Electromagnetic Transients Program and its Transient Analysis of Control Systems capability

    International Nuclear Information System (INIS)

    Domijan, A.D. Jr.; Emami, M.V.

    1990-01-01

    This paper reports on a simulation of a mho distance relay developed to study the effect of its operation under various system conditions. The simulation is accomplished using a state-space approach and a modeling technique based on the ElectroMagnetic Transients Program's Transient Analysis of Control Systems capability. Furthermore, the simulation results are compared, as a control, with those obtained in another independent study in order to validate them. A data code for the practical utilization of this simulation is given.

  16. Base pressure and heat transfer tests of the 0.0225-scale space shuttle plume simulation model (19-OTS) in yawed flight conditions in the NASA-Lewis 10x10-foot supersonic wind tunnel (test IH83)

    Science.gov (United States)

    Foust, J. W.

    1979-01-01

    Wind tunnel tests were performed to determine pressures, heat transfer rates, and gas recovery temperatures in the base region of a rocket-firing model of the space shuttle integrated vehicle during simulated yawed flight conditions. First- and second-stage flight of the space shuttle was simulated by firing the main engines in conjunction with the SRB rocket motors, or only the SSMEs, into the continuous tunnel airstream. For the correct rocket plume environment, the simulated altitude pressures were halved to maintain the rocket chamber/altitude pressure ratio. Tunnel freestream Mach numbers from 2.2 to 3.5 were simulated over an altitude range of 60 to 130 thousand feet with varying angle of attack, yaw angle, nozzle gimbal angle, and SRB chamber pressure. Gas recovery temperature data derived from nine gas temperature probe runs are presented. The model configuration, instrumentation, test procedures, and data reduction are described.

  17. The sea ice mass budget of the Arctic and its future change as simulated by coupled climate models

    Energy Technology Data Exchange (ETDEWEB)

    Holland, Marika M. [National Center for Atmospheric Research, Boulder, CO (United States); Serreze, Mark C.; Stroeve, Julienne [University of Colorado, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, Boulder, CO (United States)

    2010-02-15

    Arctic sea ice mass budgets for the twentieth century and projected changes through the twenty-first century are assessed from 14 coupled global climate models. Large inter-model scatter in contemporary mass budgets is strongly related to variations in absorbed solar radiation, due in large part to differences in the surface albedo simulation. Over the twenty-first century, all models simulate a decrease in ice volume resulting from increased annual net melt (melt minus growth), partially compensated by reduced transport to lower latitudes. Despite this general agreement, the models vary considerably regarding the magnitude of ice volume loss and the relative roles of changing melt and growth in driving it. Projected changes in sea ice mass budgets depend in part on the initial (mid twentieth century) ice conditions; models with thicker initial ice generally exhibit larger volume losses. Pointing to the importance of evolving surface albedo and cloud properties, inter-model scatter in changing net ice melt is significantly related to changes in downwelling longwave and absorbed shortwave radiation. These factors, along with the simulated mean and spatial distribution of ice thickness, contribute to a large inter-model scatter in the projected onset of seasonally ice-free conditions. (orig.)
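    The budget logic in the abstract (volume tendency equals growth minus melt minus transport to lower latitudes) can be made concrete with a toy integration. This is an illustration only; the quantities and units are placeholders, not output from any of the 14 models.

```python
def project_volume(v0, growth, melt, export):
    """Integrate a toy annual sea-ice volume budget (arbitrary units).

    Each argument after v0 is a per-year sequence. Increased net melt
    (melt minus growth) is partially compensated by reduced export, as
    in the projections; volume is floored at zero.
    """
    v = v0
    series = []
    for g, m, e in zip(growth, melt, export):
        v = max(v + g - m - e, 0.0)
        series.append(v)
    return series
```

    With melt increasing year over year and export weakening, the volume still declines, mirroring the qualitative agreement among the models despite their scatter in magnitude.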

  18. Joint Estimation and Decoding of Space-Time Trellis Codes

    Directory of Open Access Journals (Sweden)

    Zhang Jianqiu

    2002-01-01

    Full Text Available We explore the possibility of using an emerging tool in statistical signal processing, sequential importance sampling (SIS, for joint estimation and decoding of space-time trellis codes (STTC. First, we provide background on SIS, and then we discuss its application to space-time trellis code (STTC systems. It is shown through simulations that SIS is suitable for joint estimation and decoding of STTC with time-varying flat-fading channels when phase ambiguity is avoided. We used a design criterion for STTCs and temporally correlated channels that combats phase ambiguity without pilot signaling. We have shown by simulations that the design is valid.
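    The generic mechanics of sequential importance sampling (propagate particles through the transition prior, accumulate observation log-likelihood weights, estimate by the weighted mean) can be sketched for a scalar toy model. This is not the paper's STTC receiver: the Gaussian random-walk state and noise parameters below are assumptions of the illustration, and basic SIS without resampling is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def sis_filter(observations, n_particles=500, sigma_x=0.1, sigma_y=0.5):
    """Plain sequential importance sampling for a toy scalar model:
    a Gaussian random-walk state observed in additive Gaussian noise."""
    particles = rng.normal(0.0, 1.0, n_particles)
    log_w = np.zeros(n_particles)
    estimates = []
    for y in observations:
        # propagate each particle through the state transition prior
        particles = particles + rng.normal(0.0, sigma_x, n_particles)
        # accumulate the Gaussian observation log-likelihood
        log_w += -0.5 * ((y - particles) / sigma_y) ** 2
        # normalize weights (subtract the max for numerical stability)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
    return estimates
```

    In the STTC setting, the scalar state is replaced by the time-varying flat-fading channel coefficients and the trellis path, and the weighted particles jointly estimate both.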

  19. The First 3D Simulations of Carbon Burning in a Massive Star

    Science.gov (United States)

    Cristini, A.; Meakin, C.; Hirschi, R.; Arnett, D.; Georgy, C.; Viallet, M.

    2017-11-01

    We present the first detailed three-dimensional hydrodynamic implicit large eddy simulations of turbulent convection for carbon burning. The simulations start with an initial radial profile mapped from a carbon burning shell within a 15 M⊙ stellar evolution model. We considered 4 resolutions from 128³ to 1024³ zones. These simulations confirm that convective boundary mixing (CBM) occurs via turbulent entrainment as in the case of oxygen burning. The expansion of the boundary into the surrounding stable region and the entrainment rate are smaller at the bottom boundary because it is stiffer than the upper boundary. The results of this and similar studies call for improved CBM prescriptions in 1D stellar evolution models.

  20. MCNP Simulations of Measurement of Insulation Compaction in the Cryogenic Rocket Fuel Tanks at Kennedy Space Center by Fast/Thermal Neutron Techniques

    Science.gov (United States)

    Livingston, R. A.; Schweitzer, J. S.; Parsons, A. M.; Arens, E. E.

    2010-01-01

    MCNP simulations have been run to evaluate the feasibility of using a combination of fast and thermal neutrons as a nondestructive method to measure the compaction of the perlite insulation in the liquid hydrogen and oxygen cryogenic storage tanks at John F. Kennedy Space Center (KSC). Perlite is a feldspathic volcanic rock made up of the major elements Si, Al, Na, K, and O, along with some water. When heated, it expands from four to twenty times its original volume, which makes it very useful for thermal insulation. The cryogenic tanks at Kennedy Space Center are spherical with outer diameters of 69-70 feet and lined with a layer of expanded perlite with thicknesses on the order of 120 cm. There is evidence that some of the perlite has compacted over time since the tanks were built in 1965, affecting the thermal properties and possibly also the structural integrity of the tanks. With commercially available portable neutron generators it is possible to produce simultaneously fluxes of neutrons in two energy ranges: fast (14 MeV) and thermal (25 meV). The two energy ranges produce complementary information. Fast neutrons produce gamma rays by inelastic scattering, which is sensitive to Fe and O. Thermal neutrons produce gamma rays by prompt gamma neutron activation (PGNA), which is sensitive to Si, Al, Na, K, and H. The compaction of the perlite can be measured by the change in gamma ray signal strength, which is proportional to the atomic number densities of the constituent elements. The MCNP simulations were made to determine the magnitude of this change. The tank wall was approximated by a 1-dimensional slab geometry with an 11/16" outer carbon steel wall, an inner stainless steel wall, and a 120 cm thick perlite zone. Runs were made for cases with expanded perlite, compacted perlite, or various void fractions. Runs were also made to simulate the effect of adding a moderator. Tallies were made for decay-time analysis from t = 0 to 10 ms; total detected gamma
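    Since the detected gamma line strength is stated to be proportional to the atomic number densities of the constituent elements, the compaction estimate itself reduces to a ratio against an expanded-perlite baseline. The sketch below illustrates only that proportionality; it ignores the attenuation and geometry effects that the MCNP runs are there to quantify, and the 64 kg/m³ expanded-perlite bulk density is a nominal value assumed for illustration.

```python
def compaction_ratio(signal_measured, signal_expanded_baseline):
    """Density (compaction) ratio implied by a PGNA line strength,
    assuming detected intensity is proportional to the atomic number
    density of the emitting element."""
    return signal_measured / signal_expanded_baseline

def compacted_density(ratio, rho_expanded=64.0):
    """Bulk density implied by the compaction ratio; rho_expanded
    (kg/m^3) is a nominal expanded-perlite value assumed here."""
    return ratio * rho_expanded
```

    A measured Si line 1.5 times the expanded-perlite baseline would thus imply perlite compacted to 1.5 times its as-installed density, before attenuation corrections.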

  1. Dynamic load synthesis for shock numerical simulation in space structure design

    Science.gov (United States)

    Monti, Riccardo; Gasbarri, Paolo

    2017-08-01

    Pyroshock loads are, from a mechanical point of view, the most stressing environments that a space equipment experiences during its operating life. In general, the mechanical designer considers the pyroshock analysis a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. Taking the previous considerations into account, some preliminary information about the correctness of the design can be obtained by performing "ad hoc" numerical simulations, for example via commercial finite element software (i.e. MSC Nastran). Usually these numerical tools face the shock solution in two ways: 1) a direct mode, by using a time-dependent enforcement and evaluating the time response and space response as well as the internal forces; 2) a modal-basis approach, by considering a frequency-dependent load and evaluating internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement based on deterministic and/or genetic-algorithm optimisers. In particular, starting from a specified spectrum in terms of SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, will be obtained to force the equipment by simulating the shock event. The synthesis time and the interface with standard numerical codes will be two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology will be presented to ensure that the identified time-dependent loads fully match the specified spectrum.
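    An SRS-matching synthesis loop of the kind described above needs two ingredients: a parametric time history and an SRS evaluator to compare against the target spectrum. A minimal sketch of both is given below; the decayed-sine basis, Q = 10, and the simple semi-implicit Euler integrator are assumptions of this illustration, and the deterministic/genetic optimiser that would iterate on the amplitudes until the SRS matches the specification is omitted.

```python
import numpy as np

def decayed_sine_synthesis(freqs, amps, decay=0.05, fs=20000.0, duration=0.05):
    """Candidate shock time history: a sum of exponentially decayed sinusoids."""
    t = np.arange(0.0, duration, 1.0 / fs)
    accel = np.zeros_like(t)
    for f, a in zip(freqs, amps):
        accel += a * np.exp(-decay * 2.0 * np.pi * f * t) * np.sin(2.0 * np.pi * f * t)
    return t, accel

def srs(accel, fs, natural_freqs, q=10.0):
    """Maximax absolute-acceleration SRS by direct integration of a
    base-excited SDOF oscillator at each natural frequency
    (semi-implicit Euler on the relative coordinate)."""
    dt = 1.0 / fs
    zeta = 1.0 / (2.0 * q)
    peaks = []
    for fn in natural_freqs:
        wn = 2.0 * np.pi * fn
        x = v = 0.0  # relative displacement and velocity
        peak = 0.0
        for a in accel:
            # x'' + 2*zeta*wn*x' + wn^2*x = -a_base
            acc_rel = -2.0 * zeta * wn * v - wn * wn * x - a
            v += acc_rel * dt
            x += v * dt
            abs_acc = -(2.0 * zeta * wn * v + wn * wn * x)
            peak = max(peak, abs(abs_acc))
        peaks.append(peak)
    return peaks
```

    A single 1000 Hz decayed tone, for instance, produces an SRS that peaks near the oscillator tuned to 1000 Hz and falls off at natural frequencies well below or above it, which is what the optimiser exploits when tuning amplitudes band by band.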

  2. Construction material processed using lunar simulant in various environments

    Science.gov (United States)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  3. Numerical simulation and experimental research for the natural convection in an annular space in LMFBR

    International Nuclear Information System (INIS)

    Wang Zhou; Luo Rui; Yang Xianyong; Liang Taofeng

    1999-01-01

    In a pool-type fast reactor, the roof structure is penetrated by a number of pump and heat exchanger units, forming annular spaces of various sizes. Natural convection of argon gas occurs in the pool sky and in the small annular gaps between those components and the roof containment due to thermosiphonic effects. The natural convection is studied experimentally and numerically to predict the temperature distributions inside the annular space and its surrounding structure. Numerical simulation is performed by using the LVEL turbulence model and extending the computational domain to the entire pool sky. The predicted results are in fair agreement with the experimental data. In comparison with the commonly used k-ε model, the LVEL model has better accuracy for the turbulent flow in a gap space

  4. Free-Space Squeezing Assists Perfectly Matched Layers in Simulations on a Tight Domain

    DEFF Research Database (Denmark)

    Shyroki, Dzmitry; Ivinskaya, Aliaksandra; Lavrinenko, Andrei

    2010-01-01

    outside the object, as in simulations of eigenmodes or scattering at a wavelength comparable to or larger than the object itself. Here, we show how, in addition to applying the perfectly matched layers (PMLs), outer free space can be squeezed to avoid cutting the evanescent field tails by the PMLs...... or computational domain borders. Adding the squeeze-transform layers to the standard PMLs requires no changes to the finite-difference algorithms....

  5. Amateur Radio on the International Space Station - the First Operational Payload on the ISS

    Science.gov (United States)

    Bauer, F. H.; McFadin, L.; Steiner, M.; Conley, C. L.

    2002-01-01

    As astronauts and cosmonauts have adapted to life on the International Space Station (ISS), they have found Amateur Radio and its connection to life on Earth to be a constant companion and a substantial psychological boost. Since its first use in November 2000, the first five expedition crews have utilized the amateur radio station in the FGB to talk to thousands of students in schools, to their families on Earth, and to amateur radio operators around the world. Early in the development of ISS, an international organization called ARISS (Amateur Radio on the International Space Station) was formed to coordinate the construction and operation of amateur radio (ham radio) equipment on ISS. ARISS represents a melding of the volunteer teams that have pioneered the development and use of amateur radio equipment on human spaceflight vehicles. The Shuttle/Space Amateur Radio Experiment (SAREX) team enabled Owen Garriott to become the first astronaut ham to use amateur radio from space in 1983. Since then, amateur radio teams in the U.S. (SAREX), Germany (SAFEX), and Russia (Mirex) have led the development and operation of amateur radio equipment on board NASA's Space Shuttle, Russia's Mir space station, and the International Space Station. The primary goals of the ARISS program are fourfold: 1) educational outreach through crew contacts with schools, 2) random contacts with the Amateur Radio public, 3) scheduled contacts with the astronauts' friends and families, and 4) ISS-based communications experimentation. To date, over 65 schools have been selected from around the world for scheduled contacts with the orbiting ISS crew. Ten or more students at each school ask the astronauts questions, and the nature of these contacts embodies the primary goal of the ARISS program: to excite students' interest in science, technology, and amateur radio. The ARISS team has developed various hardware elements for the ISS amateur radio station. These hardware elements have flown to ISS

  6. Simulation of first SERENA KROTOS steam explosion experiment

    International Nuclear Information System (INIS)

    Leskovar, Matjaz; Ursic, Mitja

    2009-01-01

    A steam explosion may occur when, during a severe reactor accident, the molten core comes into contact with the coolant water. A strong enough steam explosion in a nuclear power plant could jeopardize the containment integrity and so lead to a direct release of radioactive material to the environment. To resolve the open issues in steam explosion understanding and modeling, the OECD program SERENA Phase 2 was launched at the end of 2007, focusing on nuclear applications. SERENA comprises an experimental program, carried out in the complementary KROTOS and TROI corium facilities, accompanied by a comprehensive analytical program that also foresees pre- and post-test calculations. In the paper the sensitivity post-test calculations of the first SERENA KROTOS experiment KS-1, performed with the code MC3D, are presented and discussed. Since the results of the SERENA tests are restricted to SERENA members, only the various calculation results are given, without comparison to the experimental measurements. Various premixing and explosion simulations were performed on a coarse and a fine numerical mesh, applying two different jet breakup models (global, local) and varying the minimum bubble diameter in the explosion simulations (0.5 mm, 5 mm). The simulations revealed that all varied parameters have a significant influence on the calculation results, as expected, since the fuel-coolant interaction process is a highly complex phenomenon. The results of the various calculations are presented in comparison and the observed differences are discussed and explained. (author)

  7. Numerical simulation of collision-free plasma using Vlasov hybrid simulation

    International Nuclear Information System (INIS)

    Nunn, D.

    1990-01-01

    A novel scheme for the numerical simulation of wave-particle interactions in space plasmas has been developed. The method, termed VHS or Vlasov Hybrid Simulation, is applicable to hot collision-free plasmas in which the unperturbed distribution function is smooth and free of delta function singularities. The particle population is described as a continuous Vlasov fluid in phase space, granularity and collisional effects being ignored. In traditional PIC/CIC codes the charge/current due to each simulation particle is assigned to a fixed spatial grid. In the VHS method the simulation particles sample the Vlasov fluid and provide information about the value of the distribution function F(r,v) at random points in phase space. Values of F are interpolated from the simulation particles onto a fixed grid in velocity/position or phase space. With the distribution function defined on a phase space grid, the plasma charge/current field is quickly calculated. The simulation particles serve only to provide information, and thus the particle population may be dynamic. Particles no longer resonant with the wavefield may be discarded from the simulation, and new particles may be inserted into the Vlasov fluid where required
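
    The interpolation step described above can be sketched in a few lines. The following is an illustrative 1D1V sketch, not the paper's code: grid sizes, the Maxwellian test distribution, and all function names are assumptions. Simulation particles carrying sampled F values are deposited onto a fixed phase-space grid with cloud-in-cell weights, and the charge density then follows by integrating F over velocity.

```python
import numpy as np

def deposit_F(x, v, F, x_grid, v_grid):
    """Cloud-in-cell interpolation of sampled F values onto a phase-space grid.

    Each particle spreads its F value, with bilinear weights, over the four
    surrounding grid points; weights are accumulated separately so each grid
    value is a weighted average of nearby samples, not a sum of charges.
    """
    dx = x_grid[1] - x_grid[0]
    dv = v_grid[1] - v_grid[0]
    Fg = np.zeros((len(x_grid), len(v_grid)))
    Wg = np.zeros_like(Fg)
    for xp, vp, fp in zip(x, v, F):
        i = int((xp - x_grid[0]) // dx)
        j = int((vp - v_grid[0]) // dv)
        if not (0 <= i < len(x_grid) - 1 and 0 <= j < len(v_grid) - 1):
            continue  # sample fell outside the grid
        ax = (xp - x_grid[i]) / dx
        av = (vp - v_grid[j]) / dv
        for di, wi in ((0, 1.0 - ax), (1, ax)):
            for dj, wj in ((0, 1.0 - av), (1, av)):
                Fg[i + di, j + dj] += wi * wj * fp
                Wg[i + di, j + dj] += wi * wj
    mask = Wg > 0
    Fg[mask] /= Wg[mask]  # weighted average of the sampled F values
    return Fg

# Sample a Maxwellian F(x, v) = exp(-v^2/2) at random phase-space points
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 5000)
v = rng.uniform(-3.0, 3.0, 5000)
F = np.exp(-v**2 / 2)
x_grid = np.linspace(0.0, 1.0, 17)
v_grid = np.linspace(-3.0, 3.0, 33)
Fg = deposit_F(x, v, F, x_grid, v_grid)
# Charge density: rectangle-rule integral of F over velocity at each x
rho = Fg.sum(axis=1) * (v_grid[1] - v_grid[0])
```

Once F lives on the grid, the particles have done their only job (carrying information), which is why the VHS particle population can be freely pruned or replenished.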

  8. Orthodontic space closure after first molar extraction without skeletal anchorage.

    Science.gov (United States)

    Jacobs, Collin; Jacobs-Müller, Claudia; Luley, Carolin; Erbe, Christina; Wehrbein, Heiner

    2011-03-01

    The aim of the study was to analyze the effects and side-effects of mesialization of second molars after extraction of the first permanent molars, using the anterior dentition/premolars (PM) as an anchorage unit. A total of 35 patients who had undergone unilateral or bilateral first permanent molar extraction in the upper or lower arch due to carious lesions were examined retrospectively. Space closure was carried out in all cases through mesialization of the second molar using an elastic chain fixed to an edgewise stainless steel archwire and tying the anterior dentition/PM together with a continuous laceback ligature. Tooth movement was assessed from lateral cephalograms, orthopantomograms (OPGs) and images of the patient's study casts taken before and after the end of therapy. Space closure after first molar extraction by mesialization of the second molars without skeletal anchorage was largely achieved by bodily forward movement of the teeth, including a small tipping or tooth-uprighting component when molars were already mesially inclined. Unilateral and bilateral mesialization of the second molars led to retrusion in the maxilla and mandible [∆incl. = -3.6° (max., bil.), ∆incl. = -4.2° (mand., bil.)] and to translational retraction [∆s = -2.3 mm (max., bil.), ∆s = -1.6 mm (mand., bil.)] of the incisors. Examination of the soft tissues revealed an increased posterior displacement of the upper and lower lips relative to the esthetic line [∆s = -2.8 mm (max., bil.), ∆s = -2.2 mm (mand., bil.)]. In cases of unilateral mesialization, less than 50% of the patients had a slight midline deviation in the mandible towards the extraction side. Side-effects during mesialization of the second molars without skeletal anchorage in the anterior dentition/PM primarily affected the incisors integrated into the anterior anchorage unit. These side-effects resulted in posterior displacement of the soft tissues, including a change in profile. This must be

  9. Makerspace in STEM for Girls: A Physical Space to Develop Twenty-First-Century Skills

    Science.gov (United States)

    Sheffield, Rachel; Koul, Rekha; Blackley, Susan; Maynard, Nicoleta

    2017-01-01

    "Makerspace" has been lauded as a new way forward to create communities, empower students and bring together enthusiasts of all ages and skill levels "to tinker" and create. Makerspace education has been touted as having the potential to empower young people to become agents of change in their communities. This paper examines…

  10. Vapor Space Corrosion Testing Simulating The Environment Of Hanford Double Shell Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Gray, J. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Garcia-Diaz, B. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Murphy, T. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hicks, K. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-01-30

    As part of an integrated program to better understand corrosion in the high level waste tanks, Hanford has been investigating corrosion at the liquid/air interface (LAI) and at higher areas in the tank vapor space. The current research evaluated localized corrosion in the vapor space over Hanford double shell tank simulants to assess the impact of ammonia and new minimum nitrite concentration limits, which are part of the broader corrosion chemistry limits. The findings from this study showed that the presence of ammonia gas (550 ppm) in the vapor space is sufficient to reduce corrosion over the short term (i.e., four months) for a Hanford waste chemistry (SY102 High Nitrate). These findings are in agreement with previous studies at both Hanford and SRS, which showed ammonia gas in the vapor space to be inhibitive. The presence of ammonia in the electrochemical test solution, however, was insufficient to inhibit pitting corrosion. The effect of the ammonia appears to be a function of the waste chemistry and may be more significant in waste with low nitrite concentrations. Since high levels of ammonia were found beneficial in previous studies, additional testing is recommended to assess the minimum concentration necessary for protection of carbon steel. The new minimum R value of 0.15 was found to be insufficient to prevent pitting corrosion in the vapor space. The pitting that occurred, however, did not progress over the four-month test. Pits appeared to stop growing, which indicates that pitting might not progress through the wall.

  11. MCNP6 simulation of reactions of interest to FRIB, medical, and space applications

    International Nuclear Information System (INIS)

    Mashnik, Stepan G.

    2015-01-01

    The latest production version of the Los Alamos Monte Carlo N-Particle transport code MCNP6 has been used to simulate a variety of particle-nucleus and nucleus-nucleus reactions of academic and applied interest to research subjects at the Facility for Rare Isotope Beams (FRIB), medical isotope production, space-radiation shielding, cosmic-ray propagation, and accelerator applications, including several reactions induced by radioactive isotopes, analyzing production of both stable and radioactive residual nuclei. Here, we discuss examples of validation and verification of MCNP6 by comparing with recent neutron spectra measured at the Heavy Ion Medical Accelerator in Chiba, Japan; spectra of light fragments from several reactions measured recently at GANIL, France; INFN Laboratori Nazionali del Sud, Catania, Italy; and COSY of the Jülich Research Center, Germany; and cross sections of products from several reactions measured recently at GSI, Darmstadt, Germany; ITEP, Moscow, Russia; and LANSCE, LANL, Los Alamos, U.S.A. As a rule, MCNP6 provides quite good predictions for most of the reactions we have analyzed so far, allowing us to conclude that it can be used as a reliable and useful simulation tool for FRIB, medical, and space applications involving stable and radioactive isotopes. (author)

  12. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. This report has two objectives. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors are also reviewed for completeness.
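
    A standard way to generate sample paths of a stationary stochastic process, of the kind such a report surveys, is the spectral-representation method: a sum of cosines at discretized frequencies with independent random phases. The sketch below is a generic illustration under a flat test spectrum, not the report's algorithm; all names and numbers are assumptions.

```python
import numpy as np

def spectral_samples(psd, w_max, n_terms, t, rng):
    """One sample path of a zero-mean stationary process by the
    spectral-representation (sum-of-cosines) method.

    psd : callable returning the one-sided power spectral density S(w)
    The component amplitudes sqrt(2 S(w_k) dw) make the process variance
    equal the integral of S over (0, w_max].
    """
    dw = w_max / n_terms
    w = (np.arange(n_terms) + 0.5) * dw           # midpoint frequencies
    amp = np.sqrt(2.0 * psd(w) * dw)              # amplitude per component
    phi = rng.uniform(0.0, 2.0 * np.pi, n_terms)  # independent random phases
    return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

# Example: band-limited process with S(w) = 1 for 0 < w < 2, so variance = 2
rng = np.random.default_rng(1)
t = np.linspace(0.0, 200.0, 4000)
x = spectral_samples(lambda w: np.ones_like(w), 2.0, 512, t, rng)
```

Each realization can then be fed as an input or boundary condition to a deterministic simulation code, which is exactly the workflow the abstract describes.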

  13. Simulation of Mission Phases

    Science.gov (United States)

    Carlstrom, Nicholas Mercury

    2016-01-01

    The Training Materials version 2013.0 release was used to complete the Trick tutorial. Multiple network privilege and repository permission requests were required in order to access previous simulation models. The project was also an introduction to computer programming and the Linux operating system. Basic C++ and Python syntax was used during the completion of the Trick tutorial. Trick's engineering analysis and Monte Carlo simulation capabilities were observed, and basic space mission planning procedures were applied in the conceptual design phase. Multiple professional development opportunities were completed in addition to project duties during this internship through the System for Administration, Training, and Education Resources for NASA (SATERN). Topics include: JSC Risk Management Workshop, CCP Risk Management, Basic Radiation Safety Training, X-Ray Radiation Safety, Basic Laser Safety, JSC Export Control, ISS RISE Ambassador, Basic SharePoint 2013, Space Nutrition and Biochemistry, and JSC Personal Protective Equipment. Additionally, this internship afforded the opportunity for formal project presentation and public speaking practice. This was my first experience at a NASA center. After completing this internship I have a much clearer understanding of certain aspects of the agency's processes and procedures, as well as a deeper appreciation for spaceflight simulation design and testing. I will continue to improve my technical skills so that I may have another opportunity to return to NASA and Johnson Space Center.

  14. Slewing mirror telescope of the UFFO-pathfinder: first report on performance in space

    DEFF Research Database (Denmark)

    Gaikov, G.; Jeong, S.; Agaradahalli, V. G.

    2017-01-01

    … of the UFFO-pathfinder payload, which was launched on April 28, 2016, onboard the Lomonosov satellite. For the first time, the slewing mirror system has been proven for the precision tracking of astrophysical objects during space operation. We confirmed that the SMT has a response time of 1.4 seconds to the X…

  15. A simulation of the laser interferometer space antenna data stream from galactic white dwarf binaries

    International Nuclear Information System (INIS)

    Benacquista, M J; DeGoes, J; Lunder, D

    2004-01-01

    Gravitational radiation from the galactic population of white dwarf binaries is expected to produce a background signal in the laser interferometer space antenna (LISA) frequency band. At frequencies below 1 mHz, this signal is expected to be confusion limited and has been approximated as Gaussian noise. At frequencies above about 5 mHz, the signal will consist of separable individual sources. We have produced a simulation of the LISA data stream from a population of 90k galactic binaries in the frequency range between 1 and 5 mHz. This signal is compared with the simulated signal from globular cluster populations of binaries. Notable features of the simulation as well as potential data analysis schemes for extracting information are presented
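
    The superposition of many nearly monochromatic binaries can be illustrated with a toy time-series model. Everything below, including the source count, the amplitude distribution, and the sampling choices, is an assumption for illustration rather than the paper's galactic catalog or LISA response model.

```python
import numpy as np

def binary_background(n_sources, f_lo, f_hi, t, rng):
    """Sum of nearly monochromatic sources: fixed random frequencies in
    [f_lo, f_hi] Hz, with random amplitudes and phases."""
    f = rng.uniform(f_lo, f_hi, n_sources)       # source frequencies [Hz]
    a = rng.exponential(1.0, n_sources)          # arbitrary amplitude spread
    phi = rng.uniform(0.0, 2.0 * np.pi, n_sources)
    return (a[:, None] * np.sin(2 * np.pi * np.outer(f, t) + phi[:, None])).sum(axis=0)

rng = np.random.default_rng(2)
fs = 0.1                          # 10 s sampling, ample for mHz signals
t = np.arange(0, 2**14) / fs      # ~1.9 days of data
h = binary_background(300, 1e-3, 5e-3, t, rng)
# In the frequency domain, sources separate into narrow lines once the
# observation time T resolves their frequency spacing (df = 1/T).
spec = np.abs(np.fft.rfft(h))
```

Lengthening `t` (a longer observation) narrows `df` and turns the confused low-frequency blend into individually resolvable lines, which is the qualitative transition the abstract describes between the sub-mHz and multi-mHz regimes.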

  16. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents a toolbox of space environment models in SIMULINK that facilitates the development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth gravity field, Earth magnetic field, and eclipse. The structure and facilities within the toolbox are described and exemplified using a student satellite case (AAUSAT-II). The validity of the developed models is confirmed by comparing the simulation results with realistic data obtained from the Danish …
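
    As a flavor of what such an environment model computes, here is a minimal centered-dipole approximation of the geomagnetic field magnitude, a common first cut in ADCS environment modeling. This is a sketch in Python, not the SIMULINK toolbox itself (which would typically use a more detailed model such as IGRF); the constants are textbook values and the function name is an assumption.

```python
import math

B0 = 3.12e-5   # equatorial surface field strength [T], centered dipole
RE = 6.371e6   # mean Earth radius [m]

def dipole_field_magnitude(r, mag_lat_deg):
    """|B| of a centered-dipole geomagnetic field at geocentric distance
    r [m] and magnetic latitude mag_lat_deg [degrees]:
    |B| = B0 (RE/r)^3 sqrt(1 + 3 sin^2(lat))."""
    lam = math.radians(mag_lat_deg)
    return B0 * (RE / r) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(lam) ** 2)

# LEO example: 600 km altitude at the magnetic equator vs. high latitude
b_eq = dipole_field_magnitude(RE + 600e3, 0.0)
b_pole = dipole_field_magnitude(RE + 600e3, 80.0)
```

The factor-of-two variation between equator and pole is why magnetometer-based attitude determination and magnetorquer control authority both depend strongly on orbit inclination.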

  17. Observation and simulation of space-charge effects in a radio-frequency photoinjector using a transverse multibeamlet distribution

    Directory of Open Access Journals (Sweden)

    M. Rihaoui

    2009-12-01

    We report on an experimental study of space-charge effects in a radio-frequency (rf) photoinjector. A 5 MeV electron bunch, consisting of a number of beamlets separated transversely, was generated in an rf photocathode gun and propagated in the succeeding drift space. The collective interaction of these beamlets was studied for different experimental conditions. The experiment allowed the exploration of space-charge effects and their comparison with 3D particle-in-cell simulations. Our observations also suggest the possible use of a multibeam configuration to tailor the transverse distribution of an electron beam.
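
    The collective repulsion of transversely separated beamlets can be caricatured with a toy 2D point-charge model. This is only a sketch: the paper relies on full 3D particle-in-cell simulations, and every parameter and name below is an assumption chosen for illustration.

```python
import numpy as np

def drift_with_space_charge(pos, vel, k, dt, n_steps):
    """Advance point-like beamlets in the transverse (x, y) plane under
    mutual softened-Coulomb repulsion while the bunch drifts."""
    pos = pos.copy()
    vel = vel.copy()
    for _ in range(n_steps):
        d = pos[:, None, :] - pos[None, :, :]             # pairwise separations
        r2 = (d ** 2).sum(axis=-1) + 1e-6                 # softened r^2
        np.fill_diagonal(r2, np.inf)                      # exclude self-force
        acc = k * (d / r2[..., None] ** 1.5).sum(axis=1)  # ~1/r^2 repulsion
        vel += acc * dt
        pos += vel * dt
    return pos, vel

# 3x3 grid of beamlets, initially at rest transversely (arbitrary units)
g = np.linspace(-1.0, 1.0, 3)
pos0 = np.array([(xi, yi) for xi in g for yi in g], dtype=float)
vel0 = np.zeros_like(pos0)
pos1, vel1 = drift_with_space_charge(pos0, vel0, k=0.01, dt=0.05, n_steps=200)
```

Even this crude model shows the beamlet pattern blowing up during the drift while total transverse momentum stays zero, the qualitative collective behavior the experiment probes and a 3D PIC code resolves quantitatively.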

  18. Proceedings of the first symposium on Monte Carlo simulation

    International Nuclear Information System (INIS)

    2001-01-01

    The first symposium on Monte Carlo simulation was held at the Mitsubishi Research Institute, Otemachi, Tokyo, on the 10th and 11th of September, 1998. The symposium was organized by the Nuclear Code Research Committee at the Japan Atomic Energy Research Institute. In the sessions, 21 papers were presented orally on code development, parallel calculation, reactor physics, burn-up, criticality, shielding safety, dose evaluation, nuclear fusion reactors, thermonuclear fusion plasma, nuclear transmutation, electromagnetic cascades, and fuel cycle facilities. The presented papers are compiled in these proceedings, and each of the 21 papers is indexed individually. (J.P.N.)

  19. Fifty Years of Lightning Observations from Space

    Science.gov (United States)

    Christian, H. J., Jr.

    2017-12-01

    Some of the earliest satellites, starting with OSO (1965), ARIEL (1967), and RAE (1968), detected lightning using either optical or RF sensors, although that was not their intent. One of the earliest instruments designed to detect lightning was the PBE (1977). The use of space to study lightning activity has exploded since these early days. The advent of focal-plane imaging arrays made it possible to develop high-performance optical lightning sensors. Prior to the use of charge-coupled devices (CCDs), most space-based lightning sensors used only a few photodiodes, which limited the location accuracy and detection efficiency (DE) of the instruments. With CCDs, one can limit the field of view of each detector (pixel) and thus improve the signal-to-noise ratio over single detectors that summed the light reflected from many clouds together with the lightning produced by a single cloud. This pixelization enabled daytime DE to increase from a few percent to close to 90%. The OTD (1995) and the LIS (1997) were the first lightning sensors to utilize focal-plane arrays. Together they detected global lightning activity for more than twenty years, providing the first detailed information on the distribution of global lightning and its variability. The FORTE satellite was launched shortly after LIS and became the first dedicated satellite to simultaneously measure RF and optical lightning emissions. It too used a CCD focal plane to detect and locate lightning. In November 2016, the GLM became the first lightning instrument in geostationary orbit. Shortly thereafter, China placed its GLI in orbit. Lightning sensors in geostationary orbit significantly increase the value of space-based observations. For the first time, lightning activity can be monitored continuously over large areas of the Earth with high, uniform DE and location accuracy. In addition to observing standard lightning, a number of sensors have been placed in orbit to detect transient luminous events and

  20. SPACE code simulation of ATLAS DVI line break accident test (SB DVI 08 Test)

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Sang Gyu [KHNP, Daejeon (Korea, Republic of)

    2012-10-15

    APR1400 has adopted new safety design features: four mechanically independent DVI (Direct Vessel Injection) systems and a fluidic device in the safety injection tanks (SITs). Hence, the DVI line break accident has to be evaluated as one of the small break LOCAs (SBLOCAs) to ensure the safety of APR1400. KAERI has performed a DVI line break test (SB DVI 08) using the ATLAS (Advanced Thermal Hydraulic Test Loop for Accident Simulation) facility, an integral effect test facility for APR1400. The test result shows that the core collapsed water level decreased before a loop seal clearance, so that core uncovery occurred. At this time, the peak cladding temperature (PCT) increased rapidly even though emergency core cooling (ECC) water was injected from the safety injection pump (SIP). This test result is useful for supporting safety analysis with a thermal hydraulic safety analysis code and increases the understanding of SBLOCA phenomena in APR1400. The SBLOCA evaluation methodology for APR1400 is now being developed using the SPACE code. The objective of this methodology development is to establish a conservative evaluation methodology in accordance with Appendix K of 10 CFR 50. The ATLAS SB DVI 08 test was selected for the evaluation of the SBLOCA methodology using the SPACE code. Before applying the conservative models and correlations, a benchmark calculation of the test was performed with the best estimate models and correlations to verify the SPACE code's capability. This paper deals with the benchmark calculation results of the ATLAS SB DVI 08 test: calculated results for the major hydraulic variables are compared with measured data, assessing the SPACE code's performance in simulating this integral effect test of an SBLOCA.