Sample records for twenty-first century space simulation

  1. The twenty-first century in space

    CERN Document Server

    Evans, Ben


    This final entry in the History of Human Space Exploration mini-series by Ben Evans continues with an in-depth look at the latter part of the 20th century and the start of the new millennium. Picking up where Partnership in Space left off, the story commemorating the evolution of manned space exploration unfolds in further detail. More than fifty years after Yuri Gagarin’s pioneering journey into space, Evans extends his overview of how that momentous voyage continued through the decades which followed. The Twenty-first Century in Space, the sixth book in the series, explores how the fledgling partnership between the United States and Russia in the 1990s gradually bore fruit and laid the groundwork for today’s International Space Station. The narrative follows the convergence of the Shuttle and Mir programs, together with standalone missions, including servicing the Hubble Space Telescope, many of whose technical and human lessons enabled the first efforts to build the ISS in orbit. The book also looks to...

  2. The twenty-first century commercial space imperative

    CERN Document Server

    Young, Anthony


    Young addresses the impressive expansion across existing and developing commercial space business markets, with multiple private companies competing in the payload launch services sector. The author pinpoints the new markets, technologies, and players in the industry, as well as highlighting the overall reasons why it is important for us to develop space. NASA now relies on commercial partners to supply cargo and crew spacecraft and services to and from the International Space Station. The sizes of satellites are diminishing and their capabilities expanding, while costs to orbit are decreasing. Suborbital space tourism holds the potential of new industries and jobs. Commercial space exploration of the Moon and the planets also holds promise. All this activity is a catalyst for anyone interested in joining the developing space industry, from students and researchers to engineers and entrepreneurs. As more and more satellites and rockets are launched and the business of space is expanding at a signifi...

  3. United States Military Space: Into the Twenty-First Century (United States)


    famous and articulate spokesmen for planetary science; Pale Blue Dot : A Vision of the Human Future in Space (New York: Random House, 1994) was one...and defining human characteristic. Carl Sagan is a primary spokesman for those who view spaceflight in scientific and ecological terms and see it as...Spacefaring Civilization (New York: Jeremy P. Tarcher/Putnam, 1999). Carl Sagan cofounded the Planetary Society in 1980 and was one of the most

  4. Automation and robotics for Space Station in the twenty-first century (United States)

    Willshire, K. F.; Pivirotto, D. L.


    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs, including several proposed ground and flight experiments to enable development of this system, are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly autonomous telerobotic systems.

  5. Space power technology for the twenty-first century (SPT21)

    International Nuclear Information System (INIS)

    Borger, W.U.; Massie, L.D.


    During the spring and summer months of 1987, the Aero Propulsion Laboratory of the Air Force Wright Aeronautical Laboratories, Wright-Patterson AFB, Ohio, in cooperation with the Air Force Space Technology Center at Kirtland AFB, New Mexico, undertook an initiative to develop a Strategic Plan for Space Power Technology Development. The initiative was called SPT21, Space Power Technology for the Twenty-First Century. The planning process involved the participation of other Government organizations (U.S. Army, Navy, DOE and NASA) along with major aerospace companies and universities. Following an SPT21 kickoff meeting on 28 May 1987, detailed strategic planning was accomplished through seven Space Power Technology Discipline Workshops commencing in June 1987 and concluding in August 1987. Technology Discipline Workshops were conducted in the following areas: (1) Solar Thermal Dynamic Power Systems, (2) Solar Photovoltaic Cells and Arrays, (3) Thermal Management Technology, (4) Energy Storage Technology, (5) Nuclear Power Systems Technology, (6) Power Conditioning, Distribution and Control, and (7) Systems Technology/Advanced Concepts. This technical paper summarizes the planning process and describes the salient findings and conclusions of the workshops.

  6. CLARREO shortwave observing system simulation experiments of the twenty-first century: Simulator design and implementation

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, D.R.; Algieri, C.A.; Ong, J.R.; Collins, W.D.


    Projected changes in the Earth system will likely be manifested in changes in reflected solar radiation. This paper introduces an operational Observing System Simulation Experiment (OSSE) to calculate the signals of future climate forcings and feedbacks in top-of-atmosphere reflectance spectra. The OSSE combines simulations from the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report for the NCAR Community Climate System Model (CCSM) with the MODTRAN radiative transfer code to calculate reflectance spectra for simulations of current and future climatic conditions over the 21st century. The OSSE produces narrowband reflectances and broadband fluxes, the latter of which have been extensively validated against archived CCSM results. The shortwave reflectance spectra contain atmospheric features including signals from water vapor, liquid and ice clouds, and aerosols. The spectra are also strongly influenced by the surface bidirectional reflectance properties of predicted snow and sea ice and the climatological seasonal cycles of vegetation. By comparing and contrasting simulated reflectance spectra based on emissions scenarios with increasing projected and fixed present-day greenhouse gas and aerosol concentrations, we find that prescribed forcings from increases in anthropogenic sulfate and carbonaceous aerosols are detectable and are spatially confined to lower latitudes. Also, changes in the intertropical convergence zone and poleward shifts in the subsidence zones and the storm tracks are all detectable along with large changes in snow cover and sea ice fraction. These findings suggest that the proposed NASA Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission to measure shortwave reflectance spectra may help elucidate climate forcings, responses, and feedbacks.

  7. Simulating care: technology-mediated learning in twenty-first century nursing education. (United States)

    Diener, Elizabeth; Hobbs, Nelda


    The increased reliance on simulation classrooms has proven successful for the learning of skills. Questions persist concerning the ability of technology-driven robotic devices to form and cultivate caring behaviors, or to sufficiently develop the interactive nurse-client communication necessary in the context of nursing. This article examines the disconnects created by the use of simulation technology in nursing education, raising the question: "Can learning of caring-as-being be facilitated in simulation classrooms?" We propose that unless time is spent with human beings in the earliest stages of nursing education, transpersonal caring relationships do not have space to develop. Learning, crafting, and maturation of caring behaviors threaten to become a serendipitous event or to no longer be perceived as an essential characteristic of nursing. Technology does not negate caring: the isolation it fosters makes transpersonal caring all the more important. We are called to create a new paradigm for nursing education that merges Nightingale's vision with technology's promise. © 2012 Wiley Periodicals, Inc.

  8. Twenty-first century quantum mechanics Hilbert space to quantum computers mathematical methods and conceptual foundations

    CERN Document Server

    Fano, Guido


    This book is designed to make accessible to nonspecialists the still evolving concepts of quantum mechanics and the terminology in which these are expressed. The opening chapters summarize elementary concepts of twentieth century quantum mechanics and describe the mathematical methods employed in the field, with clear explanation of, for example, Hilbert space, complex variables, complex vector spaces and Dirac notation, and the Heisenberg uncertainty principle. After detailed discussion of the Schrödinger equation, subsequent chapters focus on isotropic vectors, used to construct spinors, and on conceptual problems associated with measurement, superposition, and decoherence in quantum systems. Here, due attention is paid to Bell’s inequality and the possible existence of hidden variables. Finally, progression toward quantum computation is examined in detail: if quantum computers can be made practicable, enormous enhancements in computing power, artificial intelligence, and secure communication will result...

  9. The Space Nuclear Thermal Propulsion Program: Propulsion for the twenty first century

    International Nuclear Information System (INIS)

    Bleeker, G.; Moody, J.; Kesaree, M.


    As mission requirements approach the limits of chemical propulsion systems, new engines must be investigated that can meet the advanced mission requirements of higher payload fractions, higher velocities, and consequently higher specific impulses (Isp). The propulsion system that can meet these high demands is a nuclear thermal rocket engine. This engine generates thrust by expanding hydrogen, heated by the energy derived from the fission process in a reactor, through a nozzle. The Department of Defense (DoD) initiated a new nuclear rocket development program in 1987 for ballistic missile defense applications. The Space Nuclear Thermal Propulsion (SNTP) Program, which seeks to improve on the technology of ROVER/NERVA, grew out of this beginning and has been managed by the Air Force, with the involvement of DOE and NASA. The goal of the SNTP Program is to develop an engine to meet potential Air Force requirements for upper-stage engines, bimodal propulsion/power applications, and orbital transfer vehicles, as well as NASA requirements for possible missions to the Moon and Mars. Throughout the life of the program, the DoD has considered safety to be of paramount importance and is following all national environmental policies.

  10. Twenty first century climate change as simulated by European climate models

    International Nuclear Information System (INIS)

    Cubasch, Ulrich


    Full text: Climate change simulation results for seven European state-of-the-art climate models, participating in the European research project ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts), will be presented. Models from Norway, France, Germany, Denmark, and Great Britain, representing a sub-ensemble of the models contributing to the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4), are included. Climate simulations are conducted with all the models for present-day climate and for future climate under the SRES A1B, A2, and B1 scenarios. The design of the simulations follows the guidelines of the IPCC AR4. The 21st century projections are compared to the corresponding present-day simulations. The ensemble mean global mean near-surface temperature rise for the year 2099 compared to the 1961-1990 period amounts to 3.2 K for the A1B scenario, 4.1 K for the A2 scenario, and 2.1 K for the B1 scenario. The spatial patterns of temperature change are robust among the contributing models, with the largest temperature increase over the Arctic in boreal winter, stronger warming over land than over ocean, and little warming over the southern oceans. The ensemble mean globally averaged precipitation increases for the three scenarios (5.6%, 5.7%, and 3.8% for scenarios A1B, A2, and B1, respectively). The precipitation signals of the different models display a larger spread than the temperature signals. In general, precipitation increases in the Intertropical Convergence Zone and the mid- to high latitudes (most pronounced during the hemispheric winter) and decreases in the subtropics. Sea-level pressure decreases over the polar regions in all models and all scenarios, which is mainly compensated by a pressure increase in the subtropical highs. These changes imply an intensification of the Southern and Northern Annular Modes.

  11. Greenland Surface Mass Balance as Simulated by the Community Earth System Model. Part II: Twenty-First-Century Changes

    NARCIS (Netherlands)

    Vizcaino, M.; Lipscomb, W.H.; Sacks, W.J.; van den Broeke, M.R.


    This study presents the first twenty-first-century projections of surface mass balance (SMB) changes for the Greenland Ice Sheet (GIS) with the Community Earth System Model (CESM), which includes a new ice sheet component. For glaciated surfaces, CESM includes a sophisticated calculation of energy

  12. Space science in the twenty-first century: imperatives for the decades 1995 to 2015 : life sciences

    National Research Council Canada - National Science Library


    Early in 1984, NASA asked the Space Science Board to undertake a study to determine the principal scientific issues that the disciplines of space science would face during the period from about 1995 to 2015...

  13. Changes of climate regimes during the last millennium and the twenty-first century simulated by the Community Earth System Model (United States)

    Huang, Wei; Feng, Song; Liu, Chang; Chen, Jie; Chen, Jianhui; Chen, Fahu


    This study examines the shifts in terrestrial climate regimes using the Köppen-Trewartha (K-T) climate classification by analyzing the Community Earth System Model Last Millennium Ensemble (CESM-LME) simulations for the period 850-2005 and CESM Medium Ensemble (CESM-ME), CESM Large Ensemble (CESM-LE) and CESM with fixed aerosols Medium Ensemble (CESM-LE_FixA) simulations for the period 1920-2080. We compare K-T climate types from the Medieval Climate Anomaly (MCA) (950-1250) with the Little Ice Age (LIA) (1550-1850), from present day (PD) (1971-2000) with the last millennium (LM) (850-1850), and from the future (2050-2080) with the LM in order to place anthropogenic changes in the context of changes due to natural forcings occurring during the last millennium. For CESM-LME, we focused on the simulations with all forcings, though the impacts of individual forcings (e.g., solar activity, volcanic eruptions, greenhouse gases, aerosols and land use changes) were also analyzed. We found that the climate types changed slightly between the MCA and the LIA due to weak changes in temperature and precipitation. The climate type changes in PD relative to the last millennium have been largely driven by greenhouse gas-induced warming, but anthropogenic aerosols have also played an important role on regional scales. By the end of the twenty-first century, anthropogenic forcing has a much greater effect on climate types than at PD. As aerosol emissions are reduced, greenhouse gas-induced warming will strengthen further in the future. Changes in climate types are dominated by greenhouse gas-induced warming rather than by changes in precipitation. The large shift in climate types by the end of this century suggests a possible widespread redistribution of surface vegetation and a significant change in species distributions.

  14. Twenty-first century vaccines (United States)

    Rappuoli, Rino


    In the twentieth century, vaccination has been possibly the greatest revolution in health. Together with hygiene and antibiotics, vaccination led to the elimination of many childhood infectious diseases and contributed to the increase in disability-free life expectancy that in Western societies rose from 50 to 78–85 years (Crimmins, E. M. & Finch, C. E. 2006 Proc. Natl Acad. Sci. USA 103, 498–503; Kirkwood, T. B. 2008 Nat. Med 10, 1177–1185). In the twenty-first century, vaccination will be expected to eliminate the remaining childhood infectious diseases, such as meningococcal meningitis, respiratory syncytial virus, group A streptococcus, and will address the health challenges of this century such as those associated with ageing, antibiotic resistance, emerging infectious diseases and poverty. However, for this to happen, we need to increase the public trust in vaccination so that vaccines can be perceived as the best insurance against most diseases across all ages. PMID:21893537

  15. Twenty-first Century Space Science in The Urban High School Setting: The NASA/John Dewey High School Educational Outreach Partnership (United States)

    Fried, B.; Levy, M.; Reyes, C.; Austin, S.


    A unique and innovative partnership has recently developed between NASA and John Dewey High School, infusing Space Science into the curriculum. This partnership builds on an existing relationship with MUSPIN/NASA and their regional center at the City University of New York based at Medgar Evers College. As an outgrowth of the success and popularity of our Remote Sensing Research Program, sponsored by the New York State Committee for the Advancement of Technology Education (NYSCATE), and the National Science Foundation and stimulated by MUSPIN-based faculty development workshops, our science department has branched out in a new direction - the establishment of a Space Science Academy. John Dewey High School, located in Brooklyn, New York, is an innovative inner city public school with students of a diverse multi-ethnic population and a variety of economic backgrounds. Students were recruited from this broad spectrum, which covers the range of learning styles and academic achievement. This collaboration includes students of high, average, and below average academic levels, emphasizing participation of students with learning disabilities. In this classroom without walls, students apply the strategies and methodologies of problem-based learning in solving complicated tasks. The cooperative learning approach simulates the NASA method of problem solving, as students work in teams, share research and results. Students learn to recognize the complexity of certain tasks as they apply Earth Science, Mathematics, Physics, Technology and Engineering to design solutions. Their path very much follows the NASA model as they design and build various devices. Our Space Science curriculum presently consists of a one-year sequence of elective classes taken in conjunction with Regents-level science classes. This sequence consists of Remote Sensing, Planetology, Mission to Mars (NASA sponsored research program), and Microbiology, where future projects will be astronomy related. This

  16. Capital in the Twenty-First Century

    DEFF Research Database (Denmark)

    Hansen, Per H.


    Review essay on: Capital in the Twenty-First Century. By Thomas Piketty. Translated by Arthur Goldhammer. Cambridge, Mass.: The Belknap Press of Harvard University Press, 2014. viii + 685 pp.

  17. Early twenty-first-century droughts during the warmest climate

    Directory of Open Access Journals (Sweden)

    Felix Kogan


    The first 13 years of the twenty-first century have begun with a series of widespread, long and intensive droughts around the world. Extreme and severe-to-extreme intensity droughts covered 2%–6% and 7%–16% of the world land, respectively, affecting environment, economies and humans. These droughts reduced agricultural production, leading to food shortages, human health deterioration, poverty, regional disturbances, population migration and death. This feature article is a travelogue of the twenty-first-century global and regional droughts during the warmest years of the past 100 years. These droughts were identified and monitored with the National Oceanic and Atmospheric Administration operational space technology, called vegetation health (VH), which has the longest period of observation and provides good data quality. The VH method was used for assessment of vegetation condition or health, including drought early detection and monitoring. The VH method is based on operational satellite data estimating both land surface greenness (NDVI) and thermal conditions. The twenty-first-century droughts in the USA, Russia, Australia and the Horn of Africa were intensive, long, covered large areas and caused huge losses in agricultural production, which affected food security and led to food riots in some countries. This research also investigates drought dynamics, presenting no definite conclusion about drought intensification and/or expansion during the time of the warmest globe.
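The VH approach named in this record blends a greenness signal (NDVI) with a thermal one. The sketch below is a minimal illustration of that idea, assuming Kogan-style condition indices scaled between multi-year climatological extremes and an equal-weight blend; the operational NOAA product uses smoothed weekly climatologies and its own weighting, so every constant here is an assumption for illustration only.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red reflectance."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def vci(ndvi_now, ndvi_min, ndvi_max):
    """Vegetation Condition Index: current NDVI scaled against its multi-year
    min/max for the same place and time of year (0 = worst seen, 100 = best)."""
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt_now, bt_min, bt_max):
    """Temperature Condition Index: hotter than the historical range -> lower
    index, reflecting thermal stress on vegetation."""
    return 100.0 * (bt_max - bt_now) / (bt_max - bt_min)

def vh(vci_val, tci_val, alpha=0.5):
    """Vegetation Health as a weighted blend of greenness and thermal condition
    (alpha = 0.5 is an assumed equal weighting)."""
    return alpha * vci_val + (1.0 - alpha) * tci_val

# A pixel whose NDVI sits near its historical minimum while its brightness
# temperature sits near its historical maximum scores low, i.e. drought-stressed.
n = ndvi(nir=0.30, red=0.25)
score = vh(vci(n, ndvi_min=0.05, ndvi_max=0.60), tci(305.0, bt_min=290.0, bt_max=308.0))
print(round(score, 1))  # a low value on the 0-100 scale
```

Low VH values (often below about 40 in VH-style products) are commonly read as vegetation stress, though the exact threshold depends on the application.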

  18. Increasing precipitation volatility in twenty-first-century California (United States)

    Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex


    Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood, of which California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's 'Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.
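As a rough illustration of what counting "dry-to-wet" precipitation whiplash involves, the sketch below flags years falling at or below an assumed dry quantile that are immediately followed by years at or above an assumed wet quantile. The study's actual event definition (sub-seasonal windows, 1862-flood analogues, ensemble statistics across CESM-LE members) is considerably more involved, so the quantile thresholds and the synthetic data here are placeholders.

```python
import numpy as np

def dry_to_wet_events(precip, dry_q=0.2, wet_q=0.8):
    """Count year-to-year swings from an unusually dry year (at or below the
    dry_q quantile) straight to an unusually wet year (at or above wet_q)."""
    precip = np.asarray(precip, dtype=float)
    dry_thr = np.quantile(precip, dry_q)
    wet_thr = np.quantile(precip, wet_q)
    dry = precip[:-1] <= dry_thr   # year t is dry
    wet = precip[1:] >= wet_thr    # year t+1 is wet
    return int(np.sum(dry & wet))

# Synthetic annual precipitation (mm) with one sharp dry-to-wet swing,
# loosely echoing California's 2012-2016 drought ending in the wet 2016-17 winter.
years = [520.0, 480.0, 300.0, 250.0, 240.0, 900.0, 610.0, 550.0, 470.0, 630.0]
print(dry_to_wet_events(years))  # 1
```

Applied per ensemble member and per future period, a count like this is the kind of statistic whose change relative to the historical baseline the record summarizes as a 25% to 100% increase.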

  19. Teachers' Critical Reflective Practice in the Context of Twenty-First Century Learning (United States)

    Benade, Leon


    In the twenty-first century, learning and teaching at school must prepare young people for engaging in a complex and dynamic world deeply influenced by globalisation and the revolution in digital technology. In addition to the use of digital technologies comes the development of flexible learning spaces. It is claimed that these developments demand,…

  20. Digital earth applications in the twenty-first century

    NARCIS (Netherlands)

    de By, R.A.; Georgiadou, P.Y.


    In these early years of the twenty-first century, we must look at how the truly cross-cutting information technology supports other innovations, and how it will fundamentally change the information positions of government, private sector and the scientific domain as well as the citizen. In those

  1. Afterword: Victorian Sculpture for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    David J. Getsy


    Commenting on the directions proposed by this issue of '19', the afterword discusses the broad trends in twenty-first century studies of Victorian sculpture and the opportunity for debate arising from the first attempt at a comprehensive exhibition.


  2. Seapower: A Guide for the Twenty-First Century

    African Journals Online (AJOL)


    $154,37 (hardback). With the publication of Seapower: A Guide for the Twenty-First Century, Geoffrey Till has set the standard for publications on all things maritime. The updated and expanded new edition of the book is an essential guide for students of naval history and maritime strategy and provides ...

  3. Proceedings of the twenty-first LAMPF users group meeting

    International Nuclear Information System (INIS)


    The Twenty-First Annual LAMPF Users Group Meeting was held November 9-10, 1987, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities.

  4. Why the American public supports twenty-first century learning. (United States)

    Sacconaghi, Michele


    Aware that constituent support is essential to any educational endeavor, the AOL Time Warner Foundation (now the Time Warner Foundation), in conjunction with two respected national research firms, measured Americans' attitudes toward the implementation of twenty-first century skills. The foundation's national research survey was intended to explore public perceptions of the need for changes in the educational system, in school and after school, with respect to the teaching of twenty-first century skills. The author summarizes the findings of the survey, which were released by the foundation in June 2003. One thousand adults were surveyed by telephone, including African Americans, Latinos, teachers, and business executives. In general, the survey found that Americans believe today's students need a "basics-plus" education, meaning communication, technology, and critical thinking skills in addition to the traditional basics of reading, writing, and math. In fact, 92 percent of respondents stated that students today need different skills from those of ten to twenty years ago. Also, after-school programs were found to be an appropriate vehicle to teach these skills. Furthermore, the survey explored how well the public perceives schools to be preparing youth for the workforce and postsecondary education, which twenty-first century skills are seen as being taught effectively, and the level of need for after-school and summer programs. The survey results provide conclusive evidence of national support for basics-plus education. Thus, a clear opportunity exists to build momentum for a new model of education for the twenty-first century.

  5. Technological sciences society of the twenty-first century

    International Nuclear Information System (INIS)


    This book introduces the information-oriented society of the twenty-first century, connected by computer networks, covering, for example: the "memory of dreams", F-RAM; new media in the information-oriented society; ISDN, the communications network of the next generation (what is ISDN?); the development of the information service industry; from office automation to the intelligent building of the future; home shopping and home banking; and the obstacles that hinder the information-oriented society.

  6. NATO’s Relevance in the Twenty-First Century (United States)


    Jones, John K.


    Christopher Coker, Globalisation and Insecurity in the Twenty-first Century: NATO and the Management of Risk (The International Institute for Strategic

  7. Designing Vaccines for the Twenty-First Century Society


    Finco, Oretta; Rappuoli, Rino


    The history of vaccination clearly demonstrates that vaccines have been highly successful in preventing infectious diseases, reducing significantly the incidence of childhood diseases and mortality. However, many infections are still not preventable with the currently available vaccines and they represent a major cause of mortality worldwide. In the twenty-first century, the innovation brought by novel technologies in antigen discovery and formulation together with a deeper knowledge of the h...

  8. Twenty-first-century medical microbiology services in the UK. (United States)

    Duerden, Brian


    With infection once again a high priority for the UK National Health Service (NHS), the medical microbiology and infection-control services require increased technology resources and more multidisciplinary staff. Clinical care and health protection need a coordinated network of microbiology services working to consistent standards, provided locally by NHS Trusts and supported by the regional expertise and national reference laboratories of the new Health Protection Agency. Here, I outline my thoughts on the need for these new resources and the ways in which clinical microbiology services in the UK can best meet the demands of the twenty-first century.

  9. Accelerators for the twenty-first century a review

    CERN Document Server

    Wilson, Edmund J N


    The development of the synchrotron, and later the storage ring, was based upon the electrical technology at the turn of this century, aided by the microwave radar techniques of World War II. This method of acceleration seems to have reached its limit. Even superconductivity is not likely to lead to devices that will satisfy physics needs into the twenty-first century. Unless a new principle for accelerating elementary particles is discovered soon, it is difficult to imagine that high-energy physics will continue to reach out to higher energies and luminosities.

  10. The Turn to Precarity in Twenty-First Century Fiction

    Directory of Open Access Journals (Sweden)

    Morrison Jago


    Recent years have seen several attempts by writers and critics to understand the changed sensibility in post-9/11 fiction through a variety of new -isms. This essay explores this cultural shift in a different way, finding a ‘turn to precarity’ in twenty-first century fiction characterised by a renewal of interest in the flow and foreclosure of affect, the resurgence of questions about vulnerability and our relationships to the other, and a heightened awareness of the social dynamics of seeing. The essay draws these tendencies together via the work of Judith Butler in Frames of War, in an analysis of Trezza Azzopardi’s quasi-biographical study of precarious life, Remember Me.

  11. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.


    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Individual papers have been cataloged separately. This document, Volume 2, presents papers on severe accident research.

  12. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.


    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database.

  13. The Dialectics of Discrimination in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    John Stone


    Full Text Available This article explores some of the latest developments in the scholarship on race relations and nationalism that seek to address the impact of globalization and the changed geo-political relations of the first decade of the twenty-first century. New patterns of identification, some of which challenge existing group boundaries and others that reinforce them, can be seen to flow from the effects of global market changes and the political counter-movements against them. The impact of the “war on terrorism”, the limits of the utility of hard power, and the need for new mechanisms of inter-racial and inter-ethnic conflict resolution are evaluated to emphasize the complexity of these group relations in the new world disorder.

  14. Space plasma simulation chamber

    International Nuclear Information System (INIS)


    Scientific results of experiments and tests of instruments performed with the Space Plasma Simulation Chamber and its facility are reviewed in the following six categories: 1. Tests of instruments on board rockets, satellites and balloons. 2. Plasma wave experiments. 3. Measurements of plasma particles. 4. Optical measurements. 5. Plasma production. 6. Space plasma simulations. This facility has been managed by the Laboratory Space Plasma Committee since 1969 and used by scientists in cooperative programs with universities and institutes all over the country. A list of publications is attached. (author)

  15. Strategies for Teaching Maritime Archaeology in the Twenty First Century (United States)

    Staniforth, Mark


    Maritime archaeology is a multi-faceted discipline that requires both theoretical learning and practical skills training. In the past most universities have approached the teaching of maritime archaeology as a full-time on-campus activity designed for ‘traditional’ graduate students; primarily those in their early twenties who have recently come from full-time undergraduate study and who are able to study on-campus. The needs of mature-age and other students who work and live in different places (or countries) and therefore cannot attend lectures on a regular basis (or at all) have largely been ignored. This paper provides a case study in the teaching of maritime archaeology from Australia that, in addition to ‘traditional’ on-campus teaching, includes four main components: (1) learning field methods through field schools; (2) skills training through the AIMA/NAS avocational training program; (3) distance learning topics available through CD-ROM and using the Internet; and (4) practicums, internships and fellowships. The author argues that programs to teach maritime archaeology in the twenty-first century need to be flexible and to address the diverse needs of students who do not fit the ‘traditional’ model. This involves collaborative partnerships with other universities as well as government underwater cultural heritage management agencies and museums, primarily through field schools, practicums and internships.

  16. Nuclear energy into the twenty-first century

    International Nuclear Information System (INIS)

    Hammond, G.P.


    The historical development of the civil nuclear power generation industry is examined in the light of the need to meet conflicting energy-supply and environmental pressures over recent decades. It is suggested that fission (thermal and fast) reactors will dominate the market up to the period 2010-2030, with fusion being relegated to the latter part of the twenty-first century. A number of issues affecting the use of nuclear electricity generation in Western Europe are considered including its cost, industrial strategy needs, and the public acceptability of nuclear power. The contribution of nuclear power stations to achieving CO2 targets aimed at relieving global warming is discussed in the context of alternative strategies for sustainable development, including renewable energy sources and energy-efficiency measures. Trends in the generation of nuclear electricity from fission reactors are finally considered in terms of the main geopolitical groupings that make up the world in the mid-1990s. Several recent, but somewhat conflicting, forecasts of the role of nuclear power in the fuel mix to about 2020 are reviewed. It is argued that the only major expansion in generating capacity will take place on the Asia-Pacific Rim and not in the developing countries generally. Nevertheless, the global nuclear industry overall will continue to be dominated by a small number of large nuclear electricity generating countries; principally the USA, France and Japan. (UK)

  17. The twenty-first century challenges to sexuality and religion. (United States)

    Turner, Yolanda; Stayton, William


    Clergy and religious leaders are facing a wide variety of sexual needs and concerns within their faith communities. Conflicts over sexual issues are growing across the entire spectrum of religious denominations, and clerics remain ill prepared to deal with them. As religious communities work to remain influential in public policy debates, clergy and the institutions that train them need to be properly prepared for twenty-first century challenges that impact sexuality and religion. Clergy are often the first point of contact for sexual problems and concerns of their faith community members: complex issues centered on morals, spirituality, and ethics. Yet, there still exists a significant lack of sexual curricula in the programs that are educating our future religious leaders. The resulting paucity of knowledge leaves these leaders unprepared to address the needs and concerns of their congregants. However, with accurate, relevant human sexuality curricula integrated into theological formation programs, future leaders will be equipped to competently serve their constituencies. This paper provides a rationale for the need for such training, an overview of the faith- and theology-based history of a pilot training project, and a description of how the Christian faith and the social sciences intersect in a training pilot project's impetus and process.

  18. Twenty-first workshop on geothermal reservoir engineering: Proceedings

    Energy Technology Data Exchange (ETDEWEB)



    PREFACE The Twenty-First Workshop on Geothermal Reservoir Engineering was held at the Holiday Inn, Palo Alto on January 22-24, 1996. There were one hundred fifty-five registered participants. Participants came from twenty foreign countries: Argentina, Austria, Canada, Costa Rica, El Salvador, France, Iceland, Indonesia, Italy, Japan, Mexico, The Netherlands, New Zealand, Nicaragua, the Philippines, Romania, Russia, Switzerland, Turkey and the UK. The performance of many geothermal reservoirs outside the United States was described in several of the papers. Professor Roland N. Horne opened the meeting and welcomed visitors. The keynote speaker was Marshall Reed, who gave a brief overview of the Department of Energy's current plan. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions concerning: reservoir assessment, modeling, geology/geochemistry, fracture modeling, hot dry rock, geoscience, low enthalpy, injection, well testing, drilling, adsorption and stimulation. Session chairmen were major contributors to the workshop, and we thank: Ben Barker, Bobbie Bishop-Gollan, Tom Box, Jim Combs, John Counsil, Sabodh Garg, Malcolm Grant, Marcelo Lippmann, Jim Lovekin, John Pritchett, Marshall Reed, Joel Renner, Subir Sanyal, Mike Shook, Alfred Truesdell and Ken Williamson. Jim Lovekin gave the post-dinner speech at the banquet and highlighted the exciting developments in the geothermal field which are taking place worldwide. The Workshop was organized by the Stanford Geothermal Program faculty, staff, and graduate students. We wish to thank our students who operated the audiovisual equipment. Shaun D. Fitzgerald, Program Manager.

  19. Space robot simulator vehicle (United States)

    Cannon, R. H., Jr.; Alexander, H.


    A Space Robot Simulator Vehicle (SRSV) was constructed to model a free-flying robot capable of doing construction, manipulation and repair work in space. The SRSV is intended as a test bed for development of dynamic and static control methods for space robots. The vehicle is built around a two-foot-diameter air-cushion vehicle that carries batteries, power supplies, gas tanks, computer, reaction jets and radio equipment. It is fitted with one or two two-link manipulators, which may be of many possible designs, including flexible-link versions. Both the vehicle body and its first arm are nearly complete. Inverse dynamic control of the robot's manipulator has been successfully simulated using equations generated by the dynamic simulation package SDEXACT. In this mode, the position of the manipulator tip is controlled not by fixing the vehicle base through thruster operation, but by controlling the manipulator joint torques to achieve the desired tip motion, while allowing for the free motion of the vehicle base. One of the primary goals is to minimize use of the thrusters in favor of intelligent control of the manipulator. Ways to reduce the computational burden of control are described.
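The inverse-dynamics mode described above computes joint torques directly from the desired joint accelerations rather than fixing the base with thrusters. A minimal gravity-free sketch for a planar two-link arm with point masses at the link tips, assuming a fixed base (the SRSV additionally accounts for the free motion of the vehicle base, which this illustration omits; all masses and lengths are hypothetical):

```python
import math

def joint_torques(q, dq, ddq, m1, m2, l1, l2):
    """Inverse dynamics of a gravity-free planar two-link arm with point
    masses m1, m2 at the link tips: tau = M(q) * ddq + C(q, dq)."""
    c2, s2 = math.cos(q[1]), math.sin(q[1])
    # Symmetric inertia matrix M(q)
    m11 = (m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * c2
    m12 = m2 * l2**2 + m2 * l1 * l2 * c2
    m22 = m2 * l2**2
    # Coriolis/centrifugal terms (no gravity term in orbit)
    cor1 = -m2 * l1 * l2 * s2 * (2 * dq[0] * dq[1] + dq[1]**2)
    cor2 = m2 * l1 * l2 * s2 * dq[0]**2
    tau1 = m11 * ddq[0] + m12 * ddq[1] + cor1
    tau2 = m12 * ddq[0] + m22 * ddq[1] + cor2
    return tau1, tau2

# Torque needed to accelerate joint 1 at 1 rad/s^2 from rest, arm straight
print(joint_torques([0.0, 0.0], [0.0, 0.0], [1.0, 0.0], 1.0, 1.0, 1.0, 1.0))
```

With the free-flying base included, the same idea applies but the inertia matrix is augmented by the base's coupled dynamics, so arm motion induces base recoil that the controller must anticipate.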

  20. Virtual Reality: Teaching Tool of the Twenty-First Century? (United States)

    Hoffman, Helene; Vu, Dzung


    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  1. Virtual reality: teaching tool of the twenty-first century? (United States)

    Hoffman, H; Vu, D


    Virtual reality (VR) is gaining recognition for its enormous educational potential. While not yet in the mainstream of academic medical training, many prototype and first-generation VR applications are emerging, with target audiences ranging from first- and second-year medical students to residents in advanced clinical training. Visualization tools that take advantage of VR technologies are being designed to provide engaging and intuitive environments for learning visually and spatially complex topics such as human anatomy, biochemistry, and molecular biology. These applications present dynamic, three-dimensional views of structures and their spatial relationships, enabling users to move beyond "real-world" experiences by interacting with or altering virtual objects in ways that would otherwise be difficult or impossible. VR-based procedural and surgical simulations, often compared with flight simulators in aviation, hold significant promise for revolutionizing medical training. Already a wide range of simulations, representing diverse content areas and utilizing a variety of implementation strategies, are either under development or in their early implementation stages. These new systems promise to make broad-based training experiences available for students at all levels, without the risks and ethical concerns typically associated with using animal and human subjects. Medical students could acquire proficiency and gain confidence in the ability to perform a wide variety of techniques long before they need to use them clinically. Surgical residents could rehearse and refine operative procedures, using an unlimited pool of virtual patients manifesting a wide range of anatomic variations, traumatic wounds, and disease states. Those simulated encounters, in combination with existing opportunities to work with real patients, could increase the depth and breadth of learners' exposure to medical problems, ensure uniformity of training experiences, and enhance the

  2. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan


    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
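The three-way partition described above is typically computed per lead time: internal variability as the spread of ensemble members about each model's forced response, model uncertainty as the spread of forced responses across models, and scenario uncertainty as the spread of multi-model means across scenarios. A minimal sketch with synthetic data (the array shapes and numbers are illustrative, not the paper's 164-member ensemble):

```python
import numpy as np

rng = np.random.default_rng(0)
n_scen, n_model, n_memb, n_year = 4, 16, 3, 90  # e.g. 4 RCPs, 16 AOGCMs

# Synthetic sea-level anomalies (m): a model-dependent trend plus noise,
# shape (scenario, model, member, year)
trend = np.linspace(0.0, 1.0, n_year)
data = (trend * (0.3 + 0.1 * rng.standard_normal((n_scen, n_model, 1, 1)))
        + 0.02 * rng.standard_normal((n_scen, n_model, n_memb, n_year)))

# Forced response of each scenario/model pair: average over members
forced = data.mean(axis=2)                    # (scen, model, year)

# Internal variability: member spread about the forced response
internal = (data - forced[:, :, None, :]).var(axis=2).mean(axis=(0, 1))

# Model uncertainty: spread of forced responses across models
model_unc = forced.var(axis=1).mean(axis=0)   # (year,)

# Scenario uncertainty: spread of multi-model means across scenarios
scen_unc = forced.mean(axis=1).var(axis=0)    # (year,)

total = internal + model_unc + scen_unc
for name, v in [("internal", internal), ("model", model_unc),
                ("scenario", scen_unc)]:
    print(name, "fraction at 2100:", round(float(v[-1] / total[-1]), 2))
```

With this construction the model term dominates at long lead times, mirroring the qualitative result reported for GMSL and NYSL.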

  3. Drone Warfare: Twenty-First Century Empire and Communications

    Directory of Open Access Journals (Sweden)

    Kevin Howley


    Full Text Available This paper, part of a larger project that examines drones from a social-construction of technology perspective, considers drone warfare in light of Harold Innis’s seminal work on empire and communication. Leveraging leading-edge aeronautics with advanced optics, data processing, and networked communication, drones represent an archetypal “space-biased” technology. Indeed, by allowing remote operators and others to monitor, select, and strike targets from half a world away, and in real time, these weapon systems epitomize the “pernicious neglect of time” Innis sought to identify and remedy in his later writing. With Innis’s time-space dialectic as a starting point, then, the paper considers drones in light of a longstanding paradox of American culture: the impulse to collapse the geographical distance between the United States and other parts of the globe, while simultaneously magnifying the cultural difference between Americans and other peoples and societies. In the midst of the worldwide proliferation of drones, this quintessentially sublime technology embodies this (dis)connect in important, profound, and ominous ways.

  4. Cyber Attacks and Terrorism: A Twenty-First Century Conundrum. (United States)

    Albahar, Marwan


    In the recent years, an alarming rise in the incidence of cyber attacks has made cyber security a major concern for nations across the globe. Given the current volatile socio-political environment and the massive increase in the incidence of terrorism, it is imperative that government agencies rapidly realize the possibility of cyber space exploitation by terrorist organizations and state players to disrupt the normal way of life. The threat level of cyber terrorism has never been as high as it is today, and this has created a lot of insecurity and fear. This study has focused on different aspects of cyber attacks and explored the reasons behind their increasing popularity among the terrorist organizations and state players. This study proposes an empirical model that can be used to estimate the risk levels associated with different types of cyber attacks and thereby provide a road map to conceptualize and formulate highly effective counter measures and cyber security policies.

  5. Continuing Professional Development in the Twenty-First Century. (United States)

    Sachdeva, Ajit K


    The critical role of continuing professional development (CPD) in supporting delivery of patient care of the highest quality and safety is receiving significant attention in the current era of monumental change. CPD is essential in efforts to ensure effectiveness of new models of health care delivery, improve outcomes and value in health care, address external regulations, and foster patient engagement. The unique features of CPD; the use of special mastery-based teaching, learning, and assessment methods, and other special interventions to promote excellence; and direct involvement of a variety of key stakeholders differentiate CPD from undergraduate medical education and graduate medical education. The needs of procedural specialties relating to CPD are different from those of primary care disciplines and require special attention for the greatest impact. Simulation-based education and training can be very useful in CPD aimed at improving outcomes and promoting patient safety. Preceptoring, proctoring, mentoring, and coaching should be used routinely to address specific needs in CPD. Distinct CPD strategies are necessary for retraining, reentry, and remediation. Participation in CPD programs can be encouraged by leveraging the joy of learning, which should drive physicians and surgeons to strive continually to be the best in their professional work.

  6. Analysis of the projected regional sea-ice changes in the Southern Ocean during the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Lefebvre, W.; Goosse, H. [Universite Catholique de Louvain, Institut d' Astronomie et de Geophysique Georges Lemaitre, Louvain-la-Neuve (Belgium)


    Using the set of simulations performed with atmosphere-ocean general circulation models (AOGCMs) for the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4), the projected regional distribution of sea ice for the twenty-first century has been investigated. Averaged over all those model simulations, the current climate is reasonably well reproduced. However, this averaging procedure hides the errors from individual models. Over the twentieth century, the multimodel average simulates a larger sea-ice concentration decrease around the Antarctic Peninsula compared to other regions, which is in qualitative agreement with observations. This is likely related to the positive trend in the Southern Annular Mode (SAM) index over the twentieth century, in both observations and in the multimodel average. Despite the simulated positive future trend in SAM, such a regional feature around the Antarctic Peninsula is absent in the projected sea-ice change for the end of the twenty-first century. The maximum decrease is indeed located over the central Weddell Sea and the Amundsen-Bellingshausen Seas. In most models, changes in the oceanic currents could play a role in the regional distribution of the sea ice, especially in the Ross Sea, where stronger southward currents could be responsible for a smaller sea-ice decrease during the twenty-first century. Finally, changes in the mixed layer depth can be found in some models, inducing locally strong changes in the sea-ice concentration. (orig.)

  7. A Critical Feminist and Race Critique of Thomas Piketty's "Capital in the Twenty-First Century" (United States)

    Moeller, Kathryn


    Thomas Piketty's "Capital in the Twenty-first Century" documents the foreboding nature of rising wealth inequality in the twenty-first century. In an effort to promote a more just and democratic global society and rein in the unfettered accumulation of wealth by the few, Piketty calls for a global progressive annual tax on corporate…

  8. Guidelines to Design Engineering Education in the Twenty-First Century for Supporting Innovative Product Development (United States)

    Violante, Maria Grazia; Vezzetti, Enrico


    In the twenty-first century, meeting our technological challenges demands educational excellence, a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills…

  9. Twenty-First Century Literacy: A Matter of Scale from Micro to Mega (United States)

    Brown, Abbie; Slagter van Tryon, Patricia J.


    Twenty-first century technologies require educators to look for new ways to teach literacy skills. Current communication methods are combinations of traditional and newer, network-driven forms. This article describes the changes twenty-first century technologies cause in the perception of time, size, distance, audience, and available data, and…

  10. Projected Changes on the Global Surface Wave Drift Climate towards the END of the Twenty-First Century (United States)

    Carrasco, Ana; Semedo, Alvaro; Behrens, Arno; Weisse, Ralf; Breivik, Øyvind; Saetra, Øyvind; Håkon Christensen, Kai


    The global wave-induced current (the Stokes Drift - SD) is an important feature of the ocean surface, with mean values close to 10 cm/s along the extra-tropical storm tracks in both hemispheres. Besides the horizontal displacement of large volumes of water, the SD also plays an important role in the ocean mixed-layer turbulence structure, particularly in stormy or high wind speed areas. The role of the wave-induced currents in the ocean mixed layer and in the sea surface temperature (SST) is currently a hot topic of air-sea interaction research, from forecast to climate ranges. The SD is mostly driven by wind-sea waves and is highly sensitive to changes in the overlying wind speed and direction. The impact of climate change on the global wave-induced current climate will be presented. The wave model WAM has been forced by the global climate model (GCM) ECHAM5 wind speed (at 10 m height) and ice, for present-day and potential future climate conditions towards the end of the twenty-first century, represented by the Intergovernmental Panel on Climate Change (IPCC) CMIP3 (Coupled Model Inter-comparison Project phase 3) A1B greenhouse gas emission scenario (usually referred to as a "medium-high emissions" scenario). Several wave parameters were stored as output in the WAM model simulations, including the wave spectra. The 6-hourly, 0.5°×0.5° wave spectra were used to compute the SD global climate of two 32-yr periods, representative of the end of the twentieth (1959-1990) and twenty-first (2069-2100) centuries. Comparisons of the present climate run with the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-40 reanalysis are used to assess the capability of the WAM-ECHAM5 runs to produce realistic SD results. This study is part of the WCRP-JCOMM COWCLIP (Coordinated Ocean Wave Climate Project) effort.
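For deep water, the surface Stokes drift follows from the 1-D frequency spectrum as u_s(0) = (16π³/g) ∫ f³ E(f) df. A minimal sketch of that moment computation, using an illustrative Pierson-Moskowitz spectrum rather than actual WAM output (the wind speed and frequency discretization are hypothetical):

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def surface_stokes_drift(freq, ef):
    """Deep-water surface Stokes drift speed (m/s) from a 1-D frequency
    spectrum E(f) in m^2/Hz: u_s(0) = (16 pi^3 / g) * integral f^3 E(f) df."""
    y = freq**3 * ef
    moment = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(freq))  # trapezoid rule
    return 16.0 * np.pi**3 / G * moment

# Illustrative Pierson-Moskowitz spectrum for a 10 m/s wind (not WAM data)
u10 = 10.0
fp = 0.13 * G / u10                       # peak frequency (Hz)
f = np.linspace(0.04, 0.5, 200)           # resolved frequency band
E = 8.1e-3 * G**2 / (2 * np.pi)**4 * f**-5 * np.exp(-1.25 * (fp / f)**4)

print(f"u_s(0) ~ {surface_stokes_drift(f, E):.2f} m/s")
```

The f³ weighting explains why the SD is dominated by wind-sea waves: short, high-frequency waves contribute far more to the drift than swell of the same energy.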

  11. Twenty-first nuclear accident dosimetry intercomparison study, August 6-10, 1984

    International Nuclear Information System (INIS)

    Swaja, R.E.; Ragan, G.E.; Sims, C.S.


    The twenty-first in a series of nuclear accident dosimetry (NAD) intercomparison studies was conducted at the Oak Ridge National Laboratory's Dosimetry Applications Research Facility during August 6-10, 1984. The Health Physics Research Reactor operated in the pulse mode was used to simulate three criticality accidents with different radiation fields. Participants from five organizations measured neutron doses between 0.53 and 4.36 Gy and gamma doses between 0.19 and 1.01 Gy at area monitoring stations and on phantoms. About 75% of all neutron dose estimates based on foil activation, hair activation, simulated blood sodium activation, and thermoluminescent methods were within ±25% of reference values. Approximately 86% of all gamma results measured using thermoluminescent (TLD-700 or CaSO4) systems were within ±20% of reference doses, which represents a significant improvement over previous studies. Improvements observed in the ability of intercomparison participants to estimate neutron and gamma doses under criticality accident conditions can be partly attributed to experience in previous NAD studies, which have provided practical tests of dosimetry systems, enabled participants to improve evaluation methods, and standardized dose reporting conventions. 16 refs., 15 tabs.
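The agreement statistics quoted above (e.g., about 75% of neutron estimates within ±25% of reference values) are tolerance fractions of the form |estimate − reference| / reference ≤ tolerance. A minimal sketch with hypothetical doses (the values below are illustrative, not the study's data):

```python
def fraction_within(estimates, references, tolerance):
    """Fraction of dose estimates whose relative deviation from the
    reference dose is within +/- tolerance (e.g. 0.25 for 25%)."""
    hits = sum(abs(e - r) / r <= tolerance
               for e, r in zip(estimates, references))
    return hits / len(estimates)

# Hypothetical neutron doses in Gy, spanning the reported 0.53-4.36 Gy range
ref = [0.53, 1.20, 2.10, 3.00, 4.36]
est = [0.60, 1.10, 2.80, 2.90, 4.00]
print(fraction_within(est, ref, 0.25))  # 4 of 5 estimates within +/-25%
```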

  12. Guidelines to design engineering education in the twenty-first century for supporting innovative product development (United States)

    Violante, Maria Grazia; Vezzetti, Enrico


    In the twenty-first century, meeting our technological challenges demands educational excellence, a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills necessary to succeed in life, career, and citizenship. The purpose of this paper is to understand how twenty-first century knowledge and skills can be appropriately embedded in engineering education aimed at innovative product development using additive manufacturing (AM). The study designs a learning model by which to achieve effective AM education to address the requirements of the twenty-first century and to offer students the opportunity to experiment with STEM (science, technology, engineering, and mathematics) concepts. The study is conducted using the quality function deployment (QFD) methodology.
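In QFD, customer requirements are weighted against technical characteristics through a relationship matrix (commonly on a 9/3/1 strength scale), and the weighted column sums rank which characteristics deserve the most design attention. A minimal sketch with hypothetical requirements, weights, and ratings (none of these values come from the paper):

```python
# Hypothetical QFD sketch for an AM-based engineering curriculum
requirements = ["21st-century skills", "STEM integration", "AM hands-on practice"]
weights = [5, 4, 3]  # customer importance on a 1-5 scale (illustrative)

characteristics = ["project-based modules", "CAD/AM lab hours", "team assessment"]
# relationship[i][j]: strength linking requirement i to characteristic j
# on the conventional 9 (strong) / 3 (moderate) / 1 (weak) scale
relationship = [
    [9, 3, 3],
    [3, 9, 1],
    [1, 9, 3],
]

# Technical importance: importance-weighted column sums, then ranked
scores = [sum(w * row[j] for w, row in zip(weights, relationship))
          for j in range(len(characteristics))]
for name, s in sorted(zip(characteristics, scores), key=lambda t: -t[1]):
    print(name, s)
```

The ranking (here, lab hours first) is what the learning-model design would then act on; a full QFD also records correlations between characteristics and competitive benchmarks.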

  13. Book Review: Africa and Europe in the Twenty-First Century ...

    African Journals Online (AJOL)

    Abstract. Title: Africa and Europe in the Twenty-First Century. Author: Osita C. Eze and Amadu Sesay. Publisher: Nigerian Institute of International Affairs, 2010, xvi + 397pp, Tables, Index. ISBN: 978-002-102-7 ...

  14. China's iGeneration - Cinema and Moving Image Culture for the Twenty-First Century


    Johnson, Matthew D.; Wagner, Keith B.; Yu, Tianqui; Vulpiani, Luke


    Collection of essays on twenty-first century Chinese cinema and moving image culture. This innovative collection of essays on twenty-first century Chinese cinema and moving image culture features contributions from an international community of scholars, critics, and practitioners. Taken together, their perspectives make a compelling case that the past decade has witnessed a radical transformation of conventional notions of cinema. Following China's accession to the WTO in 2001, personal ...

  15. Nuclear power in the twenty-first century - An assessment (Part 1)


    von Hirschhausen, Christian


    Nuclear power was one of the most important discoveries of the twentieth century, and it continues to play an important role in twenty-first century discussions about the future energy mix, climate change, innovation, proliferation, geopolitics, and many other crucial policy topics. This paper addresses some key issues around the emergence of nuclear power in the twentieth century and perspectives going forward in the twenty-first, including questions of economics and competitiveness, the str...

  16. Fusion energy from the Moon for the twenty-first century

    International Nuclear Information System (INIS)

    Kulcinski, G.L.; Cameron, E.N.; Santarius, J.F.; Sviatoslavsky, I.N.; Wittenberg, L.J.; Schmitt, H.H.


    It is shown in this paper that the D-He-3 fusion fuel cycle is not only credible from a physics standpoint, but that its breakeven and ignition characteristics could be developed on roughly the same time schedule as the DT cycle. It is also shown that the extremely low fraction of power in neutrons, the lack of significant radioactivity in the reactants, and the potential for very high conversion efficiencies can result in definite advantages for the D-He-3 cycle with respect to DT fusion and fission reactors in the twenty-first century. More specifically, the D-He-3 cycle can accomplish the following: (1) eliminate the need for deep geologic waste burial facilities, and the wastes can qualify for Class A, near-surface land burial; (2) allow inherently safe reactors to be built that, under the worst conceivable accident, cannot cause a civilian fatality or result in a significant (greater than 100 mrem) exposure to a member of the public; (3) reduce the radiation damage levels to a point where no scheduled replacement of reactor structural components is required, i.e., full reactor lifetimes (approximately 30 FPY) can be credibly claimed; (4) increase the reliability and availability of fusion reactors compared to DT systems because of the greatly reduced radioactivity, the low neutron damage, and the elimination of T breeding; and (5) greatly reduce the capital costs of fusion power plants (compared to DT systems) by as much as 50 percent and present the potential for a significant reduction in the COE. The concepts presented in this paper tie together two of the most ambitious high-technology endeavors of the twentieth century: the development of controlled thermonuclear fusion for civilian power applications and the utilization of outer space for the benefit of mankind on Earth.

  17. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo


    Full Text Available Abstract: This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  18. Theoretical Contexts and Conceptual Frames for the Study of Twenty-First Century Capitalism

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Morgan, Glenn


    This chapter argues that the comparative institutionalist approach requires rethinking in the light of developments in the twenty-first century. The chapter emphasizes the following features of the new environment: first, the rise of the BRIC and the emerging economies; secondly, the changed...

  19. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century? (United States)

    Bintz, Carol


    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  20. 76 FR 21741 - Twenty-First Century Communications and Video Programming Accessibility Act; Announcement of Town... (United States)


    ... equipment distribution program for people who are deaf-blind. In addition, the law will fill accessibility... Programming Accessibility Act; Announcement of Town Hall Meeting AGENCY: Federal Communications Commission... The Twenty-First Century Communications and Video Programming Accessibility Act (the Act or CVAA...

  1. How Do Students Value the Importance of Twenty-First Century Skills? (United States)

    Ahonen, Arto Kalevi; Kinnunen, Päivi


    Frameworks of twenty-first century skills have attained a central role in school development and curriculum changes all over the world. There is a common understanding of the need for meta-skills such as problem solving, reasoning, collaboration, and self-regulation. This article presents results from a Finnish study, in which 718 school pupils…

  2. Speaking American: Comparing Supreme Court and Hollywood Racial Interpretation in the Early Twenty-First Century (United States)

    Hawkins, Paul Henry


    Apprehending that race is social, not biological, this study examines U.S. racial formation in the early twenty-first century. In particular, Hollywood and Supreme Court texts are analyzed as media for gathering, shaping and transmitting racial ideas. Representing Hollywood, the 2004 film "Crash" is analyzed. Representing the Supreme Court, the…

  3. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills (United States)

    Morgan, Clara


    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  4. Humanities: The Unexpected Success Story of the Twenty-First Century (United States)

    Davis, Virginia


    Humanities within universities faced challenges in the latter half of the twentieth century as their value in the modern world was questioned. This paper argues that there is strong potential for the humanities to thrive in the twenty-first century university sector. It outlines some of the managerial implications necessary to ensure that this…

  5. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students (United States)

    Gokcek, Gigi; Howard, Alison


    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  6. Critical Remarks on Piketty's 'Capital in the Twenty-first Century'


    Homburg, Stefan


    This paper discusses the central macroeconomic claims that are made in Thomas Piketty's book 'Capital in the Twenty-first Century'. The paper aims to show that Piketty's contentions are not only logically flawed but also contradicted by his own data.

  7. Thomas Piketty – The Adam Smith of the Twenty-First Century?

    Directory of Open Access Journals (Sweden)

    Jacob Dahl Rendtorff


    Full Text Available Piketty’s book, Capital in the Twenty-First Century (2014), has become a bestseller around the world. Two months after its publication, it had sold more than 200,000 copies, and this success will surely continue for a long time. Piketty has established a new platform on which to discuss political economy.

  8. TPACK Updated to Measure Pre-Service Teachers' Twenty-First Century Skills (United States)

    Valtonen, Teemu; Sointu, Erkko; Kukkonen, Jari; Kontkanen, Sini; Lambert, Matthew C.; Mäkitalo-Siegl, Kati


    Twenty-first century skills have attracted significant attention in recent years. Students of today and the future are expected to have the skills necessary for collaborating, problem solving, creative and innovative thinking, and the ability to take advantage of information and communication technology (ICT) applications. Teachers must be…

  9. 2010 Critical Success Factors for the North Carolina Community College System. Twenty First Annual Report (United States)

    North Carolina Community College System (NJ1), 2010


    First mandated by the North Carolina General Assembly in 1989 (S.L. 1989; C. 752; S. 80), the Critical Success Factors report has evolved into the major accountability document for the North Carolina Community College System. This twenty first annual report on the critical success factors is the result of a process undertaken to streamline and…


    Directory of Open Access Journals (Sweden)

    Akosz Ozan


    Full Text Available Tourism is one of the fastest growing industries in the world. Besides its sustained growth the tourism industry has shown in the first years of the twenty first century that it can deal with political, military and natural disasters. The present paper ac

  11. Leadership for Twenty-First-Century Schools and Student Achievement: Lessons Learned from Three Exemplary Cases (United States)

    Schrum, Lynne; Levin, Barbara B.


    The purpose of this research was to understand ways exemplary award winning secondary school leaders have transformed their schools for twenty-first-century education and student achievement. This article presents three diverse case studies and identifies ways that each school's leader and leadership team reconfigured its culture and expectations,…

  12. Synthesis of Carbon Nanotubes: A Revolution in Material Science for the Twenty-First Century

    International Nuclear Information System (INIS)

    Allaf, Abd. W.


    The aim of this work is to explain the preparation procedures for single-walled carbon nanotubes using the arc discharge technique. The optimum conditions for carbon nanotube synthesis are given. It should be pointed out that these materials may prove to be the materials of the twenty-first century.

  13. Twenty First Century Education: Transformative Education for Sustainability and Responsible Citizenship (United States)

    Bell, David V. J.


    Many ministries of education focus on twenty-first century education but unless they are looking at this topic through a sustainability lens, they will be missing some of its most important elements. The usual emphasis on developing skills for employability in the current global economy begs the question whether the global economy is itself…

  14. The conundrum of religious schools in twenty-first-century Europe

    NARCIS (Netherlands)

    Merry, M.S.


    In this paper Merry examines in detail the continued - and curious - popularity of religious schools in an otherwise ‘secular’ twenty-first century Europe. To do this he considers a number of motivations underwriting the decision to place one's child in a religious school and delineates what are…

  15. Twenty-first Semiannual Report of the Commission to the Congress, January 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.


    The document represents the twenty-first semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period July - December 1956. A special part two of this semiannual report addresses specifically Radiation Safety in Atomic Energy Activities.

  16. Teaching and Learning in the Twenty-First Century: What Is an "Institute of Education" for? (United States)

    Husbands, Chris


    As we begin the twenty-first century, schools and teachers are subject to enormous pressures for change. The revolution in digital technologies, the pressure to develop consistently high-performing schools systems, and the drive between excellence and equity all combine to raise profound questions about the nature of successful teaching and…

  17. Establishing the R&D Agenda for Twenty-First Century Learning (United States)

    Kay, Ken; Honey, Margaret


    Much ink has flowed over the past few years describing the need to incorporate twenty-first century skills into K-12 education. Preparing students to succeed as citizens, thinkers, and workers--the bedrock of any educational system--in this environment means arming them with more than a list of facts and important dates. Infusing twenty-first…

  18. Twenty-first century learning for teachers: helping educators bring new skills into the classroom. (United States)

    Wilson, John I


    The motivation behind every educator's dedication and hard work in the classroom is the knowledge that his or her teaching will result in students' success in life. Educators are committed to implementing twenty-first century skills; they have no question that students need such skills to be equipped for life beyond school. Members of the National Education Association are enthusiastic about the Partnership for 21st Century Skills framework, yet express frustration that many schools do not have adequate resources to make the necessary changes. Teaching these skills poses significant new responsibilities for schools and educators. To make it possible for teachers to build twenty-first century skills into the curriculum, physical and policy infrastructures must exist, professional development and curriculum materials must be offered, and meaningful assessments must be available. With an established understanding of what skills need to be infused into the classroom (problem solving, analysis, and communications) and educators' commitment to the new skill set, this chapter explores how to make such a dramatic reform happen. The author discusses existing strategies that will guide educators in infusing twenty-first century skills into traditional content areas such as math, English, geography, and science. Ultimately, public policy regarding educational standards, professional development, assessments, and physical school structures must exist to enable educators to employ twenty-first century skills, leading to student success in contemporary life. Any concern about the cost of bringing this nation's educational system up to par internationally should be offset by the price that not making twenty-first century skills a priority in the classroom will have on future economic well-being.

  19. Taking Up Space: Museum Exploration in the Twenty-First Century (United States)

    Sutton, Tiffany


    Museums have become a crucible for questions of the role that traditional art and art history should play in contemporary art. Friedrich Nietzsche argued in the nineteenth century that museums can be no more than mausoleums for effete (fine) art. Over the course of the twentieth century, however, curators dispelled such blanket pessimism by…

  20. Makerspace in STEM for Girls: A Physical Space to Develop Twenty-First-Century Skills (United States)

    Sheffield, Rachel; Koul, Rekha; Blackley, Susan; Maynard, Nicoleta


    "Makerspace" has been lauded as a new way forward to create communities, empower students and bring together enthusiasts of all ages and skill levels "to tinker" and create. Makerspace education has been touted as having the potential to empower young people to become agents of change in their communities. This paper examines…

  1. Autonomous Robotic Weapons: US Army Innovation for Ground Combat in the Twenty-First Century (United States)


    1 Introduction Today the robot is an accepted fact, but the principle has not been pushed far enough. In the twenty-first century the…2013, accessed March 29, 2015. Steven Kotler, “Say Hello to Comrade Terminator: Russia’s…of autonomous robotic weapons, black-marketed directed energy weapons, and/or commercially available software, potential adversaries may find

  2. Why American business demands twenty-first century skills: an industry perspective. (United States)

    Bruett, Karen


    Public education is the key to individual and business prosperity. With a vested stake in education, educators, employers, parents, policymakers, and the public should question how this nation's public education system is faring. Knowing that recent international assessments have shown little or no gains in American students' achievement, the author asserts the clear need for change. As both a large American corporate employer and a provider of technology for schools, Dell is concerned with ensuring that youth will thrive in their adult lives. Changing workplace expectations lead to a new list of skills students will need to acquire before completing their schooling. Through technology, Dell supports schools in meeting educational goals, striving to supply students with the necessary skills, referred to as twenty-first century skills. The Partnership for 21st Century Skills, of which Dell is a member, has led an initiative to define what twenty-first century learning should entail. Through extensive research, the partnership has built a framework outlining twenty-first century skills: analytical thinking, communication, collaboration, global awareness, and technological and economic literacy. Dell and the partnership are working state by state to promote the integration of these skills into curricula, professional development for teachers, and classroom environments. The author describes two current initiatives, one in Virginia and the other in Texas, both of which use technology to help student learning. All stakeholders can take part in preparing young people to compete in the global economy. Educators and administrators, legislators, parents, and employers must play their role in helping students be ready for what the workforce and the world have in store for them.

  3. A history of meniscal surgery: from ancient times to the twenty-first century. (United States)

    Di Matteo, B; Moran, C J; Tarabella, V; Viganò, A; Tomba, P; Marcacci, M; Verdonk, R


    The science and surgery of the meniscus have evolved significantly over time. Surgeons and scientists always enjoy looking forward to novel therapies. However, as part of the ongoing effort at optimizing interventions and outcomes, it may also be useful to reflect on important milestones from the past. The aim of the present manuscript was to explore the history of meniscal surgery across the ages, from ancient times to the twenty-first century. Herein, some of the investigations of the pioneers in orthopaedics are described, to underline how their work has influenced the management of the injured meniscus in modern times. Level of evidence V.

  4. Neurogenetics in Child Neurology: Redefining a Discipline in the Twenty-first Century. (United States)

    Kaufmann, Walter E


    Increasing knowledge of the genetic etiology of pediatric neurologic disorders is affecting the practice of the specialty. Here I review the history of pediatric neurologic disorder classification and the role of genetics in the process. I also discuss the concept of clinical neurogenetics, with its role in clinical practice, education, and research. Finally, I propose a flexible model for clinical neurogenetics in child neurology in the twenty-first century. In combination with disorder-specific clinical programs, clinical neurogenetics can become a home for complex clinical issues, a repository of genetic diagnostic advances, an educational resource, and a research engine in child neurology.

  5. Managing the twenty-first century reference department challenges and prospects

    CERN Document Server

    Katz, Linda S


    Learn the skills needed to update and manage a reference department that efficiently meets the needs of clients today and tomorrow! Managing the Twenty-First Century Reference Department: Challenges and Prospects provides librarians with the knowledge and skills they need to manage an effective reference service. Full of useful and practical ideas, this book presents successful methods for recruiting and retaining capable reference department staff and management, training new employees and adapting current services to an evolving field. Expert practitioners address the changing role of the r

  6. Report of the twenty-first session, London, 18-22 February 1991

    International Nuclear Information System (INIS)


    The Joint Group of Experts on the Scientific Aspects of Marine Pollution (GESAMP) held its twenty-first session at the Headquarters of the International Maritime Organization (IMO), London, from 18 to 22 February 1991. Marine pollution is primarily linked to coastal development. The most serious problems are those associated with inadequately controlled coastal development and intensive human settlement of the coastal zone. GESAMP emphasizes the importance of the following problems and issues: State of the marine environment; comprehensive framework for the assessment and regulation of waste disposal in the marine environment; information on preparations for the United Nations Conference on Environment and Development; review of potentially harmful substances: 1. Carcinogenic substances. 2. Mutagenic substances. 3. Teratogenic substances. 4. Organochlorine compounds. 5. Oil, and other hydrocarbons including used lubricating oils, oil spill dispersants and chemicals used in offshore oil exploration and exploitation; environmental impacts of coastal aquaculture; global change and the air/sea exchange of chemicals; future work programme

  7. A Farewell to Innocence? African Youth and Violence in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Charles Ugochukwu Ukeje


    Full Text Available This is a broad examination of the issue of youth violence in twenty-first-century Africa, looking at the context within which a youth culture of violence has evolved and attempting to understand the underlying discourses of hegemony and power that drive it. The article focuses specifically on youth violence as a political response to the dynamics of (dis)empowerment, exclusion, and economic crisis, and uses (post)conflict states like Liberia, Sierra Leone, and Nigeria to explain not just the overall challenge of youth violence but also the nature of responses that it has elicited from established structures of authority. Youth violence is in many ways an expression of youth agency in the context of a social and economic system that provides little opportunity.

  8. Ecological restoration should be redefined for the twenty-first century. (United States)

    Martin, David M


    Forty years ago, ecological restoration was conceptualized through a natural science lens. Today, ecological restoration has evolved into a social and scientific concept. The duality of ecological restoration is acknowledged in guidance documents on the subject but is not apparent in its definition. Current definitions reflect our views about what ecological restoration does but not why we do it. This viewpoint does not give appropriate credit to contributions from social sciences, nor does it provide compelling goals for people with different motivating rationales to engage in or support restoration. In this study, I give a concise history of the conceptualization and definition of ecological restoration, and I propose an alternative definition and corresponding viewpoint on restoration goal-setting to meet twenty-first century scientific and public inquiry.

  9. Global threats from invasive alien species in the twenty-first century and national response capacities (United States)

    Early, Regan; Bradley, Bethany A.; Dukes, Jeffrey S.; Lawler, Joshua J.; Olden, Julian D.; Blumenthal, Dana M.; Gonzalez, Patrick; Grosholz, Edwin D.; Ibañez, Ines; Miller, Luke P.; Sorte, Cascade J. B.; Tatem, Andrew J.


    Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion. PMID:27549569

  10. Niels Bohr and the philosophy of physics twenty-first century perspectives

    CERN Document Server

    Folse, Henry


    Niels Bohr and Philosophy of Physics: Twenty-First Century Perspectives examines the philosophical views, influences and legacy of the Nobel Prize physicist and philosophical spokesman of the quantum revolution, Niels Bohr. The sixteen contributions in this collection by some of the best contemporary philosophers and physicists writing on Bohr's philosophy today all carefully distinguish his subtle and unique interpretation of quantum mechanics from views often imputed to him under the banner of the “Copenhagen Interpretation.” With respect to philosophical influences on Bohr's outlook, the contributors analyse prominent similarities between his viewpoint and Kantian ways of thinking, the views of the Danish philosopher Harald Høffding, and themes characteristic of American pragmatism. In recognizing the importance of Bohr's epistemological naturalism they examine his defence of the indispensability of classical concepts from a variety of different perspectives. This collection shows us that Bohr's int...

  11. Golf science research at the beginning of the twenty-first century. (United States)

    Farrally, M R; Cochran, A J; Crews, D J; Hurdzan, M J; Price, R J; Snow, J T; Thomas, P R


    At the beginning of the twenty-first century, there are 30,000 golf courses and 55 million people who play golf worldwide. In the USA alone, the value of golf club memberships sold in the 1990s was US$3.2 billion. Underpinning this significant human activity is a wide variety of people researching and applying science to sustain and develop the game. The 11 golf science disciplines recognized by the World Scientific Congress of Golf have reported 311 papers at four world congresses since 1990. Additionally, scientific papers have been published in discipline-specific peer-reviewed journals, research has been sponsored by the two governing bodies of golf, the Royal and Ancient Golf Club of St. Andrews and the United States Golf Association, and confidential research is undertaken by commercial companies, especially equipment manufacturers. This paper reviews much of this human endeavour and points the way forward for future research into golf.

  12. Civil Rights Laws as Tools to Advance Health in the Twenty-First Century. (United States)

    McGowan, Angela K; Lee, Mary M; Meneses, Cristina M; Perkins, Jane; Youdelman, Mara


    To improve health in the twenty-first century, to promote both access to and quality of health care services and delivery, and to address significant health disparities, legal and policy approaches, specifically those focused on civil rights, could be used more intentionally and strategically. This review describes how civil rights laws, and their implementation and enforcement, help to encourage health in the United States, and it provides examples for peers around the world. The review uses a broad lens to define health for both classes of individuals and their communities--places where people live, learn, work, and play. Suggestions are offered for improving health and equity broadly, especially within societal groups and marginalized populations. These recommendations include multisectoral approaches that focus on the social determinants of health.

  13. Twenty-first century learning after school: the case of Junior Achievement Worldwide. (United States)

    Box, John M


    Efforts to increase after-school programming indicate the nation's concern about how youth are engaged during out-of-school time. There are clear benefits to extending the learning that goes on during the school day. Research from the U.S. Departments of Education and Justice shows that after-school participants do better in school and have stronger expectations for the future than youth who are not occupied after school. And the need is evident: 14.3 million students return to an empty house after school, yet only 6.5 million children are currently enrolled in after-school programs. If an after-school program were available, parents of 15.3 million would enroll their child. JA Worldwide began in 1919 and has been rooted in the afterschool arena from its origins. Its after-school programs teach students about the free enterprise system through curriculum focusing on business, citizenship, economics, entrepreneurship, ethics and character, financial literacy, and career development. At the same time, JA Worldwide incorporates hands-on learning and engagement with adults as role models, both key elements to a successful after-school program. Now focused on developing curriculum emphasizing skills needed for the twenty-first century, JA adopted the key elements laid out for after-school programs by the Partnership for 21st Century Skills. To ensure that the next generation of students enters the workforce prepared, America's education system must provide the required knowledge, skills, and attitudes. Programs such as JA Worldwide serve as models of how to provide the twenty-first century skills that all students need to succeed.

  14. Twenty-first century learning in states: the case of the Massachusetts educational system. (United States)

    Driscoll, David P


    A current crisis in education is leaving students less prepared to succeed in the working world than any generation before them. Increasingly complex external, nonacademic pressures have an impact on many of today's students, often causing them to drop out of school. Only 76 percent of Massachusetts high school students graduate, and only 29 percent earn a college degree. National figures are worse. Most educational institutions share a common goal to support students in becoming skilled, productive, successful members of society, but the author argues that this goal is not being met. Despite the constant changes in the world, educational practices have remained static. Most public schools are not adapting to meet the shifting needs of students. Universities are not able to prepare the right mix of prospective employees for the demands of the job market; for example, schools are graduating only 10 percent of the needed engineers. Institutions of higher learning cannot keep up with employers' needs in an evolving global market: strong math, science, and writing abilities; critical thinking skills; and the ability to work in teams. The author draws on exemplary efforts at work in his home state of Massachusetts--whose improvements in student achievement outcomes have been some of the best in the nation--to suggest there is promise in twenty-first century learning. Middle school students involved in a NASA-funded project write proposals, work in teams, and engage in peer review. Older students participate in enhanced, hands-on cooperative school-to-work and after-school programs. Schools are starting to offer expanded day learning, increasing the number of hours they are engaged in formal learning. Yet such programs have not reached significant levels of scale. The author calls for a major shift in education to help today's students be successful in the twenty-first century.

  15. Watershed-scale response to climate change through the twenty-first century for selected basins across the United States (United States)

    Hay, Lauren E.; Markstrom, Steven; Ward-Garrison, Christian D.


    The hydrologic response to different climate-change emission scenarios for the twenty-first century was evaluated in 14 basins from different hydroclimatic regions across the United States using the Precipitation-Runoff Modeling System (PRMS), a process-based, distributed-parameter watershed model. This study involves four major steps: 1) setup and calibration of the PRMS model in 14 basins across the United States by local U.S. Geological Survey personnel; 2) statistical downscaling of the World Climate Research Programme’s Coupled Model Intercomparison Project phase 3 climate-change emission scenarios to create PRMS input files that reflect these emission scenarios; 3) running PRMS for the climate-change emission scenarios for the 14 basins; and 4) evaluation of the PRMS output. This paper presents an overview of this project, details of the methodology, results from the 14 basin simulations, and interpretation of these results. A key finding is that the hydrological response of the different geographical regions of the United States to potential climate change may be very different, depending on the dominant physical processes of that particular region. Also considered is the tremendous amount of uncertainty present in the climate emission scenarios and how this uncertainty propagates through the hydrologic simulations. The paper concludes with a discussion of the lessons learned and the potential for future work.

  16. Evolution and modulation of tropical heating from the last glacial maximum through the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Hoyos, Carlos D.; Webster, Peter J. [Georgia Institute of Technology, School of Earth and Atmospheric Sciences, Atlanta, GA (United States)


    Twentieth century observations show that during the last 50 years the sea-surface temperature (SST) of the tropical oceans has increased by ~0.5 °C, and the area of SST > 26.5 °C and > 28 °C (arbitrarily referred to as the oceanic warm pool, OWP) by 15 and 50%, respectively, in association with an increase in greenhouse gas concentrations, with natural variability that is not understood, or a combination of both. Based on CMIP3 projections, the OWP is projected to double during the twenty-first century in a moderate CO2 forcing scenario (IPCC A1B scenario). However, during the observational period the area of positive atmospheric heating (referred to as the dynamic warm pool, DWP) has remained constant. The threshold SST (T_H), which demarks the region of net heating and cooling, has increased from 26.6 °C in the 1950s to 27.1 °C in the last decade, and is projected to increase to ~28.5 °C by 2100. Based on climate model simulations, the area of the DWP is projected to remain constant during the twenty-first century. Analysis of the Paleoclimate Model Intercomparison Project (PMIP I and II) simulations for the Last Glacial Maximum and mid-Holocene periods shows very similar behaviour, with a larger OWP in periods of elevated tropical SST and an almost constant DWP associated with a varying T_H. The constancy of the DWP area, despite shifts in the background SST, is shown to be the result of a near exact matching between increases in the integrated convective heating within the DWP and the integrated radiative cooling outside the DWP as SST changes. Although the area of the DWP remains constant, the total tropical atmospheric heating is a strong function of the SST. For example, the net heating has increased by about 10% from 1950 to 2000 and is projected to increase by a further 20% by 2100. Such changes must be compensated by a more vigorous atmospheric circulation, with growth in convective heating within the warm pool, and an

  17. Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century (United States)

    Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.


    Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of the century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of the rare and extirpated states each progressively increased, rising from a combined 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.

  18. New and newer [The New Physics for the Twenty-First Century]

    Energy Technology Data Exchange (ETDEWEB)

    Clark, C. [Electron and Optical Physics Division, National Institute of Standards and Technology, MD (United States)]


    Stephen Hawking's inaugural lecture as Lucasian Professor of Mathematics at Cambridge University in 1980 caused quite a stir. Its title - 'Is the end in sight for theoretical physics?' - raised the prospect of a unified 'theory of everything'. Hawking suggested that there was a good chance of resolving the remaining inconsistencies between the two big 'theories of something' - quantum mechanics and general relativity - before the turn of the century. My first impression on reading The New Physics for the Twenty-First Century, a collection of essays edited by science journalist Gordon Fraser, is that a theory of everything may still be attainable by the turn of the century. However, there is now 20 times more of everything in the universe than there was in the past century, 95% of which no-one has ever actually seen, or had even heard of until a few years ago - as summarized in articles by Wendy Freedman, Edward Kolb and Ronald Adler. Despite this, Michael Green describes amazing developments in string theory that could tie everything together, if one could just figure out which, if any, of the apparently infinite varieties of string theory applies to our world, and why. (U.K.)

  19. Indication to Open Anatrophic Nephrolithotomy in the Twenty-First Century: A Case Report

    Directory of Open Access Journals (Sweden)

    Alfredo Maria Bove


    Introduction. Advances in endourology have greatly reduced the indications for open surgery in the treatment of staghorn kidney stones. Nevertheless, in our experience open surgery still represents the treatment of choice in rare cases. Case Report. A 71-year-old morbidly obese female patient, complaining of occasional left flank pain and recurrent cystitis for many years, presented with bilateral staghorn kidney stones. Comorbidities were obesity (BMI 36.2), hypertension, type II diabetes, chronic obstructive pulmonary disease (COPD), and hyperlipidemia. Due to these comorbidities, endoscopic and laparoscopic approaches were not indicated. We offered the patient staged open anatrophic nephrolithotomy. Results. Operative time was 180 minutes. Blood loss was 500 cc, requiring one unit of packed red blood cells. Hospital stay was 7 days. Renal function was unaffected based on preoperative and postoperative serum creatinine levels. Stone-free status of the left kidney was confirmed after surgery with a CT scan. Conclusions. Open surgery can represent a valid alternative in the treatment of staghorn kidney stones in very selected cases. A discussion of the current indications in the twenty-first century is presented.

  20. Diverging seasonal extremes for ocean acidification during the twenty-first century (United States)

    Kwiatkowski, Lester; Orr, James C.


    How ocean acidification will affect marine organisms depends on changes in both the long-term mean and the short-term temporal variability of carbonate chemistry [1-8]. Although the decadal-to-centennial response to atmospheric CO2 and climate change is constrained by observations and models [1, 9], little is known about corresponding changes in seasonality [10-12], particularly for pH. Here we assess the latter by analysing nine Earth system models (ESMs) forced with a business-as-usual emissions scenario [13]. During the twenty-first century, the seasonal cycle of surface-ocean pH was attenuated by 16 ± 7% on average, whereas that of the hydrogen ion concentration [H+] was amplified by 81 ± 16%. Simultaneously, the seasonal amplitude of the aragonite saturation state (Ωarag) was attenuated except in the subtropics, where it was amplified. These contrasting changes derive from regionally varying sensitivities of these variables to atmospheric CO2 and climate change, and from diverging trends in the seasonal extremes of the primary controlling variables (temperature, dissolved inorganic carbon and alkalinity). Projected seasonality changes will tend to exacerbate the impacts of increasing [H+] on marine organisms during the summer and ameliorate them during the winter, although the opposite holds in the high latitudes. Similarly, over most of the ocean, impacts from declining Ωarag are likely to be intensified during the summer and dampened during the winter.
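The seemingly paradoxical pairing of an attenuated pH cycle with an amplified [H+] cycle follows directly from the logarithmic relation [H+] = 10^(-pH): lowering the mean pH inflates the absolute [H+] swing even when the pH swing shrinks. The toy numbers below are illustrative only, not model output:

```python
# Hedged numerical illustration: peak-to-peak seasonal range of [H+]
# for a sinusoidal pH cycle, before and after a drop in mean pH.

def h_amplitude(ph_mean, ph_amp):
    """Peak-to-peak seasonal range of [H+] (mol/kg) for a sinusoidal pH cycle."""
    h_max = 10 ** (-(ph_mean - ph_amp))   # [H+] is largest when pH is lowest
    h_min = 10 ** (-(ph_mean + ph_amp))
    return h_max - h_min

# Present-day-like state vs. an end-of-century, high-emissions-like state:
today = h_amplitude(ph_mean=8.10, ph_amp=0.030)
future = h_amplitude(ph_mean=7.75, ph_amp=0.025)  # pH cycle ~16% smaller
```

Even with the pH amplitude reduced by roughly 16%, the [H+] amplitude nearly doubles, matching the sign (and roughly the magnitude) of the contrast the abstract reports.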

  1. Between vanguard and exclusion- young people of the twenty-first century

    Directory of Open Access Journals (Sweden)

    Agnieszka Gil


    This study has been narrowed down to reveal a paradox: the vanguard of culture and civilization, young people of the twenty-first century, is embroiled in a discourse of exclusion from economic, political and cultural life. In secondary school and high school programmes we do not find specific references and studies, based primarily on the needs of students, about the theory of popular culture and cultural education in the area of pop culture. The paradox of excluding mainstream culture from educational discourse is schizophrenic. As the political exclusion of young people of the twenty-first century I regard all the disparaging scientific discourse that overlooks the actual media and communication competence of young people. Prosumers, the cognitariat, digital natives, the C-generation: for the modern "Silicon Valley" economy, their market power is already too great to exclude. In other areas it remains to be considered whether excluding young people from cultural discourse will not deprive our social reality of its future teachers and of the translators of the next civilizational revolution...

  2. Civil engineering at the crossroads in the twenty-first century. (United States)

    Ramírez, Francisco; Seco, Andres


    The twenty-first century presents a major challenge for civil engineering. The magnitude and future importance of some of the problems perceived by society are directly related to the field of the civil engineer, implying an inescapable burden of responsibility for a group whose technical soundness, rational approach and efficiency are highly valued and respected by citizens. However, the substantial changes in society, and in the way it perceives the problems it considers important, call for a thorough review of our structures, both professional and educational, so that our profession, with its undeniable historical prestige, may modernize certain approaches and attitudes in order to continue to be a reliable instrument in the service of society, giving priority from an ethical standpoint to actions in pursuit of "the public good". It possesses important tools to facilitate this work (new technologies, the development of communications, the transmission of scientific thought...); but there is nevertheless a need for deep reflection on the very essence of civil engineering: what we want it to be in the future, and the ability and willingness to take the lead at a time when society needs disinterested messages, technically supported, reasonably presented and dispassionately transmitted.

  3. Challenges and Opportunities for Occupational Epidemiology in the Twenty-first Century. (United States)

    Stayner, L T; Collins, J J; Guo, Y L; Heederik, D; Kogevinas, M; Steenland, K; Wesseling, C; Demers, P A


    There are many opportunities and challenges for conducting occupational epidemiologic studies today. In this paper, we summarize the discussion of a symposium held at the Epidemiology in Occupational Health (EPICOH) conference, Chicago 2014, on challenges for occupational epidemiology in the twenty-first century. The increasing number of publications and attendance at our conferences suggests that worldwide interest in occupational epidemiology has been growing. There are clearly abundant opportunities for new research in occupational epidemiology. Areas ripe for further work include developing improved methods for exposure assessment, statistical analysis, studying migrant workers and other vulnerable populations, the use of biomarkers, and new hazards. Several major challenges are also discussed such as the rapidly changing nature and location of work, lack of funding, and political/legal conflicts. As long as work exists there will be occupational diseases that demand our attention, and a need for epidemiologic studies designed to characterize these risks and to support the development of preventive strategies. Despite the challenges and given the important past contribution in this field, we are optimistic about the importance and continued vitality of the research field of occupational epidemiology.

  4. Developing twenty-first century skills: insights from an intensive interdisciplinary workshop Mosaic of Life

    Directory of Open Access Journals (Sweden)

    Tamara Milosevic


    The Baltic Sea is one of the world's largest semi-enclosed seas; with its very low salinity and quasi-isolation from the big oceans, it cannot decide whether it is a sea or a large lake. This geologically unique environment supports an even more surprising and delicate marine ecosystem, where a complex community of fishes, marine mammals and important microscopic organisms creates a magical mosaic of life. Humans have enjoyed the abundance of life in the Baltic Sea for thousands of years, and major Scandinavian and Baltic cities have oriented themselves towards this geo-ecosystem in order to develop and to seek ecological, economic and cultural inspiration and wealth. The ‘Mosaic of Life’ workshop aimed at going beyond the obvious in examining the meaning of the Baltic Sea by gathering together a selection of young, creative minds from backgrounds ranging from the arts and economics to geology and the life sciences. This intensive workshop was designed as a unique training opportunity to develop essential twenty-first century skills: to introduce and develop creative, critical and interdisciplinary thinking and collaborative teamwork, as well as to foster visual and scientific literacy, using project-based learning and hands-on activities. Our final goal has been to be inspired by the resulting connections, differences and unifying concepts, creating innovative, interdisciplinary projects which would look further than the sea – further than the eye can see and further into the future.

  5. Twenty-first century learning in schools: A case study of New Technology High School in Napa, California. (United States)

    Pearlman, Bob


    The most pertinent question concerning teaching and learning in the twenty-first century is not what knowledge and skills students need--that laundry list was identified over a decade ago--but rather how to foster twenty-first century learning. What curricula, experiences, assessments, environments, and technology best support twenty-first century learning? New Technology High School (NTHS) in Napa, California, is one example of a successful twenty-first century school. In this chapter, the author describes the components of this exemplary high school, illustrating an environment that will cultivate twenty-first century student learning. New Technology High School began by defining eight learning outcomes, aligned with the standards of the Partnership for 21st Century Skills; to graduate, students demonstrate mastery of these outcomes through an online portfolio. To help students achieve the outcomes, NTHS employs project- and problem-based learning. Whereas in traditional classrooms students work alone on short-term assignments that do not lend themselves to deep understanding, the project-based learning approach has students working in teams on long-term, in-depth, rigorous projects. Students' work is supported by the school's workplace-like environment and effective use of technology. Meaningful assessment is essential to project-based learning; students receive continuous feedback, helping them become self-directed learners. In fact, NTHS uses outcome-based grading through which students constantly know how they are performing on the twenty-first century outcomes. Research has shown that NTHS graduates are better prepared for postsecondary education, careers, and citizenship than their peers from other schools. To facilitate twenty-first century learning, all schools need to rethink their approach to teaching and learning. New Technology High School is one way to do so.

  6. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century. (United States)

    Ganusov, Vitaly V


    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to follow logically from a set of assumptions. Only when this tool is applied appropriately, just as a microscope is used to look at small items, can it reveal the importance of specific mechanisms and assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows the inconsistency between the model (defined by a specific set of assumptions) and the data. Following the principle of strong inference for the experimental sciences proposed by Platt (1964), I suggest "strong inference in mathematical modeling" as an effective and robust way of using mathematical modeling to understand the mechanisms driving the dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine the reasons why the rejected models failed to explain the data; and (4) to suggest experiments which would allow one to discriminate between the remaining alternative models. The use of strong inference is likely to provide better robustness of the predictions of mathematical models, and it should be strongly encouraged in publications based on mathematical modeling in the twenty-first century.
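Steps (1) and (2) of the procedure, fitting multiple alternative models to the same data and rejecting the inconsistent ones, can be sketched concretely. The example below is an illustrative sketch, not the author's method: model families, grids, and the crude grid-search/AIC scoring are all invented for demonstration.

```python
# Illustrative sketch of strong inference: fit two alternative models to the
# same synthetic data and score them with AIC (lower is better).
import numpy as np

def fit_and_score(model, x, y, n_params, grid):
    """Crude grid-search fit; returns the best AIC for the model family."""
    best_sse = min(((y - model(x, *p)) ** 2).sum() for p in grid)
    n = len(x)
    return n * np.log(best_sse / n) + 2 * n_params  # AIC up to a constant

# Synthetic "data" generated by an exponential process plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 40)
y = 2.0 * np.exp(0.8 * x) + rng.normal(0, 0.3, x.size)

linear = lambda x, a, b: a + b * x          # alternative model 1
expo = lambda x, a, b: a * np.exp(b * x)    # alternative model 2

grid_lin = [(a, b) for a in np.linspace(-20, 10, 31) for b in np.linspace(0, 12, 25)]
grid_exp = [(a, b) for a in np.linspace(0.5, 3.0, 26) for b in np.linspace(0.1, 2.0, 20)]
aic = {"linear": fit_and_score(linear, x, y, 2, grid_lin),
       "exponential": fit_and_score(expo, x, y, 2, grid_exp)}
```

Here the linear family is rejected because no parameter choice brings it close to the data, which is the model-rejection step of strong inference; step (4) would then design experiments to discriminate among the models that survive.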

  7. Gendering inequality: a note on Piketty's Capital in the twenty-first century. (United States)

    Perrons, Diane


    Thomas Piketty's Capital in the Twenty-First Century is remarkable for moving inequality from the margins to mainstream debate through detailed analysis of longitudinal statistics and, for an economist, for advocating an interdisciplinary perspective and writing in a witty and accessible style. With reference to the post-1970 period, when wage increases are largely responsible for the increase in inequality, Piketty shows how patrimonial capitalists (elite managers) in the top decile and centile of the distribution appropriate a growing share of social wealth as a consequence of their 'power to set their own remuneration' in the context of tolerant social norms, rather than through their productive contributions. Piketty raises, but defers to other disciplines, the question of where these social norms come from. A feminist economics perspective indicates that these questions are central to a more inclusive form of economic analysis, and such an approach would enrich Piketty's analysis in two main ways: first, by paying greater attention to the processes and social norms through which inequalities are produced and justified, and second, by highlighting the ways in which inequality is experienced differently depending not only on class but also on other aspects of identity, including gender. This approach also suggests that it is necessary to supplement the ex-post redistributive policies recommended by Piketty, a global wealth tax and more steeply progressive income tax, with ex-ante measures to stop the rise in wage inequality in the first place, especially by bridging the huge gulf that exists between those who care for people and those who manage money. © London School of Economics and Political Science 2014.

  8. Latvian Security and Defense Policy within the Twenty-First Century Security Environment

    Directory of Open Access Journals (Sweden)

    Rublovskis Raimonds


    The aim of this paper is to analyze the fundamental factors which form and profoundly shape the security and defense policy of the Republic of Latvia. One can argue that the historical background, the geographical location, the common institutional history within the former Soviet Union, the Russia factor, the relative smallness of the state's territory and population, the ethnic composition of the population, the low population density, and the rather limited financial and manpower resources available for defense are the key factors influencing the state's security and defense policy. The core principles of the security and defense policy of Latvia are membership in the powerful global military alliance of NATO and a bilateral strategic partnership with the United States. However, security and defense cooperation among the three Baltic States, as well as enhanced cooperation within the Baltic-Nordic framework, is seen as an important supplementary factor for the increased security of the Republic of Latvia. Latvia has developed a sustainable legal and institutional framework in order to contribute to state security and defense; however, security challenges and significant changes within the global security environment of the twenty-first century will further challenge the ability of the Republic of Latvia to sustain its current legal framework and, more importantly, the current institutional structure of Latvian security and defense architecture. Significant internal and external challenges will impact the fundamental pillars of Latvian security and defense policy, such as the American strategic shift to the Pacific and the lack of political will to increase defense budgets in the European part of NATO.
It has to be clear that the very independence, security and defense of the Republic of Latvia depend on the ability of NATO to remain an effective organization with timely and efficient decision-making, and on the ability of the United States to remain


  10. A Tale within a Tale: Mise en Abyme Adaptations of the Twenty-first Century

    Directory of Open Access Journals (Sweden)

    Željka Flegar


    In accord with the promise made by Henry Jenkins that "old and new media will interact in ever more complex ways" (Convergence Culture 6), this research observes metamodern fairy-tale adaptations of the twenty-first century in light of Cristina Bacchilega's construct of the fairy-tale web and Henry Jenkins' theory of convergence culture and transmedia storytelling. The research addresses the growing trend of embedding "wonder tale" collections within the context of a larger narrative as an artefact of significance, power, and material value. Although they are original tales with known authorship, these fairy-tale adaptations are appended to the mythology and culture of fantastic secondary worlds. Such texts tend to be parodic, subversive, and even carnivalesque (Bakhtin; Stephens), providing a commentary on the culture of their origin, as well as our own. By blending cultures, styles, and formats, mise en abyme wonder tales also result in the empowerment of specifically marginalised groups. Generally defined as spin-offs that are otherwise part of a complex inter- and hypertextual web, these fairy-tale collections constitute a metafictional body of knowledge and wisdom. In the digital era much focus is placed on multimodal, hypertextual, and transmedia narratives, with a significant influence of fandom on the production of such literary works. The study focuses on popular examples of this practice, J.K. Rowling's The Tales of Beedle the Bard (2007/2008) and Ransom Riggs' Tales of the Peculiar (2016), in order to define mise en abyme fairy-tale adaptations and to discuss their cultural significance and function.

  11. Assessing twenty-first century skills through a teacher created video game for high school biology students (United States)

    Annetta, Leonard A.; Cheng, Meng-Tzu; Holmes, Shawn


    As twenty-first century skills become a greater focus in K-12 education, an infusion of technology that meets the needs of today's students is paramount. This study looks at the design and creation of a Multiplayer Educational Gaming Application (MEGA) for high school biology students. The quasi-experimental, qualitative design assessed the twenty-first century skills of digital age literacy, inventive thinking, high productivity, and effective communication techniques of the students exposed to a MEGA. Three factors, as they pertained to these skills, emerged from classroom observations. Interaction with the teacher, discussion with peers, and engagement/time-on-task while playing the MEGA suggested that students playing an educational video game exhibited all of the projected twenty-first century skills while being engrossed in the embedded science content.

  12. Agriculture in West Africa in the Twenty-First Century: Climate Change and Impacts Scenarios, and Potential for Adaptation (United States)

    Sultan, Benjamin; Gaetani, Marco


    West Africa is known to be particularly vulnerable to climate change due to high climate variability, high reliance on rain-fed agriculture, and limited economic and institutional capacity to respond to climate variability and change. In this context, better knowledge of how the climate will change in West Africa, and of how such changes will impact crop productivity, is crucial to inform policies that may counteract the adverse effects. This review paper provides a comprehensive overview of climate change impacts on agriculture in West Africa based on the recent scientific literature. West Africa is currently experiencing rapid climate change, characterized by widespread warming, a recovery of the monsoonal precipitation, and an increase in the occurrence of climate extremes. The observed climate tendencies are also projected to continue in the twenty-first century under moderate and high emission scenarios, although large uncertainties still affect simulations of the future West African climate, especially regarding the summer precipitation. However, despite diverging future projections of the monsoonal rainfall, which is essential for rain-fed agriculture, robust evidence of yield loss in West Africa emerges. This yield loss is mainly driven by increased mean temperature, while potentially wetter or drier conditions, as well as elevated CO2 concentrations, can modulate this effect. The potential for adaptation is illustrated for major crops in West Africa through a selection of studies based on process-based crop models to adjust cropping systems (changes in varieties, sowing dates and density, irrigation, fertilizer management) to the future climate. The results of the cited studies are crop and region specific, and no clear conclusions can be made regarding the most effective adaptation options. Further efforts are needed to improve modeling of the monsoon system and to better quantify the uncertainty in its changes under a warmer climate, in the response of the crops to such

  13. Facilities Inventory and Utilization Study, Fall of 1987. Twenty-First Edition. (United States)

    North Carolina Commission on Higher Education Facilities, Chapel Hill.

    The status of space in North Carolina institutions of higher education at the end of the drop-add period of the 1987 fall term at each college is presented. Indications of the uses being made of the space are given, and norms and historical information are presented for the past 5 years to enable institutions to make their own assessments of their…

  14. A Dialogue Worth Having: Vocational Competence, Career Identity and a Learning Environment for Twenty-First Century Success at Work

    NARCIS (Netherlands)

    Meijers, Frans; Lengelle, Reinekke; Winters, Annemie; Kuijpers, Marinka


    The cultivation of intrinsic motivation is key in the twenty-first century, but most students in Dutch vocational education lack this quality. To foster intrinsic motivation, a strong career-learning environment is needed that enables students to develop career competencies and a career identity.

  15. Twenty-First Century Instructional Classroom Practices and Reading Motivation: Probing the Effectiveness of Interventional Reading Programs (United States)

    Boulhrir, Taoufik


    Twenty-first century education has undoubtedly witnessed changes in the definition of literacy to cope with economic, social, and intellectual trends. Technological advances, which include skills of communication, creativity, critical thinking, and collaboration, have become key in education, especially when dealing with literacy and reading…

  16. Science Teacher Education in the Twenty-First Century: a Pedagogical Framework for Technology-Integrated Social Constructivism (United States)

    Barak, Miri


    Changes in our global world have shifted the skill demands from acquisition of structured knowledge to mastery of skills, often referred to as twenty-first century competencies. Given these changes, a sequential explanatory mixed methods study was undertaken to (a) examine predominant instructional methods and technologies used by teacher educators, (b) identify attributes for learning and teaching in the twenty-first century, and (c) develop a pedagogical framework for promoting meaningful usage of advanced technologies. Quantitative and qualitative data were collected via an online survey, personal interviews, and written reflections with science teacher educators and student teachers. Findings indicated that teacher educators do not provide sufficient models for the promotion of reform-based practice via web 2.0 environments, such as Wikis, blogs, social networks, or other cloud technologies. Findings also indicated four attributes for teaching and learning in the twenty-first century: (a) adapting to frequent changes and uncertain situations, (b) collaborating and communicating in decentralized environments, (c) generating data and managing information, and (d) releasing control by encouraging exploration. Guided by social constructivist paradigms and twenty-first century teaching attributes, this study suggests a pedagogical framework for fostering meaningful usage of advanced technologies in science teacher education courses.

  17. Essential Soft Skills for Success in the Twenty-First Century Workforce as Perceived by Business Educators (United States)

    Mitchell, Geana W.; Skinner, Leane B.; White, Bonnie J.


    Background: Soft skills describe career attributes that individuals should possess, such as team skills, communication skills, ethics, time-management skills, and an appreciation for diversity. In the twenty-first century workforce, soft skills are important in every business sector. However, employers in business continuously report that new…

  18. Predicting climate change impacts on native and invasive tree species using radial growth and twenty-first century climate scenarios

    NARCIS (Netherlands)

    González-Muñoz, N.; Linares, J.C.; Castro-Díez, P.; Sass-Klaassen, U.G.W.


    The climatic conditions predicted for the twenty-first century may aggravate the extent and impacts of plant invasions, by favouring those invaders more adapted to altered conditions or by hampering the native flora. We aim to predict the fate of native and invasive tree species in the oak forests

  19. Rethinking Teaching and Learning Pedagogy for Education in the Twenty-First Century: Blended Learning in Music Education (United States)

    Crawford, Renée


    In an increasingly technologically driven world, there is proliferate discussion among education and government authorities about the necessity to rethink education in the twenty-first century. The evolution of technology and its pervasive influence on the needs and requirements of society is central to this mindset. Innovations in online…

  20. Transformative Pedagogy, Leadership and School Organisation for the Twenty-First-Century Knowledge-Based Economy: The Case of Singapore (United States)

    Dimmock, Clive; Goh, Jonathan W. P.


    Singapore has a high performing school system; its students top international tests in maths and science. Yet while the Singapore government cherishes its world class "brand", it realises that in a globally competitive world, its schools need to prepare students for the twenty-first-century knowledge-based economy (KBE). Accordingly,…

  1. Index to the Twenty-first Semiannual Report of the Commission to the Congress. July 1956 - December 1956

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.


    This volume contains a name and subject index for the twenty-first semiannual report of the United States Atomic Energy Commission to Congress. The full semiannual report covers the major unclassified activities of the Commission from July 1956 through December 1956.

  2. Nonlinear Pedagogy and Its Role in Encouraging Twenty-First Century Competencies through Physical Education: A Singapore Experience (United States)

    Lee, Miriam Chang Yi; Chow, Jia Yi; Button, Chris; Tan, Clara Wee Keat


    Nonlinear Pedagogy is an exploratory approach to teaching and learning Physical Education that can potentially be effective in helping children acquire relevant twenty-first century competencies. Underpinned by Ecological Dynamics, the focus of Nonlinear Pedagogy is on the learner and includes the provision of less prescriptive instructions and…

  3. Thinking Like Twenty-First Century Learners: An Exploration of Blog Use in a Skills-Based Counselor Education Course (United States)

    Buono, Lisa L.


    Twenty-first century learners and millennial generation students have integrated technology into their personal lives; there is a growing expectation for technology to be integrated into their classroom experiences as well. Incorporating technology, including the use of blogs, into teaching and learning is receiving attention in the literature.…

  4. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century" (United States)

    Brandt, Steffen


    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose applying a test design--the duplex design, proposed in 1988 by Bock and Mislevy--to current accountability assessments.…

  5. From School to Cafe and Back Again: Responding to the Learning Demands of the Twenty-First Century (United States)

    McWilliam, Erica


    This paper traces the historical origins of formal and informal lifelong learning to argue that optimal twenty-first-century education can and should draw on the traditions of both the school and the coffee house or cafe. For some time now, educational policy documents and glossy school brochures have come wrapped in the mantle of lifelong…

  6. Remapping simulated halo catalogues in redshift space


    Mead, Alexander; Peacock, John


    We discuss the extension to redshift space of a rescaling algorithm, designed to alter the effective cosmology of a pre-existing simulated particle distribution or catalogue of dark matter haloes. The rescaling approach was initially developed by Angulo & White and was adapted and applied to halo catalogues in real space in our previous work. This algorithm requires no information other than the initial and target cosmological parameters, and it contains no tuned parameters. It is shown here ...
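The matching step at the heart of such a rescaling approach can be illustrated schematically. The sketch below is illustrative only: toy power-law spectra stand in for real linear power spectra, and the grid search follows the general Angulo & White idea of finding a length scaling and matched redshift, not the authors' actual code.

```python
import numpy as np

# Toy dimensionless power spectra Delta^2(k, z) for the "original" and
# "target" cosmologies. These are illustrative power laws, NOT real
# transfer-function-based spectra.
def delta2_orig(k, z):
    growth = 1.0 / (1.0 + z)                      # crude growth factor
    return 2e3 * growth**2 * k**3.96 / (1 + (k / 0.1)**2.5)

def delta2_target(k):
    return 1e3 * 0.64 * k**3.96 / (1 + (k / 0.12)**2.5)

# Grid-search the length rescaling s and the redshift z at which the
# original simulation best reproduces the target spectrum on halo scales.
k = np.logspace(-1, 0.5, 50)                      # wavenumbers (arbitrary units)
best = None
for s in np.linspace(0.7, 1.3, 61):
    for z in np.linspace(0.0, 2.0, 81):
        # mean-square fractional mismatch between rescaled and target spectra
        resid = np.mean((1 - delta2_orig(k / s, z) / delta2_target(k))**2)
        if best is None or resid < best[0]:
            best = (resid, s, z)

_, s_best, z_best = best
print(f"best length scaling s = {s_best:.2f}, matched redshift z = {z_best:.2f}")
```

In the real algorithm the spectra come from the initial and target cosmological parameters, and a redshift-space extension additionally adjusts halo velocities; the sketch only shows the least-squares matching idea.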

  7. The Art of Negotiation: What the Twenty-First Century Business Student Should Know (United States)

    McClendon, Bill; Burke, Debra D.; Willey, Lorrie


    Negotiation skills are vital for concluding international treaties on subjects ranging from arms agreements and rights in outer space to trade agreements. Yet the importance of being able to negotiate effectively is not limited to international treaties or crisis situations. Using negotiation exercises represents a student-centered approach to…

  8. Laboratory simulation of space plasma phenomena* (United States)

    Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.


    Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated in detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to investigations such as plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and diagnostic development. *This work supported by the NRL Base Program.

  9. Virtual Libraries and Education in Virtual Worlds: Twenty-First Century Library Services (United States)

    Bell, Lori; Lindbloom, Mary-Carol; Peters, Tom; Pope, Kitty


    As the use of the Internet and time spent on the Internet by individuals grows, and the use of virtual worlds like Active Worlds and Second Life increases, the library needs to have an interactive place and role in these worlds as well as a bricks and mortar space. This article provides an overview of what some libraries are doing in these worlds,…

  10. Navigation simulator for the Space Tug vehicle (United States)

    Colburn, B. K.; Boland, J. S., III; Peters, E. G.


    A general simulation program (GSP) for state estimation of a nonlinear space vehicle flight navigation system is developed and used as a basis for evaluating the performance of a Space Tug navigation system. An explanation of the iterative guidance mode (IGM) guidance law, derivation of the dynamics, coordinate frames and state estimation routines are given in order to clarify the assumptions and approximations made. A number of simulation and analytical studies are used to demonstrate the operation of the Tug system. Included in the simulation studies are (1) initial offset vector parameter study; (2) propagation time vs accuracy; (3) measurement noise parametric study and (4) reduction in computational burden of an on-board implementable scheme. From the results of these studies, conclusions and recommendations concerning future areas of practical and theoretical work are presented.
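The state-estimation component of such a navigation simulator can be sketched with a minimal linear Kalman filter. This is a generic stand-in under stated assumptions: the actual GSP dynamics, IGM guidance law, and noise models are not reproduced here, only the predict/update structure common to this class of estimators.

```python
import numpy as np

# Minimal linear Kalman filter for a 1-D position/velocity state --
# a generic illustration of state estimation, not the Tug routines.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
H = np.array([[1.0, 0.0]])              # position-only measurements
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance (std 0.5)

x = np.array([0.0, 0.0])                # state estimate [position, velocity]
P = np.eye(2)                           # estimate covariance

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 1.0
for _ in range(50):
    true_pos += true_vel * dt
    z = true_pos + rng.normal(0.0, 0.5)         # noisy position measurement
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(f"estimated velocity: {x[1]:.2f} (true value 1.0)")
```

Parametric studies like those listed in the abstract (initial offset, propagation time vs. accuracy, measurement noise) amount to re-running such a loop while sweeping the initial state, step count, or R.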

  11. Consideration of land-use and land-cover changes in the projection of climate extremes over North America by the end of the twenty-first century (United States)

    Alexandru, Adelina


    Changes in the essential climate extremes indices and surface variables for the end of the twenty-first century are assessed in this study based on two transient climate change simulations, with and without land-use and land-cover changes (LULCC), but identical atmospheric forcing. The two simulations are performed with the 5th generation of the Canadian Regional Climate Model (CRCM5) driven by the Canadian Earth System Model for the period 2006-2100 under the Representative Concentration Pathway 4.5 (RCP4.5) scenario. For the simulation with LULCC, land-cover data sets are taken from the global change assessment model (GCAM) representing the RCP4.5 scenario for the period 2006-2100. LULCC in the RCP4.5 scenario suggest a significant reduction in cultivated land (e.g. the Canadian Prairies and the Mississippi basin) due to afforestation. CRCM5 climate projections imply a general warming by the end of the twenty-first century, especially over the northern regions in winter. CRCM5 projects more warm-spell days per year over most areas of the continent, and implicitly more summer days and tropical nights at the expense of cold-spell, frost and ice days, whose number is projected to decrease by up to 40% by the end of the twenty-first century with respect to the baseline period 1971-2000. Most land areas north of 45°N, in all seasons, as well as the southeastern United States in summer, exhibit increases in mean precipitation under the RCP4.5 scenario. In contrast, central parts of the continent in summer and much of Mexico in all seasons show reduced precipitation. In addition, large areas of North America exhibit changes of 10 to 40% (depending on the season and geographical location) in the number of heavy precipitation days. Results also suggest that the biogeophysical effects of LULCC on climate, assessed through differences between the two simulations, lead to warmer regional climates, especially in winter.
The investigation of processes leading to this response shows high sensitivity of the

  12. Gravity's Ghost and Big Dog: scientific discovery and social analysis in the twenty-first century

    CERN Document Server

    Collins, Harry


    Gravity's Ghost and Big Dog brings to life science's efforts to detect cosmic gravitational waves. These ripples in space-time are predicted by general relativity, and their discovery will not only demonstrate the truth of Einstein's theories but also transform astronomy. Although no gravitational wave has ever been directly detected, the previous five years have been an especially exciting period in the field. Here sociologist Harry Collins offers readers an unprecedented view of gravitational wave research and explains what it means for an analyst to do work of this kind.

  13. A Painter's View of the Cosmos In the Twenty-first Century (United States)

    Cro-Ken, K.


    I am an ecosystem artist who uses paint to bring nature's “invisible forces” into view. My eco-sensitive palette recreates the push-pull forces that shape and mold all things. As a result, I create microscopic and telescopic views of earth and places scattered throughout our universe. Self-similarity led me to realize that if I want my mind to wander into the far reaches of the universe, I must draw closer to nature. I show how space looks and, more importantly, how it moves. My speed element palette is a portal through which I peer into the universe at scales great and small, using paint as my lens. Microscopes, telescopes, the Internet, and even eyeglasses are portals through which technology affords us the ability to see what is unseen to the unaided eye. Rather than see the world and then paint, the opposite is true for me. My work is revelatory, not representational, and, as such, seeks similar occurrences in nature. Just as a planet's surface is a visual record of past events, so too do speed element experiments reveal traces of the past. It would be more accurate to call a painting that comes to rest a “painted.” It is video that captures images that eluded capture by the canvas, and video could more accurately be called a “painting.” Simply put, I manipulate space, time, and matter—and the matter is never just paint.

  14. Proceedings of the twenty-first symposium of atomic energy research on WWER physics and reactor safety

    International Nuclear Information System (INIS)

    Vidovszky, I.


    The present volume contains 61 papers, presented at the twenty-first symposium on atomic energy research, held in Dresden, Germany, 19-23 September 2011. The papers are presented in their original form, i.e., no corrections or modifications were carried out. The content of this volume is divided into thematic groups: Improvement, extension and validation of parameterized few-group libraries for WWER-440 and WWER-1000.

  15. The Return of "Patrimonial Capitalism": A Review of Thomas Piketty's Capital in the Twenty-First Century


    Branko Milanovic


    Capital in the Twenty-First Century by Thomas Piketty provides a unified theory of the functioning of the capitalist economy by linking theories of economic growth and functional and personal income distributions. It argues, based on the long-run historical data series, that the forces of economic divergence (including rising income inequality) tend to dominate in capitalism. It regards the twentieth century as an exception to this rule and proposes policies that would make capitalism sustain...

  16. A Vision for ARES in the Twenty-First Century: The Virtual Community of Real Estate Thought Leaders


    Stephen E. Roulac


    In the twenty-first century the American Real Estate Society (ARES) is a virtual community of real estate thought leaders, electronically interconnected and linked through the International Real Estate Society to counterpart organizations on all major continents as well as numerous country-specific societies. ARES growth is attributable to its emphasis on rigorous applied microeconomic decisionmaking and an inclusive, open style. The initiatives of the Strategic Planning Task Force, whose rep...

  17. An experimental public: heterogeneous groups defining embodiment in the early twenty-first century. (United States)

    Laki, Julia


    In this paper, I take a look at certain forms of contemporary art as practices that allow meanings within biomedical science and medical practice to emerge in novel ways. I claim that conceptual art and biological art are two unique spaces within which the understanding of embodiment and disease comes to be shaped actively and reflexively, sometimes on the very level of the materiality of the body, sometimes through the articulation and representation of medical images and technologies. I link these developments to Paul Rabinow's notion of biosociality and argue that the molecularization and geneticization of the medical gaze, conjoined with certain social and cultural shifts, results in the formation of an experimental public of artists, scientists and lay people, all invested in actively shaping the conceptualization of bodies and diseases. This will take me to a consideration of the intertwining of art and medicine beyond the domain of the visual.

  18. Impact of climate change on mid-twenty-first century growing seasons in Africa

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kerry H.; Vizy, Edward K. [The University of Texas at Austin, Department of Geological Sciences, Jackson School of Geosciences, Austin, TX (United States)


    Changes in growing seasons for 2041-2060 across Africa are projected using a regional climate model at 90-km resolution, and confidence in the predictions is evaluated. The response is highly regional over West Africa, with decreases in growing season days up to 20% in the western Guinean coast and some regions to the east experiencing 5-10% increases. A longer growing season up to 30% in the central and eastern Sahel is predicted, with shorter seasons in parts of the western Sahel. In East Africa, the short rains (boreal fall) growing season is extended as the Indian Ocean warms, but anomalous mid-tropospheric moisture divergence and a northward shift of Sahel rainfall severely curtails the long rains (boreal spring) season. Enhanced rainfall in January and February increases the growing season in the Congo basin by 5-15% in association with enhanced southwesterly moisture transport from the tropical Atlantic. In Angola and the southern Congo basin, 40-80% reductions in austral spring growing season days are associated with reduced precipitation and increased evapotranspiration. Large simulated reductions in growing season over southeastern Africa are judged to be inaccurate because they occur due to a reduction in rainfall in winter which is over-produced in the model. Only small decreases in the actual growing season are simulated when evapotranspiration increases in the warmer climate. The continent-wide changes in growing season are primarily the result of increased evapotranspiration over the warmed land, changes in the intensity and seasonal cycle of the thermal low, and warming of the Indian Ocean. (orig.)

  19. Lunar-based optical telescopes: Planning astronomical tools of the twenty-first century (United States)

    Hilchey, J. D.; Nein, M. E.


    A succession of optical telescopes, ranging in aperture from 1 to 16 m or more, can be deployed and operated on the lunar surface over the next half-century. These candidates to succeed NASA's Great Observatories would capitalize on the unique observational advantages offered by the Moon. The Lunar Telescope Working Group and the LUTE Task Team of the George C. Marshall Space Flight Center (MSFC) have assessed the feasibility of developing and deploying these facilities. Studies include the 16-m Large Lunar Telescope (LLT); the Lunar Cluster Telescope Experiment (LCTE), a 4-m precursor to the LLT; the 2-m Lunar Transit Telescope (LTT); and its precursor, the 1-m Lunar Ultraviolet Telescope Experiment (LUTE). The feasibility of developing and deploying each telescope was assessed and system requirements and options for supporting technologies, subsystems, transportation, and operations were detailed. Influences of lunar environment factors and site selection on telescope design and operation were evaluated, and design approaches and key tradeoffs were established. This paper provides an overview of the study results. Design concepts and brief system descriptions are provided, including subsystem and mission options selected for the concepts.

  20. The renaissance of word-of-mouth marketing: A new standard in twenty-first century marketing management?!


    Meiners, Norbert H.; Schwarting, Ulf; Seeberger, Bernd


    In this paper the importance of word of mouth for marketing management in the twenty-first century will be discussed. After a short introduction, there will be a focus on the demarcations and problems of traditional marketing. Then, in the third section, word of mouth (WOM) and word-of-mouth marketing (WOMM) as a 'new' standard in modern marketing are described. The fourth section broaches the importance of word of mouth and word-of-mouth marketing from the point of view of business and consu...

  1. The restructuring of the Argentine Navy between the end of the twentieth century and the early twenty-first.

    Directory of Open Access Journals (Sweden)

    Germán Soprano


    The definition of a policy of national defense and internal security under democracy created the conditions for advancing the restructuring of the Argentine Navy, introducing changes in its organization and functions. In this article we analyze this process, focusing, on the one hand, on the relationship between defense policy definitions and the configuration of the naval military instrument between the end of the twentieth century and the early twenty-first century; and, on the other, on its development in two components of the force: the marine corps and the maritime patrol division.

  2. Public Health in Colonial and Post-Colonial Ghana: Lesson-Drawing for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Adu-Gyamfi, Samuel


    Public health in twenty-first century Ghana is mired in issues ranging from inadequate public health facilities, improper settlement planning, and insanitary conditions to inadequate laws and their weak implementation. This stands in direct contrast to the colonial era, when the colonial administration made the prevention of diseases a priority, beginning with the establishment of the health branch in 1909 in response to the bubonic plague that was fast spreading in the colony. From there, public health policies and strategies were enacted to support the cause of disease prevention. Various public health boards, the medical research institute (the laboratory branch), the waste management department, the use of preventive medicine, and the maintenance of good settlement planning and sanitation were public health measures of the colonial era. This research analyses the public health system of the colonial era so as to draw basic lessons for twenty-first century Ghana. Archival data and other secondary sources are reviewed and analysed to help draw these lessons, applying Richard Rose's lesson-drawing approach.

  3. A needs assessment for DOE's packaging and transportation activities - a look into the twenty-first century

    International Nuclear Information System (INIS)

    Pope, R.; Turi, G.; Brancato, R.; Blalock, L.; Merrill, O.


    The U.S. Department of Energy (DOE) has performed a department-wide scoping of its packaging and transportation needs and has arrived at a projection of these needs well into the twenty-first century. The assessment, known as the Transportation Needs Assessment (TNA), was initiated during August 1994 and completed in December 1994. The TNA will allow DOE to better prepare for changes in its transportation requirements in the future. The TNA focused on projected, quantified shipping needs based on forecasts of inventories of materials which will ultimately require transport by the DOE for storage, treatment and/or disposal. In addition, experts provided input on the growing needs throughout DOE resulting from changes in regulations, in DOE's mission, and in the sociopolitical structure of the United States. Through the assessment, DOE's transportation needs have been identified for a time period extending from the present through the first three decades of the twenty-first century. The needs assessment was accomplished in three phases: (1) defining current packaging, shipping, resource utilization, and methods of managing packaging and transportation activities; (2) establishing the inventory of materials which DOE will need to transport into the next century, and scenarios projecting when, from where, and to where these materials will need to be transported; and (3) developing requirements and projected changes for DOE to accomplish the necessary transport safely and economically.

  4. Preparing Teacher-Students for Twenty-First-Century Learning Practices (PREP 21): A Framework for Enhancing Collaborative Problem-Solving and Strategic Learning Skills (United States)

    Häkkinen, Päivi; Järvelä, Sanna; Mäkitalo-Siegl, Kati; Ahonen, Arto; Näykki, Piia; Valtonen, Teemu


    With regard to the growing interest in developing teacher education to match the twenty-first-century skills, while many assumptions have been made, there has been less theoretical elaboration and empirical research on this topic. The aim of this article is to present our pedagogical framework for the twenty-first-century learning practices in…

  5. Laboratory simulation of erosion by space plasma

    International Nuclear Information System (INIS)

    Kristoferson, L.; Fredga, K.


    A laboratory experiment has been made in which a plasma stream collides with targets made of different materials of cosmic interest. The experiment can be viewed as a process simulation of the solar wind particle interaction with solid surfaces in space, e.g. cometary dust. Special interest is given to sputtering of OH and Na. It is shown that the erosion of solid particles in interplanetary space is most likely dominated by sputtering at large heliocentric distances and by sublimation near the sun. The heliocentric distance of the limit between the two regimes is determined mainly by the material properties of the eroded surface, e.g. heat of sublimation and sputtering yield, a typical distance being 0.5 AU. It is concluded that the observations of Na in comets at large solar distances, and in some cases also near the sun, are most likely explained by solar wind sputtering. OH emission in space could also be of importance from 'dry', water-free matter by means of molecule sputtering. The observed OH production rates in comets are, however, too large to be explained in this way and are certainly the result of sublimation and dissociation of H2O from an icy nucleus. (Auth.)
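The crossover between the two erosion regimes can be illustrated with a toy calculation: solar-wind sputtering scales with the particle flux as r**-2, while sublimation falls off exponentially as the grain temperature drops with distance. All rate constants below are illustrative assumptions, chosen only so the curves cross near the ~0.5 AU scale quoted in the abstract; they are not the paper's measured values.

```python
import numpy as np

# Toy comparison of the two erosion channels for a grain of "cosmic" material.
r = np.linspace(0.1, 2.0, 2000)            # heliocentric distance [AU]
sputter = 1.0 / r**2                       # relative sputtering rate ~ solar wind flux
T = 280.0 / np.sqrt(r)                     # blackbody-like grain temperature [K]
L_over_k = 5500.0                          # heat of sublimation / k_B [K] (toy value)
sublim = 4e6 * np.exp(-L_over_k / T)       # relative sublimation rate (toy prefactor)

# Distance at which the two rates are equal (compare in log space).
crossover = r[np.argmin(np.abs(np.log(sublim) - np.log(sputter)))]
print(f"sublimation dominates inside ~{crossover:.2f} AU, sputtering outside")
```

The qualitative point survives the toy numbers: because sublimation is exponential in 1/T while sputtering is only a power law in r, the boundary distance is set mainly by the material's heat of sublimation and sputtering yield, as the abstract states.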

  6. Twenty-First Century Instructional Classroom Practices and Reading Motivation: Probing the Effectiveness of Interventional Reading Programs

    Directory of Open Access Journals (Sweden)

    Taoufik Boulhrir


    Twenty-first century education has undoubtedly witnessed changes in the definition of literacy to cope with economic, social, and intellectual trends. Alongside technological advances, skills of communication, creativity, critical thinking, and collaboration have become key in education, especially when dealing with literacy and reading motivation. As motivation hinges on two major theoretical approaches, intrinsic and extrinsic, numerous studies argue that the former is more sustainable in enhancing reading motivation. Accordingly, many research-based interventional programs have emerged since the late nineties, with increasing popularity, to offer answers to the dwindling rates of reading among youth. This article discusses traits of 21st century education in light of trends and challenges as it probes the effectiveness of some interventional programs that are meant, and argued, to enhance literacy skills and reading motivation.

  7. Us, them, and others: reflections on Canadian multiculturalism and national identity at the turn of the twenty-first century. (United States)

    Winter, Elke


    The John Porter Lecture at the annual meeting of the Canadian Sociological Association in Victoria 2013 draws upon my book Us, Them, and Others: Pluralism and National Identity in Diverse Societies. Incorporating the findings from an analysis of Canadian English-language newspaper discourses during the 1990s into a theoretical framework inspired by Weberian sociology, the book argues that pluralism is best understood as a dynamic set of triangular relations where the compromise between unequal groups--"us" and "others"--is rendered meaningful through the confrontation with real or imagined outsiders ("them"). The lecture summarizes the theoretical contribution and explains how multiculturalism became consolidated in dominant Canadian discourses in the late 1990s. The lecture then discusses changes to Canadian multicultural identity at the beginning of the twenty-first century.

  8. The horror of stigma: psychosis and mental health care environments in twenty-first-century horror film (part II). (United States)

    Goodwin, John


    This paper highlights the specific manner in which twenty-first-century horror films stigmatize psychosis and mental health care environments (MHCEs). A search on various film forums using the terms "mental/psychiatric patient," "psychosis/psychoses," and "mental/psychiatric hospital" (limited to 2000-2012) revealed 55 films. A literature review yielded criteria for a checklist; salient recurring criteria observed during the viewings were then added to it, and the films were systematically analyzed under these criteria. Homicidal maniacs are the most common stereotype, misinformation is often communicated, and familiar horror tropes are used to stigmatize MHCEs. Practitioners should be aware of the specific manner in which clients are being stigmatized by the media; this paper identifies these depictions and encourages practitioners to challenge them. © 2013 Wiley Periodicals, Inc.

  9. Towards a Rational Kingdom in Africa: Knowledge, Critical Rationality and Development in a Twenty-First Century African Cultural Context

    Directory of Open Access Journals (Sweden)

    Lawrence Ogbo Ugwuanyi


    This paper seeks to locate the kind of knowledge that is relevant for African development in the twenty-first century African cultural context and to propose a paradigm for achieving such knowledge. To do this, it advances the view that the concept of the twenty-first century in an African context must be situated within the colonial and post-colonial challenges of the African world and applied to serve African demands. Anchored on this position, the paper outlines and critiques the wrong assumptions on which the modern state project was founded in post-colonial Africa, and its development dividend, to suggest that this is the outcome of a flawed knowledge design foundational to the state project, one which the project did not address. It proposes a shift in the knowledge paradigm in Africa and suggests critical self-consciousness as a more desirable knowledge design for Africa. It applies the term ‘rational kingdom’ (defined as a community of reason marked by critical conceptual self-awareness driven by innovation and constructivism) to name this paradigm. ‘Innovation’ is meant as the application of reason with an enlarged capacity to anticipate and address problems with fresh options, and ‘constructivism’ as the disposition to sustain innovation by advancing an alternative but more reliable worldview that can meet the exigencies of modernity in an African cultural context. The paper then outlines the nature of the rational kingdom and its anticipated gains and outcomes. It applies the method of inductive reasoning to advance its position, invoking selected but crucial areas of African life to show how their developmental demands suggest a critical turn in African rationality.

  10. Sub-Saharan Northern African climate at the end of the twenty-first century: forcing factors and climate change processes

    Energy Technology Data Exchange (ETDEWEB)

    Patricola, C.M. [Cornell University, Department of Earth and Atmospheric Sciences, Ithaca, NY (United States); Texas A and M University, Department of Atmospheric Sciences, College Station, TX (United States); Cook, K.H. [The University of Texas at Austin, Department of Geological Sciences, Jackson School of Geosciences, Austin, TX (United States)


    A regional climate model, the Weather Research and Forecasting (WRF) Model, is forced with increased atmospheric CO2 and anomalous SSTs and lateral boundary conditions derived from nine coupled atmosphere-ocean general circulation models to produce an ensemble set of nine future climate simulations for northern Africa at the end of the twenty-first century. A well validated control simulation, agreement among ensemble members, and a physical understanding of the future climate change enhance confidence in the predictions. The regional model ensembles produce consistent precipitation projections over much of northern tropical Africa. A moisture budget analysis is used to identify the circulation changes that support future precipitation anomalies. The projected midsummer drought over the Guinean Coast region is related partly to weakened monsoon flow. Since the rainfall maximum demonstrates a southward bias in the control simulation in July-August, this may be indicative of future summer drying over the Sahel. Wetter conditions in late summer over the Sahel are associated with enhanced moisture transport by the West African westerly jet, a strengthening of the jet itself, and moisture transport from the Mediterranean. Severe drought in East Africa during August and September is accompanied by a weakened Indian monsoon and Somali jet. Simulations with projected and idealized SST forcing suggest that overall SST warming in part supports this regional model ensemble agreement, although changes in SST gradients are important over West Africa in spring and fall. Simulations which isolate the role of individual climate forcings suggest that the spatial distribution of the rainfall predictions is controlled by the anomalous SST and lateral boundary conditions, while CO2 forcing within the regional model domain plays an important secondary role and generally produces wetter conditions. (orig.)

  11. Future projections of synoptic weather types over the Arabian Peninsula during the twenty-first century using an ensemble of CMIP5 models

    KAUST Repository

    El Kenawy, Ahmed M.; McCabe, Matthew


    An assessment of future change in synoptic conditions over the Arabian Peninsula throughout the twenty-first century was performed using 20 climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. We employed the mean

  12. Observation and simulation of AGW in Space (United States)

    Kunitsyn, Vyacheslav; Kholodov, Alexander; Andreeva, Elena; Nesterov, Ivan; Padokhin, Artem; Vorontsov, Artem


    Examples are presented of satellite observations and imaging of AGWs and related phenomena in space: travelling ionospheric disturbances (TIDs). The structure of AGW perturbations was reconstructed by satellite radio tomography (RT) based on the signals of Global Navigation Satellite Systems (GNSS). The experiments use different GNSS, both low-orbiting (Russian Tsikada and American Transit) and high-orbiting (GPS, GLONASS, Galileo, Beidou). Examples of RT imaging of TIDs and AGWs from anthropogenic sources such as ground explosions, rocket launches, and heating of the ionosphere by high-power radio waves are presented. In the latter case, the corresponding AGWs and TIDs were generated in response to the modulation of the power of the heating wave. Natural AGW-like wave disturbances are frequently observed in the atmosphere and ionosphere in the form of variations in density and electron concentration. These phenomena are caused by the influence of the near-space environment, atmosphere, and surface phenomena including long-period vibrations of the Earth's surface, earthquakes, explosions, temperature heating, seiches, tsunami waves, etc. Examples of experimental RT reconstructions of wave disturbances associated with earthquakes and tsunami waves are presented, and RT images of TIDs caused by variations in corpuscular ionization are demonstrated. The results of numerical modeling of AGW generation by some surface and volume sources are discussed. The milli-Hertz AGWs generated by these sources induce perturbations with a typical scale of a few hundred kilometers at the heights of the middle atmosphere and ionosphere. The numerical modeling is based on the solution of the equations of geophysical hydrodynamics, and the results of the numerical simulations agree with the observations. The authors acknowledge the support of the Russian Foundation for Basic Research (grants 14-05-00855 and 13-05-01122), grant of the President of Russian Federation MK-2670

  13. The Polar WRF Downscaled Historical and Projected Twenty-First Century Climate for the Coast and Foothills of Arctic Alaska

    Directory of Open Access Journals (Sweden)

    Lei Cai


Climate change is most pronounced in the northern high-latitude region, yet climate observations cannot fully capture regional-scale dynamics due to sparse weather station coverage, which limits our ability to make reliable climate-based assessments. A set of simulated data products was therefore developed for the North Slope of Alaska through a dynamical downscaling approach. The polar-optimized Weather Research and Forecasting (Polar WRF) model was forced by three sources: the ERA-Interim reanalysis (for 1979–2014), the Community Earth System Model 1.0 (CESM1.0) historical simulation (for 1950–2005), and the CESM1.0 projected simulations (for 2006–2100) under two Representative Concentration Pathways (RCP4.5 and RCP8.5). Climatic variables were produced at a 10-km grid spacing and a 3-h interval. The ERA-Interim-forced WRF (ERA-WRF) proves the value of dynamical downscaling: it yields more realistic topographically induced precipitation and air temperature, and corrects underestimations in observed precipitation. In summary, dry and cold biases to the north of the Brooks Range are present in ERA-WRF, while the CESM-forced WRF (CESM-WRF) shows wet and warm biases in its historical period. A linear scaling method allowed the biases to be adjusted while keeping most of the variability and extreme values of modeled precipitation and air temperature. CESM-WRF under the RCP4.5 scenario projects smaller increases in precipitation and air temperature than observed in the historical CESM-WRF product, while CESM-WRF under the RCP8.5 scenario shows larger changes. The fine spatial and temporal resolution, long temporal coverage, and multi-scenario projections jointly make the dataset appropriate for addressing a myriad of physical and biological changes occurring on the North Slope of Alaska.
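The "linear scaling" bias adjustment mentioned above has a simple canonical form: an additive offset for temperature and a multiplicative factor for precipitation, computed over the historical period and applied to the target series. A minimal sketch of the generic method follows (my own illustration, not the authors' code, which would typically operate per month and per grid cell):

```python
import numpy as np

def linear_scaling(model_hist, obs, model_target, kind="add"):
    """Classic linear-scaling bias correction.

    kind="add":  additive offset (commonly used for air temperature)
    kind="mult": multiplicative factor (commonly used for precipitation)

    The correction shifts only the climatological mean toward the
    observations, preserving the model's variability and extremes.
    """
    if kind == "add":
        return model_target + (obs.mean() - model_hist.mean())
    if kind == "mult":
        return model_target * (obs.mean() / model_hist.mean())
    raise ValueError("kind must be 'add' or 'mult'")

# Toy example: a temperature series with a 2-degree cold bias.
obs = np.array([1.0, 2.0, 3.0])
hist = np.array([-1.0, 0.0, 1.0])
corrected = linear_scaling(hist, obs, hist, kind="add")
```

After correction the modeled mean matches the observed mean, while the standard deviation (and hence the extremes relative to the mean) is untouched, which is exactly the trade-off the abstract describes.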

  14. Deep Space Navigation and Timing Architecture and Simulation, Phase I (United States)

    National Aeronautics and Space Administration — Microcosm will develop a deep space navigation and timing architecture and associated simulation, incorporating state-of-the art radiometric, x-ray pulsar, and laser...

  15. Building Interdisciplinary Leadership Skills among Health Practitioners in the Twenty-First Century: An Innovative Training Model. (United States)

    Negandhi, Preeti; Negandhi, Himanshu; Tiwari, Ritika; Sharma, Kavya; Zodpey, Sanjay P; Quazi, Zahiruddin; Gaidhane, Abhay; Jayalakshmi N; Gijare, Meenakshi; Yeravdekar, Rajiv


Transformational learning is the focus of twenty-first century global educational reforms. In India, there is a need to amalgamate the skills and knowledge of medical, nursing, and public health practitioners and to develop robust leadership competencies among them. This initiative proposed to identify interdisciplinary leadership competencies among Indian health practitioners and to develop a training program for interdisciplinary leadership skills through an Innovation Collaborative. Medical, nursing, and public health institutions partnered in this endeavor. An exhaustive literature search was undertaken to identify leadership competencies in these three professions. Published evidence was used to establish the need for interdisciplinary training of health practitioners, including current scenarios in interprofessional health education and the key competencies required. The interdisciplinary leadership competencies identified were self-awareness, vision, self-regulation, motivation, decisiveness, integrity, interpersonal communication skills, strategic planning, team building, innovation, and being an effective change agent. Subsequently, a training program was developed, and three training sessions were piloted with 66 participants. Each cohort comprised a mix of participants from different disciplines. The pilot training guided the development of a training model for building interdisciplinary leadership skills and organizing interdisciplinary leadership workshops. The need for interdisciplinary leadership competencies is recognized. The long-term objective of the training model is integration into the regular medical, nursing, and public health curricula, with the aim of developing interdisciplinary leadership skills among these professionals. Although challenging, formal incorporation of leadership skills into health professional education is possible within the interdisciplinary classroom setting using principles of transformative learning.

  16. Shelter and indoor air in the twenty-first century--radon, smoking, and lung cancer risks

    International Nuclear Information System (INIS)

    Fabrikant, J.I.


    Recognition that radon and its daughter products may accumulate to high levels in homes and in the workplace has led to concern about the potential lung cancer risk resulting from indoor domestic exposure. While such risks can be estimated with current dosimetric and epidemiological models for excess relative risks, it must be recognized that these models are based on data from occupational exposure and from underground miners' mortality experience. Several assumptions are required to apply risk estimates from an occupational setting to the indoor domestic environment. Analyses of the relevant data do not lead to a conclusive description of the interaction between radon daughters and cigarette smoking for the induction of lung cancer. The evidence compels the conclusion that indoor radon daughter exposure in homes represents a potential life-threatening public health hazard, particularly in males, and in cigarette smokers. Resolution of complex societal interactions will require public policy decisions involving the governmental, scientific, financial, and industrial sectors. These decisions impact the home, the workplace, and the marketplace, and they extend beyond the constraints of science. Risk identification, assessment, and management require scientific and engineering approaches to guide policy decisions to protect the public health. Mitigation and control procedures are only beginning to receive attention. Full acceptance for protection against what could prove to be a significant public health hazard in the twenty-first century will certainly involve policy decisions, not by scientists, but rather by men and women of government and law

  17. The era of the wandering mind? Twenty-first century research on self-generated mental activity

    Directory of Open Access Journals (Sweden)

Felicity Callard


The first decade of the twenty-first century was characterized by renewed scientific interest in self-generated mental activity (activity largely generated by the individual, rather than in response to experimenters’ instructions or specific external sensory inputs). To understand this renewal of interest, we interrogated the peer-reviewed literature from 2003–2012 (i) to explore recent changes in the use of terms for self-generated mental activity; (ii) to investigate changes in the topics on which mind wandering research, specifically, focuses; and (iii) to visualize co-citation communities amongst researchers working on self-generated mental activity. Our analyses demonstrated that there has been a dramatic increase in use of the term ‘mind wandering’, and a significant crossing-over of psychological investigations of mind wandering, specifically, into cognitive neuroscience. If this is, indeed, the ‘era of the wandering mind’, our paper calls for more explicit reflection by mind wandering researchers on the terms they use, the topics and brain regions they focus on, and the research literatures that they implicitly foreground or ignore as not relevant.

  18. Macro Level Simulation Model Of Space Shuttle Processing (United States)


    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  19. Deep Space Storm Shelter Simulation Study (United States)

    Dugan, Kathryn; Phojanamongkolkij, Nipa; Cerro, Jeffrey; Simon, Matthew


    Missions outside of Earth's magnetic field are impeded by the presence of radiation from galactic cosmic rays and solar particle events. To overcome this issue, NASA's Advanced Exploration Systems Radiation Works Storm Shelter (RadWorks) has been studying different radiation protective habitats to shield against the onset of solar particle event radiation. These habitats have the capability of protecting occupants by utilizing available materials such as food, water, brine, human waste, trash, and non-consumables to build short-term shelters. Protection comes from building a barrier with the materials that dampens the impact of the radiation on astronauts. The goal of this study is to develop a discrete event simulation, modeling a solar particle event and the building of a protective shelter. The main hallway location within a larger habitat similar to the International Space Station (ISS) is analyzed. The outputs from this model are: 1) the total area covered on the shelter by the different materials, 2) the amount of radiation the crew members receive, and 3) the amount of time for setting up the habitat during specific points in a mission given an event occurs.
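A discrete event simulation of this kind can be sketched as an event queue of material-placement completions, with dose accruing in proportion to the still-unshielded wall fraction. All parameter values below are hypothetical placeholders for illustration, not the study's numbers:

```python
import heapq
import random

# Hypothetical parameters -- illustrative only, not the study's values.
MATERIALS = {"water": 2.0, "food": 1.5, "trash": 1.0}  # m^2 covered per item
TARGET_AREA = 30.0     # m^2 of shelter wall to cover
DOSE_RATE = 0.5        # arbitrary dose units per minute when fully unshielded
HANDLING_TIME = 5.0    # minutes for a crew member to place one item

def simulate_buildup(n_items=20, seed=42):
    """Discrete-event sketch: items are placed one after another by a
    single crew member; dose accrues in proportion to the fraction of
    the wall that is still uncovered at each event."""
    rng = random.Random(seed)
    events = []                           # (completion_time, material)
    t = 0.0
    for _ in range(n_items):
        t += HANDLING_TIME                # serial placement
        heapq.heappush(events, (t, rng.choice(sorted(MATERIALS))))
    covered, dose, last_t = 0.0, 0.0, 0.0
    while events:
        t, mat = heapq.heappop(events)
        shield_frac = min(covered / TARGET_AREA, 1.0)
        dose += DOSE_RATE * (1.0 - shield_frac) * (t - last_t)
        covered += MATERIALS[mat]
        last_t = t
    return covered, dose, last_t

area, dose, total_time = simulate_buildup()
```

The three return values mirror the study's three outputs: total area covered, dose received during buildup, and setup time. A real model would add multiple crew members, per-location geometry, and an event-driven solar particle event onset.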

  20. A Process for Comparing Dynamics of Distributed Space Systems Simulations (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.


The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
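One natural comparison test of the kind described is to propagate an identical initial state in two simulations and track the divergence of the resulting trajectories. A toy sketch using two-body dynamics and a classic RK4 integrator at two different step sizes, standing in for two distributed simulations (an illustration of the idea, not IMSim's actual test suite):

```python
import numpy as np

MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def accel(r):
    """Two-body gravitational acceleration at position r (km)."""
    return -MU * r / np.linalg.norm(r) ** 3

def rk4_step(r, v, dt):
    """One classic 4th-order Runge-Kutta step of the two-body problem."""
    k1r, k1v = v, accel(r)
    k2r, k2v = v + 0.5 * dt * k1v, accel(r + 0.5 * dt * k1r)
    k3r, k3v = v + 0.5 * dt * k2v, accel(r + 0.5 * dt * k2r)
    k4r, k4v = v + dt * k3v, accel(r + dt * k3r)
    r_next = r + dt / 6.0 * (k1r + 2 * k2r + 2 * k3r + k4r)
    v_next = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return r_next, v_next

def propagate(r, v, dt, steps):
    for _ in range(steps):
        r, v = rk4_step(r, v, dt)
    return r, v

# The same circular-LEO initial state fed to two "simulations".
r0 = np.array([6778.0, 0.0, 0.0])                  # km
v0 = np.array([0.0, np.sqrt(MU / 6778.0), 0.0])    # km/s, circular speed
rA, _ = propagate(r0, v0, dt=10.0, steps=360)      # coarse step, 1 hour
rB, _ = propagate(r0, v0, dt=1.0, steps=3600)      # fine step, 1 hour
divergence = np.linalg.norm(rA - rB)               # km after 1 hour
```

A real comparison process would exchange full state vectors between the distributed simulations at agreed epochs and report divergence against fidelity tolerances, alongside checks on conserved quantities such as orbital energy.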

  1. Growth of Global Publishing Output of Health Economics in the Twenty-First Century: A Bibliographic Insight. (United States)

    Jakovljevic, Mihajlo Michael; Pejcic, Ana V


Strong growth of interdisciplinary sciences might find exceptional example in academic health economics. We decided to observe the quantitative output in this science since the beginning of the twenty-first century. Electronic search of the published literature was conducted in four different databases: one medical database-MEDLINE/PubMed, two general databases-Scopus/Elsevier and Web of Science (WoS), and one specialized health economic database-NHS Economic Evaluation Database (EED). The applied combination of key words was carefully chosen to cover the most commonly used terms in titles of publications dealing with conceptual areas of health economics. All bibliographic units were taken into account. Within the time horizon from January 1, 2000 to December 31, 2016, without language or limitations on bibliographic unit types, we identified an output ranging approximately from 60,345 to 88,246 records with the applied search strategy in MEDLINE/PubMed, Scopus/Elsevier, and WoS. In NHS EED, we detected 14,761 records of economic evaluations of health interventions during the period in which the database was maintained and regularly updated. With slightly more than one-third of the identified records, the USA clearly dominates in this field. The United Kingdom takes a strong second place with about 12% of identified records. Consistently, US and UK universities are the most frequent among the top 15 affiliations/organizations of the authors of the identified records. Authors from Harvard University contributed to the largest number of the identified records. There is clear evidence of both the blossoming of health economics publications and the acceleration of that growth. Based on this bibliographic data set, it is difficult to distinguish the actual impact growth of this output, provided dominantly by academia with modest contributions from the pharmaceutical/medical device industry and diverse national government-based agencies. Further insight into the citation track record of

  2. The traditional commons of England and Wales in the twenty-first century: meeting new and old challenges

    Directory of Open Access Journals (Sweden)

    Chris Short


The commons literature makes much of the changes within the traditional land use sectors of developed countries. This largely focuses on the decline of the economic function of commons that threatens their existence, the emergence of multiple use patterns, and the resilience and policy adaptation needed to continue. The situation in England and Wales is used to illustrate that commons are increasingly important to a number of ‘new’ rural functions and that the associated policy developments may hold an important message for progress towards sustainable multifunctional land management more generally. This article reviews and updates what is meant by the term common land within England and Wales, while outlining its current importance and threats. The commons literature is investigated to see if the approach is useful in revealing the current issues associated with the incorporation of new stakeholders and functions within a traditional structure. Recent changes and developments surrounding the Commons Act 2006 are assessed to see if they are likely to assist in sustaining these commons through the twenty-first century. The article argues that any new approach requires long-term planning and a commitment to support local participation among commoners and others who are involved in the governance and management of these areas of land. For these challenges to be met, there needs to be an understanding of the functions and cultural traditions of common land, as well as of the changes in society associated with the decline in traditional agrarian management in developed countries. Such challenges can rarely, if ever, be met through legislation and policy developments alone; they require investment in developing locally based solutions.

  3. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory (United States)

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.


    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  4. Space Science Investigation: NASA ISS Stowage Simulator (United States)

    Crawford, Gary


During this internship the opportunity was granted to work with the Integrated Graphics, Operations and Analysis Laboratory (IGOAL) team. The main assignment was to create 12 achievement patches for the Space Station training simulator called the "NASA ISS Stowage Training Game." This project was built using previously developed IGOAL software. To accomplish this task, Adobe Photoshop and Adobe Illustrator were used to craft the badges and other required elements. Blender, a 3D modeling software package, was used to make the required 3D elements. Blender was a useful tool for making things such as a CTB bag for the "No More Bob" patch, which shows a gentleman kicking a CTB bag into the distance. It was also used to pose characters in the positions that were optimal for their patches, as in the "Station Sanitation" patch, which portrays an astronaut waving on a U.S. module on a truck. Adobe Illustrator was the main piece of software for this task. It was used to craft the badges and upload them when they were completed. The style of the badges was flat, meaning that they shouldn't look three-dimensional in any way, shape, or form. Adobe Photoshop was used when any pictures needed brightening and was where the texture for the CTB bag was made. For the patches to be ready for the game's next major release, they had to undergo critical review, revision, and re-editing to make sure the other artists and the rest of the staff were satisfied with the final products. Many patches were created and revamped to meet the flat style and incorporate suggestions from the IGOAL team. After the three processes were completed, the badges were implemented into the game (reference fig1 for badges). After a month of designing badges, the finished products were placed into the final game build via the programmers. The art was the final piece in showcasing the latest build to the public for testing. Comments from the testers were often exceptional and the feedback on the badges were

  5. Traditional knowledge hiding in plain sight - twenty-first century ethnobotany of the Chácobo in Beni, Bolivia. (United States)

    Paniagua Zambrana, Narel Y; Bussmann, Rainer W; Hart, Robbie E; Moya Huanca, Araceli L; Ortiz Soria, Gere; Ortiz Vaca, Milton; Ortiz Álvarez, David; Soria Morán, Jorge; Soria Morán, María; Chávez, Saúl; Chávez Moreno, Bertha; Chávez Moreno, Gualberto; Roca, Oscar; Siripi, Erlin


The Chácobo are a Panoan-speaking tribe of about 1000 members (300+ adults) in Beni, Bolivia. Originally nomadic, the Chácobo were relocated to their current main location in the 1960s. Researchers have visited the Chácobo since 1911. A first more detailed anthropological report exists from the late 1960s, and ecological-ethnobotanical studies were conducted in the 1980s and 1990s. The presented work represents a complete ethnobotanical inventory of the entire adult Chácobo population, with interviews and plant collection conducted directly by Chácobo counterparts. Based on previous reports and our preliminary studies, we hypothesized that twenty-first century Chácobo plant use centered on income generation, and that traditional plant use related to household utensils, medicine, and traditional crop varieties had almost disappeared. To test this hypothesis, we started the "Chácobo Ethnobotany Project," training 10 indigenous Chácobo participants in ethnobotanical interview and plant collection techniques, in order to more fully document Chácobo knowledge and avoid the influence of foreign interviewers. Our study found 331 useful plant species in 241 genera of 95 plant families, with leaves, roots, and bark being the most commonly used plant parts. The comprehensive documentation that these methods enabled completely nullified our initial hypothesis of knowledge loss. Traditional crop varieties are still widely grown and traditional knowledge is alive. Moreover, it is being actively recuperated in certain domains by the younger generation. Most Chácobo know, and can name, traditional utensils and tools, although only the older generation still has the skills to manufacture them. While many Chácobo still know the names and uses of medicinal species, the younger generation, however, is often unsure how to identify them. In this paper we illustrate the complexity of perspectives on knowledge at different ages, and the persistence of knowledge over almost a century.

  6. Status Report of Simulated Space Radiation Environment Facility

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Phil Hyun; Nho, Young Chang; Jeun, Joon Pyo; Choi, Jae Hak; Lim, Youn Mook; Jung, Chan Hee; Jeon, Young Kyu


The technology for testing and improving the performance of materials that are durable in the space environment is a military-related technology, veiled and tightly regulated in advanced countries such as the US and Russia. This core technology cannot easily be transferred to other countries either. It is therefore the most fundamental and necessary research area for the successful establishment of a space environment system. Since evaluating the effects of space radiation on space materials and components plays an important role in extending satellite lifetimes and decreasing failure rates, it is necessary to establish a simulated space radiation facility and a systematic testing procedure. This report deals with the status of technology for simulating space environment effects, including the effect of space radiation on space materials. This information, such as fundamental knowledge of the space environment and the research status of various countries regarding the simulation of space environment effects on space materials, will be useful for research on the radiation hardness of materials. Furthermore, it will help space material developers derive better material choices, reduce design cycle time, and improve safety.

  7. Status Report of Simulated Space Radiation Environment Facility

    International Nuclear Information System (INIS)

    Kang, Phil Hyun; Nho, Young Chang; Jeun, Joon Pyo; Choi, Jae Hak; Lim, Youn Mook; Jung, Chan Hee; Jeon, Young Kyu


The technology for testing and improving the performance of materials that are durable in the space environment is a military-related technology, veiled and tightly regulated in advanced countries such as the US and Russia. This core technology cannot easily be transferred to other countries either. It is therefore the most fundamental and necessary research area for the successful establishment of a space environment system. Since evaluating the effects of space radiation on space materials and components plays an important role in extending satellite lifetimes and decreasing failure rates, it is necessary to establish a simulated space radiation facility and a systematic testing procedure. This report deals with the status of technology for simulating space environment effects, including the effect of space radiation on space materials. This information, such as fundamental knowledge of the space environment and the research status of various countries regarding the simulation of space environment effects on space materials, will be useful for research on the radiation hardness of materials. Furthermore, it will help space material developers derive better material choices, reduce design cycle time, and improve safety.

  8. Proliferation and Nonproliferation in the Early Twenty-First Century. The Permanent Five Hold the Key to Success

    International Nuclear Information System (INIS)

    Santoro, David


are. Its core finding is that much of the success against proliferation will be determined by the role played by the permanent members of the Security Council, the so-called Permanent Five or P-5 (China, France, Russia, the United Kingdom, and the United States). It is unclear, however, whether the Five will be able and willing to play this role adequately. The developments of the first decade of the twenty-first century have not been comforting for nonproliferation. Proliferation challenges have risen and grown more complex. In response, policy tools have been developed, but their effectiveness has suffered from divisions among the P-5 and between them and the NAM states. Half a century since Ikle's article and a decade since Roberts' review, the major powers have remained at a loss to address the threat of proliferation. Winning is still possible, but it will require more than wishful thinking. In the years ahead, the challenge will be to reconcile policy effectiveness with policy legitimacy, be it to restore compliance altogether or to prevent proliferation, counter it, detect and expose noncompliance, and manage nonproliferation failures. Meeting this challenge places the P-5 at center stage. Much of the success against proliferation will be determined by the role that the Five choose to play. But given current shifts in international power structures (what Joseph Nye calls 'the rise of the rest') the prospects appear uncertain. It is important, therefore, that further research focuses on how the P-5 role can be strengthened to address proliferation, and how this role can be better aligned with today's evolving international trends.

  9. The Challenges of Teaching and Learning about Science in the Twenty-First Century: Exploring the Abilities and Constraints of Adolescent Learners (United States)

    Anderman, Eric M.; Sinatra, Gale M.; Gray, DeLeon L.


    In this article, we critically examine skills that are necessary for the effective learning of science in adolescent populations. We argue that a focus on twenty-first-century skills among adolescents within the context of science instruction must be considered in light of research on cognitive and social development. We first review adolescents'…

  10. Solving the problems we face: the United States Environmental Protection Agency, sustainability, and the challenges of the twenty-first century (United States)

    Addressing the problems of the twenty-first century will require new initiatives that complement traditional regulatory activities. Existing regulations, such as the Clean Air Act and Clean Water Act are important safety nets in the United States for protecting human health and t...

  11. Formatively Assessing Teamwork in Technology-Enabled Twenty-First Century Classrooms: Exploratory Findings of a Teamwork Awareness Programme in Singapore (United States)

    Koh, Elizabeth; Hong, Helen; Tan, Jennifer Pei-Ling


    Teamwork, one of the core competencies for the twenty-first century learner, is a critical skill for work and learning. However, assessing teamwork is complex, in particular, developing a measure of teamwork that is domain-generic and applicable across a wide range of learners. This paper documents one such study that leverages technology to help…

  12. Sea-level rise and its possible impacts given a ‘beyond 4°C world’ in the twenty-first century

    NARCIS (Netherlands)

    Nicholls, R.; Marinova, N.A.; Lowe, J.; Brown, S.; Vellinga, P.


    The range of future climate-induced sea-level rise remains highly uncertain with continued concern that large increases in the twenty-first century cannot be ruled out. The biggest source of uncertainty is the response of the large ice sheets of Greenland and west Antarctica. Based on our analysis,

  13. Bruce's Magnificent Quartet: Inquiry, Community, Technology and Literacy--Implications for Renewing Qualitative Research in the Twenty-First Century (United States)

    Davidson, Judith


    Bruce and Bishop's community informatics work brings forward four critical concepts: inquiry, community, technology, and literacy. These four terms serve as the basis for a discussion of qualitative research in the twenty-first century--what is lacking and what is needed. The author suggests that to resolve the tensions or challenges…

  14. Use of Comics to Enhance Students' Learning for the Development of the Twenty-First Century Competencies in the Mathematics Classroom (United States)

    Toh, Tin Lam; Cheng, Lu Pien; Ho, Siew Yin; Jiang, Heng; Lim, Kam Ming


    This paper discusses the use of comics in teaching mathematics in the secondary mathematics classroom. We explicate how the use of comics in teaching mathematics can prepare students for the twenty-first century competencies. We developed an alternative teaching package using comics for two lower secondary mathematics topics. This alternative…

  15. EASCON '88; Proceedings of the Twenty-first Annual Electronics and Aerospace Conference, Arlington, VA, Nov. 9-11, 1988 (United States)

    The capabilities of present and future space and terrestrial communication systems are examined in reviews and reports. Topics addressed include competition between space and terrestrial technologies, remote sensing, carrier services in public switched telephone networks, surveillance and warning systems, telescience and telerobotics, integrated networks and systems, and military communication systems. Consideration is given to navigation and geolocation services; high-definition TV broadcasting; technical, economic, marketing, and strategic aspects of VSATs; future technology drivers; and SDI technologies.

  16. Development of space simulation / net-laboratory system (United States)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we develop a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.

  17. A Simulation and Modeling Framework for Space Situational Awareness

    International Nuclear Information System (INIS)

    Olivier, S.S.


    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  18. Chinese Woman in New York City: Transcultural Travel and Postsocialist Cosmopolitanism in Twenty-first Century China


    Berg, Daria; Kunze, Rui


    This paper explores transcultural travel as the new space of Chinese women and culture in motion in a globalizing postsocialist China. We adopt Lisa Rofel’s concept of ‘postsocialist cosmopolitanism’ to examine how a new generation of Chinese women writers fashions a new female self in their writings about lived experiences in transnational and transcultural environments. According to Rofel, postsocialist cosmopolitanism combines first, a self-conscious transcendence of locality accomplished ...

  19. Future change of climate in South America in the late twenty-first century: intercomparison of scenarios from three regional climate models (United States)

    Marengo, Jose A.; Ambrizzi, Tercio; Da Rocha, Rosmeri P.; Alves, Lincoln M.; Cuadra, Santiago V.; Valverde, Maria C.; Torres, Roger R.; Santos, Daniel C.; Ferraz, Simone E. T.


    Regional climate change projections for the last half of the twenty-first century have been produced for South America, as part of the CREAS (Cenarios REgionalizados de Clima Futuro da America do Sul) regional project. Three regional climate models RCMs (Eta CCS, RegCM3 and HadRM3P) were nested within the HadAM3P global model. The simulations cover a 30-year period representing present climate (1961-1990) and projections for the IPCC A2 high emission scenario for 2071-2100. The focus was on the changes in the mean circulation and surface variables, in particular, surface air temperature and precipitation. There is a consistent pattern of changes in circulation, rainfall and temperatures as depicted by the three models. The HadRM3P shows intensification and a more southward position of the subtropical Pacific high, while a pattern of intensification/weakening during summer/winter is projected by the Eta CCS/RegCM3. There is a tendency for a weakening of the subtropical westerly jet from the Eta CCS and HadRM3P, consistent with other studies. There are indications that regions such as Northeast Brazil and central-eastern and southern Amazonia may experience rainfall deficiency in the future, while the Northwest coast of Peru-Ecuador and northern Argentina may experience rainfall excesses in a warmer future, and these changes may vary with the seasons. The three models show stronger warming in the A2 scenario in the tropical region, especially in the 5°N-15°S band, both in summer and especially in winter, reaching up to 6-8°C warmer than in the present. In southern South America, the warming in summer varies between 2 and 4°C and in winter between 3 and 5°C in the same region across the three models. These changes are consistent with changes in low level circulation from the models, and they are comparable with changes in rainfall and temperature extremes reported elsewhere. In summary, some aspects of projected future climate change are quite robust across this set of

  20. Simulating cosmic microwave background maps in multiconnected spaces

    International Nuclear Information System (INIS)

    Riazuelo, Alain; Uzan, Jean-Philippe; Lehoucq, Roland; Weeks, Jeffrey


    This paper describes the computation of cosmic microwave background (CMB) anisotropies in a universe with multiconnected spatial sections and focuses on the implementation of the topology in standard CMB computer codes. The key ingredient is the computation of the eigenmodes of the Laplacian with boundary conditions compatible with multiconnected space topology. The correlators of the coefficients of the decomposition of the temperature fluctuation in spherical harmonics are computed and examples are given for spatially flat spaces and one family of spherical spaces, namely, the lens spaces. Under the hypothesis of Gaussian initial conditions, these correlators encode all the topological information of the CMB and suffice to simulate CMB maps.
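
    Under the Gaussian hypothesis described above, simulating a map reduces to drawing harmonic coefficients from their correlation matrix. A minimal Python sketch of that sampling step is given below; the 3x3 covariance stands in for the topology-induced correlators, and all values are illustrative rather than taken from the paper.

```python
import numpy as np

def sample_gaussian_alm(cov, n_samples=1, seed=0):
    """Draw zero-mean Gaussian coefficients with covariance `cov`
    (the correlators that encode the topology)."""
    rng = np.random.default_rng(seed)
    chol = np.linalg.cholesky(cov)              # cov = chol @ chol.T
    z = rng.standard_normal((cov.shape[0], n_samples))
    return chol @ z                             # columns are correlated draws

# Illustrative 3x3 correlator with off-diagonal terms, as a
# multiconnected topology would induce (values are invented).
cov = np.array([[1.0, 0.3, 0.0],
                [0.3, 1.0, 0.2],
                [0.0, 0.2, 1.0]])
samples = sample_gaussian_alm(cov, n_samples=100_000)
est = np.cov(samples)                           # sample covariance approaches cov
```

    In a real code the covariance would be the correlator matrix of the spherical-harmonic coefficients computed from the Laplacian eigenmodes; the sampling step itself is unchanged.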

  1. Next Generation Simulation Framework for Robotic and Human Space Missions (United States)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven


    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  2. Changing ideas in forestry: A comparison of concepts in Swedish and American forestry journals during the early twentieth and twenty-first centuries. (United States)

    Mårald, Erland; Langston, Nancy; Sténs, Anna; Moen, Jon


    By combining digital humanities text-mining tools and a qualitative approach, we examine changing concepts in forestry journals in Sweden and the United States (US) in the early twentieth and early twenty-first centuries. Our first hypothesis is that foresters at the beginning of the twentieth century were more concerned with production and less concerned with ecology than foresters at the beginning of the twenty-first century. Our second hypothesis is that US foresters in the early twentieth century were less concerned with local site conditions than Swedish foresters. We find that early foresters in both countries had broader (and often ecologically focused) concerns than hypothesized. Ecological concerns in the forestry literature have increased, but in the Nordic countries, production concerns have increased as well. In both regions and both time periods, timber management is closely connected to concerns about governance and state power, but the forms that governance takes have changed.

  3. Historical Approach to the Role of Women in the Legislation of Iran: A Case Study on the Twenty-First Parliament

    Directory of Open Access Journals (Sweden)

    Sarah Sheibani


    Full Text Available One hundred and ten years ago, Iranian men and women embraced constitutionalism to achieve justice in Iran. The National Council was the result of the Iranian people's struggle for justice, by both women and men. From the beginning of legislation, men's policies classed women with minors, lunatics, and bankrupts, and barred them from voting. However, the Constitutional Revolution, as a turning point and a national revolution, played a key role in changing attitudes toward women and provided a structural context for their participation. In this paper, using descriptive-analytical as well as quantitative methods, we sought to answer the question of what the position of women was in the twenty-first Parliament. The results of this study suggest that once Iranian women were allowed to participate in politics, they were able to demonstrate their ability, as seen in the twenty-first Parliament, in which women had twenty-two percent participation.

  4. The need for the formation of anthropocosmic pedagogy in the twenty-first century (philosophical-educational, pedagogical and spiritual aspects)

    Directory of Open Access Journals (Sweden)

    Natalia V. Polischuk


    Full Text Available The article offers a definition of the subject of «anthropocosmic pedagogy» in contemporary philosophical and pedagogical discourse. The author presents an interpretation of approaches to the explication of the key terms and concepts of anthropocosmic pedagogy and discloses their philosophical-educational, pedagogical, and spiritual essence. It is argued that, in order to promote the transition of the intelligent matter of the Earth from its planetary state into a cosmic force, it is necessary to ensure the resettlement and reproduction of intelligent matter on the scale of the Solar system, with the prospect of reaching galactic and metagalactic space. For this, cosmic education must be implemented, which means forming a concept of the highly spiritual and moral personality of a future highly advanced space-exoplanet civilization on the basis of an anthropocosmic philosophy of education and pedagogy. The content components of the new anthropocosmic concepts of an information and high-tech civilization are developed within professional, phenomenal-ideological, and synergistic approaches, as well as their synthesis. On the basis of comparative analysis, the main characteristics of the newly introduced terms and concepts of anthropocosmic philosophy of education and pedagogy are presented.

  5. Proceedings of the Fifth Seminar of High Temperature Reactor: The Role and Challenge with HTR Opportunity in the Twenty-first Century

    International Nuclear Information System (INIS)

    As-Natio-Lasman; Zaki-Su'ud; Bambang-Sugiono


    The Seminar on the HTR has been a routine activity held at BATAN since 1994. It is a continuation of the Seminar on Technology and HTR Application held by the Centre for Development of Advanced Reactor System. The theme of the seminar is the Role, Challenge, and Opportunity of the HTR in the Twenty-first Century. The thirteen papers presented at the seminar were collected into these proceedings, the aim of which is to provide information and references on nuclear technology, mainly HTR technology. (DII)

  6. Change and Continuity in Librarianship: Approaching the Twenty-First Century. Proceedings of the 40th Military Librarians Workshop, 20-22 November 1996, Annapolis, Maryland, (United States)


    Proceedings of the 40th Military Librarians Workshop, held 20-22 November 1996 in Annapolis, Maryland, on the theme Change and Continuity in Librarianship: Approaching the Twenty-first Century. Contents include an introduction by Richard Hume Werking and the keynote address by Walt Crawford, "Millennial Librarianship: Maintaining the Mix and Avoiding the Hype," as well as a contribution by Dr. Keith Swigger, Dean of the Graduate School of Library and Information...

  7. Desdemona and a ticket to space; training for space flight in a 3g motion simulator

    NARCIS (Netherlands)

    Wouters, M.


    On October 5, 2013, Marijn Wouters and two other contestants of a nation-wide competition ‘Nederland Innoveert’ underwent a space training exercise. One by one, the trainees were pushed to their limits in the Desdemona motion simulator, an experience that mimicked the Space Expedition Corporation

  8. Monte Carlo simulation of continuous-space crystal growth

    International Nuclear Information System (INIS)

    Dodson, B.W.; Taylor, P.A.


    We describe a method, based on Monte Carlo techniques, of simulating the atomic growth of crystals without the discrete lattice space assumed by conventional Monte Carlo growth simulations. Since no lattice space is assumed, problems involving epitaxial growth, heteroepitaxy, phonon-driven mechanisms, surface reconstruction, and many other phenomena incompatible with the lattice-space approximation can be studied. Also, use of the Monte Carlo method circumvents to some extent the extreme limitations on simulated timescale inherent in crystal-growth techniques which might be proposed using molecular dynamics. The implementation of the new method is illustrated by studying the growth of strained-layer superlattice (SLS) interfaces in two-dimensional Lennard-Jones atomic systems. Despite the extreme simplicity of such systems, the qualitative features of SLS growth seen here are similar to those observed experimentally in real semiconductor systems
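
    The continuous-space idea described above can be illustrated with a minimal Metropolis Monte Carlo sketch for a two-dimensional Lennard-Jones system: atoms move in continuous coordinates rather than on a discrete lattice. This is an illustrative toy, not the authors' code; the displacement size, temperature, and two-atom setup are assumptions.

```python
import numpy as np

def lj_energy(pos, eps=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a 2-D particle configuration."""
    e = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            sr6 = (sigma / r) ** 6
            e += 4.0 * eps * (sr6 * sr6 - sr6)
    return e

def metropolis_step(pos, beta, max_disp=0.1, rng=None):
    """One continuous-space Metropolis move: displace a random atom
    and accept with probability min(1, exp(-beta * dE))."""
    rng = rng or np.random.default_rng()
    i = rng.integers(len(pos))
    trial = pos.copy()
    trial[i] += rng.uniform(-max_disp, max_disp, size=2)
    d_e = lj_energy(trial) - lj_energy(pos)
    if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
        return trial, True
    return pos, False

# Relax a slightly stretched dimer at low temperature; the LJ
# minimum for a pair is at r = 2**(1/6) with energy -eps.
rng = np.random.default_rng(42)
pos = np.array([[0.0, 0.0], [1.3, 0.0]])
for _ in range(500):
    pos, _ = metropolis_step(pos, beta=10.0, rng=rng)
```

    Because no lattice is assumed, the same move set handles strained interfaces or reconstructed surfaces; only the energy function and proposal distribution change.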

  9. Simulation of space charge effects in a synchrotron

    International Nuclear Information System (INIS)

    Machida, Shinji; Ikegami, Masanori


    We have studied space charge effects in a synchrotron with multi-particle tracking in 2-D and 3-D configuration space (4-D and 6-D phase space, respectively). First, we describe the modelling of space charge fields in the simulation and the tracking procedure, and mention several ways of presenting tracking results. Secondly, we demonstrate through simulation that coherent modes of a beam play a major role in beam stability and the intensity limit: the incoherent tune in a resonance condition should be replaced by the coherent tune. Finally, we consider the coherent motion of a beam core as a driving force of halo formation. The mechanism is familiar in linacs, and we apply it to a synchrotron.
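
    The tune depression underlying the resonance-condition argument can be sketched with a crude one-dimensional toy model: a one-turn linear rotation followed by a thin-lens defocusing space-charge kick, with the depressed tune read off the trace of the one-turn matrix. This only illustrates the incoherent tune shift; it is not the authors' 2-D/3-D tracking code, and the kick strength is an arbitrary illustrative value.

```python
import numpy as np

def rotation(mu0):
    """Bare one-turn map: rotation by phase advance mu0 (radians)."""
    return np.array([[np.cos(mu0),  np.sin(mu0)],
                     [-np.sin(mu0), np.cos(mu0)]])

def sc_kick(ksc):
    """Thin-lens space-charge kick x' -> x' + ksc * x (defocusing)."""
    return np.array([[1.0, 0.0],
                     [ksc, 1.0]])

def tune(one_turn):
    """Tune (phase advance / 2*pi) from the one-turn matrix trace:
    cos(mu) = trace(M) / 2 for a stable 2x2 symplectic map."""
    return np.arccos(np.trace(one_turn) / 2.0) / (2.0 * np.pi)

mu0 = 2.0 * np.pi * 0.25          # bare tune 0.25
bare = tune(rotation(mu0))
depressed = tune(sc_kick(0.1) @ rotation(mu0))   # space charge lowers the tune
```

    The tracking codes described in the abstract replace the fixed kick with a field solved self-consistently from the evolving particle distribution, which is what produces the distinct coherent and incoherent tunes.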

  10. Planetary and Space Simulation Facilities PSI at DLR for Astrobiology (United States)

    Rabbow, E.; Rettberg, P.; Panitz, C.; Reitz, G.


    Ground based experiments, conducted in the controlled planetary and space environment simulation facilities PSI at DLR, are used to investigate astrobiological questions and to complement the corresponding experiments in LEO, for example on free flying satellites or on space exposure platforms on the ISS. In-orbit exposure facilities can only accommodate a limited number of experiments for exposure to space parameters like high vacuum, intense radiation of galactic and solar origin and microgravity, sometimes also technically adapted to simulate extraterrestrial planetary conditions like those on Mars. Ground based experiments in carefully equipped and monitored simulation facilities allow the investigation of the effects of simulated single environmental parameters and selected combinations on a much wider variety of samples. In PSI at DLR, international science consortia performed astrobiological investigations and space experiment preparations, exposing organic compounds and a wide range of microorganisms, from bacterial spores to complex microbial communities, lichens and even animals like tardigrades, to simulated planetary or space environment parameters in pursuit of exobiological questions on the resistance to extreme environments and the origin and distribution of life. The Planetary and Space Simulation Facilities PSI of the Institute of Aerospace Medicine at DLR in Köln, Germany, comprise 9 modular facilities of varying sizes. They provide, individually or in selected combinations, high vacuum of controlled residual composition, ionizing radiation from an X-ray tube, polychromatic UV radiation in the range of 170-400 nm, VIS and IR or individual monochromatic UV wavelengths, and temperature regulation from -20°C to +80°C at the sample site. The facilities are presented here together with selected experiments performed within them.

  11. Planetary and Space Simulation Facilities (PSI) at DLR (United States)

    Panitz, Corinna; Rabbow, E.; Rettberg, P.; Kloss, M.; Reitz, G.; Horneck, G.


    The Planetary and Space Simulation facilities at DLR offer the possibility to expose biological and physical samples, individually or integrated into space hardware, to defined and controlled space conditions like ultra high vacuum, low temperature and extraterrestrial UV radiation. An X-ray facility is available for the simulation of the ionizing component. All of the simulation facilities are required for the preparation of space experiments: - for testing of the newly developed space hardware - for investigating the effect of different space parameters on biological systems as a preparation for the flight experiment - for performing the 'Experiment Verification Tests' (EVT) for the specification of the test parameters - and 'Experiment Sequence Tests' (EST) by simulating sample assemblies, exposure to selected space parameters, and sample disassembly. To test the compatibility of the different biological and chemical systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed, among others for the ESA facilities of the ongoing missions EXPOSE-R and EXPOSE-E on board the International Space Station ISS. Several experiment verification tests (EVTs) and an experiment sequence test (EST) have been conducted in the carefully equipped and monitored planetary and space simulation facilities PSI of the Institute of Aerospace Medicine at DLR in Cologne, Germany. These ground based pre-flight studies allowed the investigation of a much wider variety of samples and the selection of the most promising organisms for the flight experiment. EXPOSE-E was attached to the outer balcony of the European Columbus module of the ISS in February 2008 and stayed for 1.5 years in space; EXPOSE-R was attached to the Russian Zvezda module of the ISS in spring 2009, and its mission duration will be approximately 1.5 years. The missions will give new insights into the survivability of terrestrial

  12. Ocean (de)oxygenation from the Last Glacial Maximum to the twenty-first century: insights from Earth System models (United States)

    Bopp, L.; Resplandy, L.; Untersee, A.; Le Mezo, P.; Kageyama, M.


    All Earth System models project a consistent decrease in the oxygen content of oceans for the coming decades because of ocean warming, reduced ventilation and increased stratification. But large uncertainties for these future projections of ocean deoxygenation remain for the subsurface tropical oceans where the major oxygen minimum zones are located. Here, we combine global warming projections, model-based estimates of natural short-term variability, as well as data and model estimates of the Last Glacial Maximum (LGM) ocean oxygenation to gain some insights into the major mechanisms of oxygenation changes across these different time scales. We show that the primary uncertainty on future ocean deoxygenation in the subsurface tropical oceans is in fact controlled by a robust compensation between decreasing oxygen saturation (O2sat) due to warming and decreasing apparent oxygen utilization (AOU) due to increased ventilation of the corresponding water masses. Modelled short-term natural variability in subsurface oxygen levels also reveals a compensation between O2sat and AOU, controlled by the latter. Finally, using a model simulation of the LGM, reproducing data-based reconstructions of past ocean (de)oxygenation, we show that the deoxygenation trend of the subsurface ocean during deglaciation was controlled by a combination of warming-induced decreasing O2sat and increasing AOU driven by a reduced ventilation of tropical subsurface waters. This article is part of the themed issue 'Ocean ventilation and deoxygenation in a warming world'.
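
    The compensation argument rests on the decomposition O2 = O2sat - AOU, so any oxygen change can be attributed term by term. A trivial numerical illustration of the near-cancellation (the numbers are invented, not taken from the models discussed above):

```python
# Subsurface oxygen budget: O2 = O2sat - AOU, so a change
# decomposes as d_o2 = d_o2sat - d_aou.  Illustrative values
# in umol/kg; signs follow the mechanisms in the abstract.
d_o2sat = -8.0   # warming lowers solubility, so saturation drops
d_aou = -6.0     # increased ventilation lowers apparent O2 utilization
d_o2 = d_o2sat - d_aou   # near-compensation: the net change is small
```

    Because the two terms move together, the spread across models in either term can be large while the net deoxygenation stays comparatively constrained, which is the stated source of (and limit on) the projection uncertainty.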

  13. A New Paradigm Is Needed for Medical Education in the Mid-Twenty-First Century and Beyond: Are We Ready?

    Directory of Open Access Journals (Sweden)

    Dan E. Benor


    Full Text Available The twentieth century witnessed profound changes in medical education. All these changes, however, took place within the existing framework, suggested by Flexner a century ago. The present paper suggests that we are approaching a singularity point, where we shall have to change the paradigm and be prepared for an entirely new genre of medical education. This suggestion is based upon analysis of existing and envisaged trends: first, in technology, such as availability of information and sophisticated simulations; second, in medical practice, such as far-reaching interventions in life and death that create an array of new moral dilemmas, as well as a change in patient mix in hospitals and a growing need of team work; third, in the societal attitude toward higher education. The structure of the future medical school is delineated in a rough sketch, and so are the roles of the future medical teacher. It is concluded that we are presently not prepared for the approaching changes, neither from practical nor from attitudinal points of view, and that it is now high time for both awareness of and preparation for these changes.

  14. Ocean (de)oxygenation from the Last Glacial Maximum to the twenty-first century: insights from Earth System models. (United States)

    Bopp, L; Resplandy, L; Untersee, A; Le Mezo, P; Kageyama, M


    All Earth System models project a consistent decrease in the oxygen content of oceans for the coming decades because of ocean warming, reduced ventilation and increased stratification. But large uncertainties for these future projections of ocean deoxygenation remain for the subsurface tropical oceans where the major oxygen minimum zones are located. Here, we combine global warming projections, model-based estimates of natural short-term variability, as well as data and model estimates of the Last Glacial Maximum (LGM) ocean oxygenation to gain some insights into the major mechanisms of oxygenation changes across these different time scales. We show that the primary uncertainty on future ocean deoxygenation in the subsurface tropical oceans is in fact controlled by a robust compensation between decreasing oxygen saturation (O2sat) due to warming and decreasing apparent oxygen utilization (AOU) due to increased ventilation of the corresponding water masses. Modelled short-term natural variability in subsurface oxygen levels also reveals a compensation between O2sat and AOU, controlled by the latter. Finally, using a model simulation of the LGM, reproducing data-based reconstructions of past ocean (de)oxygenation, we show that the deoxygenation trend of the subsurface ocean during deglaciation was controlled by a combination of warming-induced decreasing O2sat and increasing AOU driven by a reduced ventilation of tropical subsurface waters. This article is part of the themed issue 'Ocean ventilation and deoxygenation in a warming world'. © 2017 The Author(s).

  15. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown (United States)

    Li, Zuqun


    Modeling and Simulation plays a very important role in mission design. It not only reduces design cost, but also prepares astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team at NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing attributes, and packing and unpacking fixed record data. The dynamics of the lunar shuttle were modeled with three degrees of freedom, and state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descending orbit in 2D space, and then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would start descending one second after joining the execution. VPN software from SonicWall was used to connect federates with the RTI during testing.
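
    The two-body state propagation used by the lunar shuttle federate can be sketched with a standard RK4 integrator: for a circular orbit, the radius and specific orbital energy should be conserved to integrator accuracy. A minimal Python sketch (Trick/TrickHLA details are omitted; the step size and the 100 km orbit are illustrative assumptions):

```python
import numpy as np

MU_MOON = 4.9028e12  # lunar gravitational parameter, m^3/s^2

def two_body_accel(r):
    """Point-mass gravitational acceleration a = -mu * r / |r|^3."""
    return -MU_MOON * r / np.linalg.norm(r) ** 3

def rk4_step(r, v, dt):
    """One RK4 step of the two-body equations of motion."""
    def deriv(state):
        rr, vv = state[:3], state[3:]
        return np.concatenate([vv, two_body_accel(rr)])
    y = np.concatenate([r, v])
    k1 = deriv(y)
    k2 = deriv(y + 0.5 * dt * k1)
    k3 = deriv(y + 0.5 * dt * k2)
    k4 = deriv(y + dt * k3)
    y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y[:3], y[3:]

# Circular orbit roughly 100 km above the lunar surface.
r0 = np.array([1.8371e6, 0.0, 0.0])                # m
v_circ = np.sqrt(MU_MOON / np.linalg.norm(r0))     # circular speed
r, v = r0.copy(), np.array([0.0, v_circ, 0.0])
e0 = 0.5 * v_circ ** 2 - MU_MOON / np.linalg.norm(r0)
for _ in range(100):
    r, v = rk4_step(r, v, 10.0)                    # 1000 s of flight
e1 = 0.5 * np.dot(v, v) - MU_MOON / np.linalg.norm(r)
```

    A descent design like the one described would add targeted velocity changes on top of this propagator; the conservation checks above are the usual sanity test before doing so.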

  16. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility (United States)

    Moehlmann, D.; Kochan, H.


    The Space Simulator of the German Aerospace Research Establishment at Cologne, formerly used for testing satellites, has been the central unit within the research sub-program 'Comet-Simulation' (KOSI) since 1987. The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct, we have gained experience in sample handling under simulated space conditions. To broaden the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest centers on the thermophysical properties of the Martian surface and energy transport (and related gas transport) through the surface. These laboratory simulation activities can support space missions in the typical forms of pre-mission experiments, support of experiment design and operations during the mission (simulation in parallel), and post-mission experiments for confirmation and interpretation of results. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.


    Directory of Open Access Journals (Sweden)

    Stelian MANOLACHE


    Full Text Available Upon the dawn of postmodernity, in the twenty-first century, we witness the emergence of a new way of thinking and of new forms of culture and life, under the ideology of globalism, whose dominance is given by the practicality and utility related to civilization, and under globality, which is the cultural aspect of globalization, pertaining to the field of culture. The two dimensions of globalization and globality, civilizational and cultural, will (re)question the principal relationship between Christianity and the new postmodern globalizing utopia, requiring us to (re)consider the sense and presence of Christianity within the world, and the appropriate sociological figure of the Church, within the new reality of global and globalized humanity, in the postmodern public space. This paper deals with this ideology of globalism and the cultural manifestation of globality, and with the Orthodox answer to the new challenge of individualism and postmodern globalizing (neo)collectivism.

  18. Saving time in a space-efficient simulation algorithm

    NARCIS (Netherlands)

    Markovski, J.


    We present an efficient algorithm for computing the simulation preorder and equivalence for labeled transition systems. The algorithm improves the time complexity of an existing space-efficient algorithm by employing a variant of the stability condition and exploiting properties of the
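
    For reference, the simulation preorder itself can be computed by a naive greatest-fixpoint iteration: start from all state pairs and delete (s, t) whenever t can no longer match some move of s. The paper's algorithm is far more space- and time-efficient; the sketch below only illustrates the definition, and the tiny transition system is invented for the example.

```python
def simulation_preorder(states, actions, trans):
    """Naive fixpoint computation of the simulation preorder.
    trans[(s, a)] is the set of a-successors of s.
    Returns the set of pairs (s, t) with s simulated by t."""
    sim = {(s, t) for s in states for t in states}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(sim):
            # t must match every move of s: for each s -a-> s2 there
            # must be t -a-> t2 with (s2, t2) still in the relation.
            ok = all(
                any((s2, t2) in sim for t2 in trans.get((t, a), set()))
                for a in actions
                for s2 in trans.get((s, a), set())
            )
            if not ok:
                sim.discard((s, t))
                changed = True
    return sim

# Tiny LTS: p -a-> p1;  q -a-> q1, q -b-> q2.
# q simulates p (it matches the a-move), but p cannot match q's b-move.
states = {"p", "p1", "q", "q1", "q2"}
actions = {"a", "b"}
trans = {("p", "a"): {"p1"}, ("q", "a"): {"q1"}, ("q", "b"): {"q2"}}
sim = simulation_preorder(states, actions, trans)
```

    This naive loop re-checks every surviving pair on each pass; the space-efficient algorithms the abstract refers to avoid exactly this recomputation by maintaining stability information between passes.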

  19. Changes in seasonal and diurnal precipitation types during summer over South Korea in the late twenty-first century (2081-2100) projected by the RegCM4.0 based on four RCP scenarios (United States)

    Oh, Seok-Geun; Suh, Myoung-Seok


    Changes in seasonal and diurnal precipitation types over South Korea during summer in the late twenty-first century (2081-2100) were projected under four RCP scenarios using the Regional Climate Model (RegCM4.0) with a horizontal resolution of 12.5 km. Two boundary conditions, ERA-Interim and HadGEM2-AO, were used to drive the RegCM4.0 (jointly named RG4_ERA and RG4_HG2, respectively). In general, the RegCM4.0 reproduces the spatial distribution of summer precipitation over Northeast Asia for the current climate (1989-2008) reasonably well. The RG4_HG2 shows larger dry biases over South Korea, when compared with observations, than does the RG4_ERA. These strong dry biases result from the underestimation of convective precipitation (CPR) and are particularly noticeable in late afternoons during July and August; they are related to the HadGEM2-AO boundary forcing, which simulates weak southwesterly winds at those times. Interestingly, however, the RG4_HG2 simulates similar increases in the contribution of CPR to total precipitation after mid-July, resulting in comparable performance in the reproduction of heavy precipitation. In the late twenty-first century, a significant increase in CPR and decrease in non-convective precipitation (NCPR) are generally projected over South Korea, particularly under RCP8.5. During June, total precipitation is affected primarily by changes in NCPR under RCP2.6 and RCP6.0. After mid-July, increasing total precipitation is primarily caused by distinct increases in CPR in the late afternoons; this pattern is particularly noticeable under RCP8.5 and is associated with more destabilized atmospheric conditions during July and August. Light and heavy precipitation are projected to decrease and increase, respectively, under RCP8.5.

  20. Twenty-First-Century Kids, Twenty-First-Century Librarians (United States)

    Walter, Virginia A.


    Inspired by a new generation of librarians and children, Walter reconsiders the legacy passed on by the matriarchs of children's services and examines more recent trends and challenges growing out of changes in educational philosophy and information technology. This thoroughly researched book includes the current issues and trends of: (1)…

  1. Extremophiles survival to simulated space conditions: an astrobiology model study. (United States)

    Mastascusa, V; Romano, I; Di Donato, P; Poli, A; Della Corte, V; Rotundi, A; Bussoletti, E; Quarto, M; Pugliese, M; Nicolaus, B


    In this work we investigated the ability of four extremophilic strains from the Archaea and Bacteria domains to resist the space environment by exposing them to extreme conditions of temperature, UV radiation, and desiccation coupled with the low pressure generated in a Mars conditions simulator. All the investigated extremophilic strains (namely Sulfolobus solfataricus, Haloterrigena hispanica, Thermotoga neapolitana and Geobacillus thermantarcticus) showed good resistance to the simulated temperature variations of space; irradiation with UV at 254 nm only slightly affected the growth of H. hispanica, G. thermantarcticus and S. solfataricus; finally, exposure to simulated Mars conditions showed that H. hispanica and G. thermantarcticus were resistant to desiccation and low pressure.

  2. A Simulation-Based Investigation of High-Latency Space Systems Operations (United States)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael


    NASA's human space program has developed considerable experience with near-Earth space operations. Although NASA has experience with deep space robotic missions, it has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will have durations measured in months and communication latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these mission durations and communication latencies for vehicle design, mission design, and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between the spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays of up to 15 minutes each way; and 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission, between July 18 and August 3, 2016. The NEEMO…
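The delayed mission-control interface described above can be illustrated with a simple one-way link that releases messages only after a fixed transport delay. This is a minimal sketch under my own assumptions, not the NASA subsystem modeling tool; the `DelayedLink` class and all names are hypothetical.

```python
# Illustrative sketch: a one-way communications link that delivers
# messages only after a fixed transport delay, as in the study's
# mission control interface with up to 15-minute one-way latency.
import heapq

class DelayedLink:
    def __init__(self, delay_s):
        self.delay = delay_s
        self.queue = []  # min-heap of (arrival_time, message)

    def send(self, t_now, message):
        # message becomes visible to the receiver only at t_now + delay
        heapq.heappush(self.queue, (t_now + self.delay, message))

    def receive(self, t_now):
        # drain every message whose arrival time has passed
        out = []
        while self.queue and self.queue[0][0] <= t_now:
            out.append(heapq.heappop(self.queue)[1])
        return out

uplink = DelayedLink(delay_s=900)          # 15 minutes each way
uplink.send(0, "increase cabin O2 flow")   # sent at t = 0 s
print(uplink.receive(600))  # -> [] (still in transit)
print(uplink.receive(900))  # -> ['increase cabin O2 flow']
```

A full crew/ground loop would pair two such links, so a question and its answer are separated by twice the one-way latency.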

  3. Book review of Capital in the Twenty-First Century, by Thomas Piketty. Cambridge, Massachusetts, London, England: The Belknap Press of Harvard University Press, 2014, 605 pages


    Paul Dobrescu; Mălina Ciocea


    “Every now and then, the field of economics produces an important book; this is one of them” (Cowen, 2014). These are the opening words of Tyler Cowen’s presentation of Thomas Piketty’s work, “Capital in the Twenty-First Century” (Piketty, 2014), in Foreign Affairs. The book is prominently displayed in major bookstores around the world, widely debated, acclaimed, and sold (over 1 million copies so far). It has been favorably reviewed or quoted in all major journals. The...

  4. Causes and impacts of changes in the Arctic freshwater budget during the twentieth and twenty-first centuries in an AOGCM

    Energy Technology Data Exchange (ETDEWEB)

    Arzel, Olivier [University of New South Wales, Climate and Environmental Dynamics Laboratory, School of Mathematics and Statistics, Sydney, NSW (Australia); Fichefet, Thierry; Goosse, Hugues [Universite Catholique de Louvain, Institut d'Astronomie et de Geophysique G. Lemaitre, Louvain-la-Neuve (Belgium); Dufresne, Jean-Louis [Institut Pierre Simon Laplace UPMC/CNRS, Laboratoire de Meteorologie Dynamique, Paris (France)


    The fourth version of the atmosphere-ocean general circulation model (AOGCM) developed at the Institut Pierre-Simon Laplace (IPSL-CM4) is used to investigate the mechanisms influencing the Arctic freshwater balance in response to anthropogenic greenhouse gas forcing. The freshwater influence on the interannual variability of deep winter oceanic convection in the Nordic Seas is also studied on the basis of correlation and regression analyses of detrended variables. The model shows that the Fram Strait outflow, which is an important source of freshwater for the northern North Atlantic, experiences a rapid and strong transition from a weak state toward a relatively strong state during 1990-2010. The authors propose that this climate shift is triggered by the retreat of sea ice in the Barents Sea during the late twentieth century. This sea ice reduction initiates a positive feedback in the atmosphere-sea ice-ocean system that alters both the atmospheric and oceanic circulations in the Greenland-Iceland-Norwegian (GIN)-Barents Seas sector. Around year 2080, the model predicts a second transition threshold beyond which the Fram Strait outflow is restored toward its original weak value. The long-term freshening of the GIN Seas is invoked to explain this rapid transition. It is further found that the mechanisms of interannual changes in deep mixing differ fundamentally between the twentieth and twenty-first centuries. This difference is caused by the dominant influence of freshwater over the twenty-first century. In the GIN Seas, the interannual changes in the liquid freshwater export out of the Arctic Ocean through Fram Strait, combined with the interannual changes in the liquid freshwater import from the North Atlantic, are shown to have a major influence in driving the interannual variability of the deep convection during the twenty-first century. South of Iceland, the other region of deep water renewal in the model, changes in freshwater import from the North Atlantic…

  5. Does the Common Agricultural Policy still make sense in the twenty-first century? CAP after 2013 from the perspective of Poland and Hungary

    Directory of Open Access Journals (Sweden)

    Elżbieta Daszkowska


    The EU CAP has developed immensely since the 1960s. However, its current determinants are completely different from those which formed the CAP's foundations. This results mainly from the fact that the EU CAP must meet present-day challenges and threats. Moreover, further EU enlargements have also significantly influenced the performance of this sector of the economy. It is important to determine whether the existence of the CAP in the twenty-first century still makes sense and to specify in more detail the directions of CAP reform after 2013 from the perspective of Poland and Hungary.

  6. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray


    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development through system-level environmental testing. It required setting up a specialized magnetic test facility at LMSS for testing the flight system, and a test program and facility at JPL for testing parts and subsystems. A magnetic modeling, simulation, and analysis capability was set up and operated by Aerospace to provide qualitative and quantitative magnetic assessments of parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields-and-particles scientific measurements conducted by the JUNO space mission to Jupiter, stringent magnetic control specifications required a program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation, and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a magnetically clean spacecraft cost-effectively. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation, and analysis activities, which were used to address problems such as remnant magnetization and the behavior of hard and soft magnetic materials in applied external magnetic fields within the targeted space system.

  7. Simulated Space Environmental Effects on Thin Film Solar Array Components (United States)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon


    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low-mass, low-volume solar panels. Given the variety of thin solar cells and cover materials, and the lack of environmental protection typically afforded by thick coverglasses, a series of tests was conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and the nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design," or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials was exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set was exposed to 1 MeV electrons. A third set was exposed to 50, 100, 500, and 700 keV protons, and a fourth set was exposed to more than 2,000 hours of near-ultraviolet (NUV) radiation. A final set was rapidly thermally cycled between -55 and +125 °C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per watt is discussed.

  8. Psychosocial value of space simulation for extended spaceflight (United States)

    Kanas, N.


    There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  9. A Conservation Ethic and the Collecting of Animals by Institutions of Natural Heritage in the Twenty-First Century: Case Study of the Australian Museum

    Directory of Open Access Journals (Sweden)

    Timothy Ikin


    Collecting animals from their habitats for preservation by museums and related bodies is a core operation of such institutions. Conservation of biodiversity in the current era is a priority in the scientific agendas of museums of natural heritage in Australia and the world. Intuitively, taking animals from the wild while engaged in scientific or other practices that are supposed to promote their ongoing survival may appear incompatible. The Australian Museum presents interesting ground on which to consider zoological collecting by museums in the twenty-first century. Anderson and Reeves argued in 1994 that a milieu existed that undervalued native species, and that the role of natural history museums, up to as late as the mid-twentieth century, was only to make a record of the faunal diversity of Australia, which would inevitably become extinct. Despite the latter, conservation of Australia’s faunal diversity is a key aspect of research programmes in Australia’s institutions of natural heritage in the current era. This paper analyses the collecting of animals, a core task for institutions of natural heritage, and how it interacts with a professed “conservation ethic” in a twenty-first-century Australian setting.

  10. Book review of Capital in the Twenty-First Century, by Thomas Piketty. Cambridge, Massachusetts, London, England: The Belknap Press of Harvard University Press, 2014, 605 pages

    Directory of Open Access Journals (Sweden)

    Paul Dobrescu


    “Every now and then, the field of economics produces an important book; this is one of them” (Cowen, 2014). These are the opening words of Tyler Cowen’s presentation of Thomas Piketty’s work, “Capital in the Twenty-First Century” (Piketty, 2014), in Foreign Affairs. The book is prominently displayed in major bookstores around the world, widely debated, acclaimed, and sold (over 1 million copies so far). It has been favorably reviewed or quoted in all major journals. The assessment of “Capital in the Twenty-First Century” by Paul Krugman, Nobel Economics Prize laureate, as a “magnificent, sweeping meditation on inequality” is highly relevant: “This is a book that will change both the way we think about society and the way we do economics” (Krugman, 2014). Finally, Piketty’s book is included in the lists of the year’s best books of prestigious journals such as The Economist, Financial Times, The Washington Post, Observer, The Independent, and Daily Telegraph; Financial Times and McKinsey have hailed it as the best book of 2014.

  11. High School Students' Perceptions of the Effects of International Science Olympiad on Their STEM Career Aspirations and Twenty-First Century Skill Development (United States)

    Sahin, Alpaslan; Gulacar, Ozcan; Stuessy, Carol


    Social cognitive theory guided the design of a survey to investigate high school students' perceptions of factors affecting their career contemplations and beliefs regarding the influence of their participation in the international Science Olympiad on their subject interests and twenty-first century skills. In addition, gender differences in students' choice of competition category were studied. Mixed methods analysis of survey returns from 172 Olympiad participants from 31 countries showed that students' career aspirations were affected most by their teachers, personal interests, and parents, respectively. Students also indicated that they believed that their participation in the Olympiad reinforced their plan to choose a science, technology, engineering, and mathematics (STEM) major at college and assisted them in developing and improving their twenty-first century skills. Furthermore, female students' responses indicated that their project choices were less likely to be in the engineering category and more likely to be in the environment or energy categories. Findings are discussed in the light of increasing the awareness of the role and importance of Science Olympiads in STEM career choice and finding ways to attract more female students into engineering careers.

  12. How Has Elderly Migration Changed in the Twenty-First Century? What the Data Can-and Cannot-Tell Us. (United States)

    Conway, Karen Smith; Rork, Jonathan C


    Interstate elderly migration has strong implications for state tax policies and health care systems, yet little is known about how it has changed in the twenty-first century. Its relative rarity requires a large data set with which to construct reliable measures, and the replacement of the U.S. Census long form (CLF) with the American Community Survey (ACS) has made such updates difficult. Two commonly used alternative migration data sources-the Current Population Survey (CPS) and the Statistics of Income (SOI) program of the Internal Revenue Service (IRS)-suffer serious limitations in studying the migration of any subpopulation, including the elderly. Our study informs migration research in the post-2000 era by identifying methodological differences between data sources and devising strategies for reconciling the CLF and ACS. Our investigation focusing on the elderly suggests that the ACS can generate comparable migration data that reveal a continuation of previously identified geographic patterns as well as changes unique to the 2000s. However, its changed definition of residence and survey timing leaves us unable to construct a comparable national migration rate, suggesting that one must use national trends in the smaller CPS to investigate whether elderly migration has increased or decreased in the twenty-first century.

  13. Wicked Female Characters in Roddy Doyle’s “The Pram”: Revisiting Celtic and Polish Myths in the Context of Twenty-First Century Ireland

    Directory of Open Access Journals (Sweden)

    Burcu Gülüm Tekin


    “The Pram” is the only horror story in Roddy Doyle’s collection The Deportees and Other Stories (2007). It is also unique in its approach to Ireland’s multicultural scene in the twenty-first century. Doyle turns the other side of the coin and introduces a migrant caretaker, Alina, who loses her mind due to her employers’ (the O’Reilly family’s) ill-treatment. As a reaction to their scornful attitude, Alina becomes a murderer. Set in the context of twenty-first-century Dublin, “The Pram” contains various references to Celtic and Polish mythological female figures (in particular, the Old Hag of Beara and Boginka), which strengthen the thrilling, mythical elements of the plot. This paper aims to examine the characters’ negative attitude towards migrants in Ireland in the light of the racist discourse present in the story. I will also focus on the story’s female characters and discuss the handicaps of being a female migrant in Ireland. The parallels between the mythical female figures and the protagonist Alina will be another point of analysis. The argument of this paper is that Doyle does not always portray the positive outcomes of a multicultural society. On the contrary, he conveys the perspective of the incoming migrant. “The Pram” stages the obstacles that a female outsider may experience in Ireland and her subsequent transformation as a result of the racism she encounters there.

  14. Interplanetary Transit Simulations Using the International Space Station (United States)

    Charles, J. B.; Arya, Maneesh


    It has been suggested that the International Space Station (ISS) be utilized to simulate the transit portion of long-duration missions to Mars and near-Earth asteroids (NEA). The ISS offers a unique environment for such simulations, providing researchers with a high-fidelity platform to study, enhance, and validate technologies and countermeasures for these long-duration missions. From a space life sciences perspective, two major categories of human research activities have been identified that will harness the various capabilities of the ISS during the proposed simulations. The first category includes studies that require the use of the ISS, typically because of the need for prolonged weightlessness. The ISS is currently the only available platform capable of providing researchers with access to a weightless environment over an extended duration. In addition, the ISS offers high fidelity for other fundamental space environmental factors, such as isolation, distance, and accessibility. The second category includes studies that do not require use of the ISS in the strictest sense, but can exploit its use to maximize their scientific return more efficiently and productively than in ground-based simulations. In addition to conducting Mars and NEA simulations on the ISS, increasing the current increment duration on the ISS from 6 months to a longer duration will provide opportunities for enhanced and focused research relevant to long-duration Mars and NEA missions. Although it is currently believed that increasing the ISS crew increment duration to 9 or even 12 months will pose little additional risk to crewmembers, additional medical monitoring capabilities may be required beyond those currently used for the ISS operations. The use of the ISS to simulate aspects of Mars and NEA missions seems practical, and it is recommended that planning begin soon, in close consultation with all international partners.

  15. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (United States)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David


    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
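The nine initialization steps listed above can be sketched in code. This is a hedged, illustrative sketch assuming an HLA-style runtime: the `Federation` and `Federate` classes, the in-memory object store, and all names are hypothetical stand-ins, not the real DSES or HLA API.

```python
# Hypothetical sketch of a nine-step federate initialization sequence,
# modeled loosely on the DSES steps; not the actual DSES implementation.
import threading
import time

class Federation:
    def __init__(self, n_federates):
        self.n = n_federates
        self.lock = threading.Lock()
        self.joined = 0
        self.objects = {}  # instance name -> {"owner", "data"}
        # Named synchronization points (used in Steps 5 and 9)
        self.sync = {"initialize": threading.Barrier(n_federates),
                     "startup": threading.Barrier(n_federates)}

class Federate:
    def __init__(self, name, federation):
        self.name, self.fed = name, federation
        self.phase = []

    def initialize(self, publishes):
        f = self.fed
        # Step 1: create/join the federation
        with f.lock:
            f.joined += 1
        self.phase.append("join")
        # Step 2: publish and subscribe (record declared interests)
        self.publishes = list(publishes)
        # Step 3: create the object instances this federate owns
        with f.lock:
            for obj in self.publishes:
                f.objects[obj] = {"owner": self.name, "data": None}
        # Step 4: confirm all federates have joined
        while True:
            with f.lock:
                if f.joined == f.n:
                    break
            time.sleep(0.001)
        # Step 5: achieve the "initialize" synchronization point
        f.sync["initialize"].wait()
        # Step 6: update owned object instances with initial data
        with f.lock:
            for obj in self.publishes:
                f.objects[obj]["data"] = 0.0
        # Step 7: wait for reflections of the other federates' objects
        while True:
            with f.lock:
                if all(o["data"] is not None for o in f.objects.values()):
                    break
            time.sleep(0.001)
        # Step 8: set up time management (stub: agree on a start time)
        self.sim_time = 0.0
        # Step 9: achieve the "startup" synchronization point
        f.sync["startup"].wait()
        self.phase.append("startup")

fed = Federation(2)
a, b = Federate("A", fed), Federate("B", fed)
threads = [threading.Thread(target=a.initialize, args=(["vehicle"],)),
           threading.Thread(target=b.initialize, args=(["ground"],))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The two barriers mirror the role of the named synchronization points: no federate updates initial data before everyone has joined and declared interests, and no federate starts advancing time before everyone has seen the initial reflections.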

  16. A Study on the Commercialization of Space-Based Remote Sensing in the Twenty-First Century and Its Implications to United States National Security (United States)


    Excerpts: applications include retail marketing, facilities placement, facilities monitoring, peacekeeping and treaty monitoring, law enforcement, and news services … tactical targets, including Soviet and Chinese missile locations, the site of the detonation of the first Chinese atomic weapon, and submarine ports … Africa, Thailand, Chile, Pakistan, and Malaysia either have or are developing miniaturized remote sensing systems (Glackin, 1999).

  17. Scalable space-time adaptive simulation tools for computational electrocardiology


    Krause, Dorian; Krause, Rolf


    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  18. Simulated Space Environment Effects on a Candidate Solar Sail Material (United States)

    Kang, Jin Ho; Bryant, Robert G.; Wilkie, W. Keats; Wadsworth, Heather M.; Craven, Paul D.; Nehls, Mary K.; Vaughn, Jason A.


    For long-duration solar sail missions, the sail material must survive harsh space environments, and its degradation controls the operational lifetime. Understanding the effects of the space environment on the sail membrane is therefore essential for mission success. In this study, we investigated the effects of simulated space environment exposures (ionizing radiation, thermal aging, and simulated potential damage) on the mechanical, thermal, and optical properties of a commercial off-the-shelf (COTS) polyester solar sail membrane to assess the degradation mechanisms of a feasible solar sail. The solar sail membrane was exposed to high-energy electrons (about 70 keV and 10 nA/cm2), and its physical properties were characterized. After a dose of about 8.3 Grad, the tensile modulus, tensile strength, and failure strain of the sail membrane decreased by about 20 to 95%. The aluminum reflective layer was damaged and partially delaminated but did not show any significant change in solar absorbance or thermal emittance. The effect on the mechanical properties of a pre-cracked sample, simulating potential impact damage to the sail membrane, as well as thermal aging effects on metallized PEN (polyethylene naphthalate) film, will be discussed.

  19. Efficient Neural Network Modeling for Flight and Space Dynamics Simulation

    Directory of Open Access Journals (Sweden)

    Ayman Hamdy Kassem


    This paper presents an efficient technique for neural network modeling of flight and space dynamics simulation. The technique frees the neural network designer from guessing the size and structure of the required neural network model and helps to minimize the number of neurons. For linear flight/space dynamics systems, the technique can find the network weights and biases directly by solving a system of linear equations, without the need for training. Nonlinear flight dynamics systems can be easily modeled by training their linearized models while keeping the same network structure. The training is fast, as it uses knowledge of the linear system to speed up the training process. The technique was tested on different flight/space dynamics models and showed promising results.
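The direct-solution idea for linear systems can be illustrated with a toy example. This is a minimal sketch under my own assumptions, not the paper's actual formulation: for linear dynamics x[k+1] = a·x[k] + b·u[k], the weights of a single linear neuron can be obtained by solving the normal equations on simulated data, with no iterative training.

```python
# Illustrative sketch (hypothetical, not the paper's method): recover
# the weights of a linear "neuron" y = w1*x + w2*u directly from data
# generated by a linear dynamics model, by solving the normal equations.

def simulate(a, b, x0, inputs):
    """Step the linear system x[k+1] = a*x[k] + b*u[k]."""
    xs, x = [x0], x0
    for u in inputs:
        x = a * x + b * u
        xs.append(x)
    return xs

# generate training data from the "true" system
a_true, b_true = 0.9, 0.1
inputs = [1.0, -0.5, 2.0, 0.0, 1.5]
xs = simulate(a_true, b_true, 1.0, inputs)

# build the normal equations (Z^T Z) w = Z^T y for rows z = [x, u]
Z = [[xs[k], inputs[k]] for k in range(len(inputs))]
y = xs[1:]
ZtZ = [[sum(r[i] * r[j] for r in Z) for j in range(2)] for i in range(2)]
Zty = [sum(Z[k][i] * y[k] for k in range(len(y))) for i in range(2)]

# solve the 2x2 system by Cramer's rule
det = ZtZ[0][0] * ZtZ[1][1] - ZtZ[0][1] * ZtZ[1][0]
w1 = (Zty[0] * ZtZ[1][1] - ZtZ[0][1] * Zty[1]) / det
w2 = (ZtZ[0][0] * Zty[1] - Zty[0] * ZtZ[1][0]) / det
# w1, w2 recover a_true and b_true (up to floating-point error)
```

Because the data are generated exactly by a linear model, the least-squares solution reproduces the true coefficients, which is the sense in which no training loop is needed for the linear case.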

  20. 25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection (United States)

    Packard, Edward


    Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5 kW, 20 K Helium Refrigerators; Effect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated…

  1. Optimizing grade-control drillhole spacing with conditional simulations

    Directory of Open Access Journals (Sweden)

    Adrian Martínez-Vargas


    This paper summarizes a method to determine the optimum spacing of grade-control drillholes drilled with reverse circulation. The optimum drillhole spacing was defined as the spacing whose drilling cost equals the cost of misclassifying ore and waste in selective mining units (SMUs). The cost of misclassification for a given drillhole spacing is the cost of processing waste misclassified as ore (Type I error) plus the value of ore misclassified as waste (Type II error). Type I and Type II errors were deduced by comparing true and estimated grades at SMUs against a cutoff grade, assuming free ore selection. True grades at SMUs and grades at drillhole samples were generated with conditional simulations. A set of estimated grades at SMUs, one per drillhole spacing, was generated with ordinary kriging. This method was used to determine the optimum drillhole spacing in a gold deposit. The results showed that the cost of misclassification is sensitive to extreme block values, which tend to be overrepresented. Capping lost SMU values and implementing diggability constraints were recommended to improve calculations of total misclassification costs.
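The Type I/Type II cost comparison described above can be sketched directly. This is an illustrative sketch, not the paper's implementation: the function, the cost parameters, and all grade values are made-up assumptions standing in for simulated true grades and kriged estimates.

```python
# Hypothetical sketch of the misclassification-cost idea: given "true"
# SMU grades (from conditional simulation) and estimated grades (from
# kriging at one candidate drillhole spacing), sum the cost of waste
# sent to the mill (Type I) and the value of ore sent to the dump
# (Type II). All numbers below are illustrative.

def misclassification_cost(true_grades, est_grades, cutoff,
                           waste_processing_cost, ore_value_per_unit):
    type1 = type2 = 0.0
    for t, e in zip(true_grades, est_grades):
        if e >= cutoff and t < cutoff:    # Type I: waste classed as ore
            type1 += waste_processing_cost
        elif e < cutoff and t >= cutoff:  # Type II: ore classed as waste
            type2 += ore_value_per_unit * t
    return type1 + type2

true_g = [0.2, 0.8, 1.1, 0.4, 1.6, 0.9]   # simulated "true" SMU grades
est_g  = [0.6, 0.7, 1.2, 0.3, 1.4, 0.4]   # estimates at a wide spacing
cost = misclassification_cost(true_g, est_g, cutoff=0.5,
                              waste_processing_cost=10.0,
                              ore_value_per_unit=20.0)
```

In the method as summarized, this cost would be evaluated for each candidate spacing and the optimum chosen where it equals the drilling cost of that spacing.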

  2. Twenty-First Water Reactor Safety Information Meeting. Volume 3, Primary system integrity; Aging research, products and applications; Structural and seismic engineering; Seismology and geology: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)


    This three-volume report contains 90 of the 102 papers presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe the progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database.

  3. Projected impact of climate change in the hydroclimatology of Senegal with a focus over the Lake of Guiers for the twenty-first century (United States)

    Tall, Moustapha; Sylla, Mouhamadou Bamba; Diallo, Ismaïla; Pal, Jeremy S.; Faye, Aïssatou; Mbaye, Mamadou Lamine; Gaye, Amadou Thierno


    This study analyzes the impact of anthropogenic climate change on the hydroclimatology of Senegal, with a focus on the Lake of Guiers basin, for the middle (2041-2060) and late twenty-first century (2080-2099). To this end, a high-resolution multimodel ensemble of regional climate model experiments under two Representative Concentration Pathways (RCP4.5 and RCP8.5) is used. The results indicate that elevated warming, leading to a substantial increase in atmospheric water demand, is projected over the whole of Senegal. In the lake basin, these increases in potential evapotranspiration (PE) range between 10 and 25% in the near future under RCP4.5, while for the far future under RCP8.5 they exceed 50%. In addition, mean precipitation shows contrasting changes, with wetter conditions (10 to 25% more) by the middle of the century and drier conditions (more than 50% less) during the late twenty-first century. Such changes cause more and less evapotranspiration and soil moisture, respectively, during the two future periods. Furthermore, surface runoff shows a tendency to increase in most areas, although a few locations, including the lake basin, show substantial reductions. Finally, it is found that while semi-arid climates develop under the RCP4.5 scenario, generalized arid conditions prevail over the whole of Senegal under RCP8.5. These future climate conditions thus substantially threaten freshwater availability for the country and irrigated cropping in the lake basin. Strong governmental policies are therefore needed to help design response options to cope with the challenges posed by the projected climate change.

  4. University of Central Florida / Deep Space Industries Asteroid Regolith Simulants (United States)

    Britt, Daniel; Covey, Steven D.; Schultz, Cody


    Introduction: The University of Central Florida (UCF), in partnership with Deep Space Industries (DSI), is working under a NASA Phase 2 SBIR contract to develop and produce a family of asteroid regolith simulants for use in research, engineering, and mission operations testing. We base simulant formulas on the mineralogy, particle size, and physical characteristics of CI, CR, CM, C2, CV, and L-chondrite meteorites. The advantage of simulating meteorites is that the vast majority of meteoritic materials are common rock-forming minerals that are available in commercial quantities. While the formulas are guided by the meteorites, our approach is one of constrained maximization under the limitations of safety, cost, source materials, and ease of handling. In all cases our goal is to deliver a safe, high-fidelity analog at moderate cost. Source Materials, Safety, and Biohazards: A critical factor in any useful simulant is minimizing handling risks from biohazards or toxicity. All terrestrial materials proposed for these simulants were reviewed for potential toxicity. Of particular interest is the organic component of volatile-rich carbonaceous chondrites, which contains polycyclic aromatic hydrocarbons (PAHs), some of which are known carcinogens and mutagens. Our research suggests that we can maintain rough chemical fidelity by substituting much safer sub-bituminous coal as our organic analog. A second safety consideration is the choice of serpentine-group materials. While most serpentine polymorphs are quite safe, we avoid fibrous chrysotile because it is a form of asbestos. Terrestrial materials identified as inputs for our simulants are common rock-forming minerals available in commercial quantities. These include olivine, pyroxene, plagioclase feldspar, smectite, serpentine, saponite, pyrite, and magnetite in amounts appropriate for each type. For CIs and CRs, the olivines tend to be Fo100, which is rare on Earth. We have substituted Fo90 olivine

  5. Modeling and Simulation for the Multi-Mission Space Exploration Vehicle (United States)

    Chang, Max


    Asteroids and near-Earth objects (NEOs) are of great interest for future space missions. The Multi-Mission Space Exploration Vehicle (MMSEV) is being considered for future NEO missions and requires detailed planning and study of its guidance, navigation, and control (GNC). A possible MMSEV mission to a NEO would be to navigate the spacecraft to an orbit that is stationary with respect to the rotating asteroid and then anchor into the asteroid's surface with robotic arms. The Dynamics and Real-Time Simulation (DARTS) laboratory develops reusable models and simulations for the design and analysis of missions. In this paper, the development of guidance and anchoring models is presented, together with their role in achieving mission objectives and their relationships to other parts of the simulation. One important aspect of guidance is developing methods to represent the evolution of kinematic frames related to the tasks to be achieved by the spacecraft and its robot arms. In this paper, we compare various mathematical interpolation methods for position and quaternion frames. Subsequent work will analyze the spacecraft guidance system with different movements of the arms. With the analyzed data, the guidance system can be adjusted to minimize errors in performing precision maneuvers.
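    As a hedged illustration of the kind of quaternion-frame interpolation compared above (the abstract does not name the specific methods evaluated in the DARTS study), spherical linear interpolation (slerp) between two attitude quaternions can be sketched as:

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1.

    Slerp traverses the great arc at constant angular rate, which is why it
    is commonly preferred over componentwise (linear) interpolation when
    propagating attitude frames.
    """
    q0 = np.asarray(q0, dtype=float) / np.linalg.norm(q0)
    q1 = np.asarray(q1, dtype=float) / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    # Take the short way around: q and -q represent the same rotation.
    if dot < 0.0:
        q1, dot = -q1, -dot
    if dot > 0.9995:  # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)          # angle between the two quaternions
    return (np.sin((1.0 - t) * theta) * q0
            + np.sin(t * theta) * q1) / np.sin(theta)
```

For example, interpolating halfway between the identity and a 90-degree rotation about z yields the 45-degree rotation, with the result staying exactly on the unit sphere.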

  6. Primary loop simulation of the SP-100 space nuclear reactor

    International Nuclear Information System (INIS)

    Borges, Eduardo M.; Braz Filho, Francisco A.; Guimaraes, Lamartine N.F.


    Between 1983 and 1992 the SP-100 space nuclear reactor development project, for electric power generation in the range of 100 to 1000 kW, was conducted in the USA. Several configurations were studied to satisfy different mission objectives and power systems. In this reactor the heat is generated in a compact core and removed by liquid lithium; the primary loop flow is driven by thermoelectric electromagnetic (EMTE) pumps, and thermoelectric converters produce direct-current energy. To define the system operation point at nominal operating power, it is necessary to simulate the thermal-hydraulic components of the space nuclear reactor. In this paper the BEMTE-3 computer code is used to evaluate the EMTE pump design performance for a thermal-hydraulic primary loop configuration, and to compare the system operation points of the SP-100 reactor at two thermal powers, with satisfactory results. (author)

  7. A laser particulate spectrometer for a space simulation facility (United States)

    Schmitt, R. J.; Boyd, B. A.; Linford, R. M. F.; Richmond, R. G.


    A laser particulate spectrometer (LPS) system was developed to measure the size and speed distributions of particulate contaminants. Detection of the particulates is achieved by means of light-scattering and extinction effects, using a single laser beam to cover a size range of 0.8 to 275 microns in diameter and a speed range of 0.2 to 20 meters/second. The LPS system was designed to operate in the high-vacuum environment of a space simulation chamber with cold-shroud temperatures ranging from 77 to 300 K.

  8. Program NAJOCSC and space charge effect simulation in C01

    International Nuclear Information System (INIS)

    Tang, J.Y.; Chabert, A.; Baron, E.


    During the beam tests of the THI project at GANIL, it proved difficult to increase the beam power above 2 kW at CSS2 extraction. The space charge effect (abbreviated as S.C. effect) in cyclotrons is suspected to play a role in this phenomenon, especially the longitudinal S.C. effect and the coupling between longitudinal and radial motions. The injector cyclotron C01 is studied, and the role played by the S.C. effect in this cyclotron in the THI case is investigated by simulation. (K.A.)

  9. Simulations of space charge neutralization in a magnetized electron cooler

    Energy Technology Data Exchange (ETDEWEB)

    Gerity, James [Texas A&M; McIntyre, Peter M. [Texas A&M; Bruhwiler, David Leslie [RadiaSoft, Boulder; Hall, Christopher [RadiaSoft, Boulder; Moens, Vince Jan [Ecole Polytechnique, Lausanne; Park, Chong Shik [Fermilab; Stancari, Giulio [Fermilab


    Magnetized electron cooling at relativistic energies and ampere-scale current is essential to achieve the proposed ion luminosities in a future electron-ion collider (EIC). Neutralization of the space charge in such a cooler can significantly increase the magnetized dynamic friction and, hence, the cooling rate. The Warp framework is being used to simulate magnetized electron beam dynamics during and after the build-up of neutralizing ions, via ionization of residual gas in the cooler. The design follows previous experiments at Fermilab as a verification case. We also discuss the relevance to EIC designs.

  10. Average accelerator simulation Truebeam using phase space in IAEA format

    International Nuclear Information System (INIS)

    Santana, Emico Ferreira; Milian, Felix Mas; Paixao, Paulo Oliveira; Costa, Raranna Alves da; Velasco, Fermin Garcia


    This paper uses a computational radiation transport code based on the Monte Carlo technique to model a linear accelerator used for radiotherapy treatment. This work is the initial step of future proposals that aim to study several radiotherapy patient treatments, employing computational modeling in cooperation with the institutions UESC, IPEN, UFRJ, and COI. The chosen simulation code is GATE/Geant4. The modeled accelerator is the Varian TrueBeam. The geometric modeling was based on technical manuals, and the radiation sources on the photon phase space provided by the manufacturer in the IAEA (International Atomic Energy Agency) format. The simulations were carried out under the same conditions as the experimental measurements. Photon beams of 6 MV with a 10 x 10 cm field incident on a water phantom were studied. For validation, depth-dose curves and lateral profiles at different depths were compared between the simulated results and experimental data. The final model of this accelerator will be used in future work involving treatments and real patients. (author)

  11. Unified Simulation and Analysis Framework for Deep Space Navigation Design (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie


    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. The tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems, as well as to develop system requirements and determine their effect on the sizing of the integrated vehicle. The framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. The developed package demonstrates the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  12. An FPGA computing demo core for space charge simulation

    International Nuclear Information System (INIS)

    Wu, Jinyuan; Huang, Yifei


    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular microprocessors (e.g., instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing the Coulomb force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz microprocessor by about a factor of 10. The temperature and power consumption of FPGAs were also lower than those of microprocessors. Fast and convenient, FPGAs can serve as alternatives to time-consuming microprocessors for space charge simulation.
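    The look-up-table trick can be sketched in software. The following Python sketch approximates the inverse square-root cube, (r²)^(-3/2), by addressing a small table with the leading mantissa bits of r² and rescaling by the power-of-two exponent; the table size, midpoint sampling, and float-based normalization are illustrative choices, not the paper's 16-bit fixed-point design:

```python
import math

TABLE_BITS = 10                       # the paper cites 9-10 significant bits

# One entry per leading-mantissa-bit pattern; each stores m**-1.5 sampled at
# the midpoint of its interval, for mantissa m in [0.5, 1).
_TABLE = [(0.5 + (i + 0.5) / (1 << (TABLE_BITS + 1))) ** -1.5
          for i in range(1 << TABLE_BITS)]

def inv_r3(r2):
    """Approximate (r^2)**-1.5 via table look-up on the leading bits of r^2."""
    m, e = math.frexp(r2)             # r2 = m * 2**e with m in [0.5, 1)
    idx = int((m - 0.5) * (1 << (TABLE_BITS + 1)))
    # (m * 2**e)**-1.5 = m**-1.5 * 2**(-1.5 * e): only the mantissa part
    # needs the table; the exponent part is a cheap shift/scale in hardware.
    return _TABLE[idx] * 2.0 ** (-1.5 * e)

def coulomb_force_direction(dx, dy, dz):
    """Unnormalized pairwise Coulomb force: r_vec / |r|^3 = rhat / r^2."""
    s = inv_r3(dx * dx + dy * dy + dz * dz)
    return (dx * s, dy * s, dz * s)
```

With 10 table bits the relative error stays below roughly 10^-3, which illustrates why a short table suffices for a 16-bit datapath.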

  13. An FPGA computing demo core for space charge simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan; Huang, Yifei; /Fermilab


    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular microprocessors (e.g., instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing the Coulomb force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz microprocessor by about a factor of 10. The temperature and power consumption of FPGAs were also lower than those of microprocessors. Fast and convenient, FPGAs can serve as alternatives to time-consuming microprocessors for space charge simulation.

  14. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis) (United States)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitudes. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions, as planned by ESA's Clean Space initiative, and for contingency scenarios for ESA spacecraft like ENVISAT. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software tool, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and of the considerable internal and external effects that occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing and reaction wheel behaviour
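    The rigid-body core of such an attitude propagator can be illustrated with a minimal sketch. The inertia values and body rates below are hypothetical, and the torque-free assumption is a deliberate simplification: IOTA itself models drag, radiation pressure, gravity gradients, eddy currents, and impact-driven momentum transfer. The sketch integrates Euler's rigid-body equations with a fixed-step fourth-order Runge-Kutta scheme:

```python
import numpy as np

# Hypothetical principal moments of inertia (kg m^2) of a tumbling object.
I = np.array([10.0, 15.0, 20.0])

def omega_dot(w):
    # Euler's rigid-body equations in the body frame: I dw/dt = -w x (I w).
    return -np.cross(w, I * w) / I

def rk4_step(w, dt):
    # Classic fixed-step fourth-order Runge-Kutta integrator.
    k1 = omega_dot(w)
    k2 = omega_dot(w + 0.5 * dt * k1)
    k3 = omega_dot(w + 0.5 * dt * k2)
    k4 = omega_dot(w + dt * k3)
    return w + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

w = np.array([0.1, 0.02, 0.01])       # initial body rates (rad/s)
for _ in range(2000):                 # propagate 200 s at dt = 0.1 s
    w = rk4_step(w, 0.1)
# For torque-free motion, |I w| and the kinetic energy 0.5 w.(I w)
# should remain constant; checking them is a quick sanity test.
```

Conservation of angular momentum magnitude and rotational kinetic energy makes a convenient correctness check before environmental torques are added.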

  15. Virtual Reality Simulation of the International Space Welding Experiment (United States)

    Phillips, James A.


    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.

  16. Space headache on Earth: head-down-tilted bed rest studies simulating outer-space microgravity. (United States)

    van Oosterhout, W P J; Terwindt, G M; Vein, A A; Ferrari, M D


    Headache is a common symptom during space travel, both isolated and as part of space motion syndrome. Head-down-tilted bed rest (HDTBR) studies are used to simulate outer-space microgravity on Earth, and allow countermeasure interventions such as artificial gravity and training protocols aimed at reversing microgravity-induced physiological changes. The objectives of this article are to assess headache incidence and characteristics during HDTBR, and to evaluate the effects of countermeasures. In a randomized cross-over design by the European Space Agency (ESA), 22 healthy male subjects without a history of primary headache underwent three periods of -6-degree HDTBR. In two of these episodes countermeasure protocols were added, with either centrifugation or aerobic exercise training. Headache occurrence and characteristics were assessed daily using a specially designed questionnaire. In total, 14/22 (63.6%) subjects reported a headache during at least one of the three HDTBR periods: non-specific in 12/14 (85.7%) and migraine in 2/14 (14.3%). The occurrence of headache did not differ between HDTBR with and without countermeasures (12/22 (54.5%) vs. 8/22 (36.4%) subjects, p = 0.20; 13/109 (11.9%) vs. 36/213 (16.9%) headache days, p = 0.24). During countermeasures, however, headaches were more often mild (p = 0.03) and had fewer associated symptoms (p = 0.008). Simulated microgravity during HDTBR induces headache episodes, mostly on the first day. Countermeasures are useful in reducing headache severity and associated symptoms. A reversible, microgravity-induced cephalic fluid shift may cause headache, also on Earth. HDTBR can be used to study space headache on Earth. © International Headache Society 2014

  17. A Data Management System for International Space Station Simulation Tools (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)


    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
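    The abstract does not publish the actual schema, but the core idea (station objects on one side, user-supplied annotations attached to them on the other) can be sketched with a minimal, hypothetical two-table layout; all table and column names here are illustrative:

```python
import sqlite3

# Hypothetical minimal schema: station objects, plus annotations that users
# attach to them. The real Intelligent Virtual Station schema is not given
# in the abstract.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE station_object (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,   -- e.g. a rack, panel, or cable bundle
        doc  TEXT             -- pointer to a design document or wiring diagram
    );
    CREATE TABLE annotation (
        id        INTEGER PRIMARY KEY,
        object_id INTEGER NOT NULL REFERENCES station_object(id),
        author    TEXT,
        note      TEXT NOT NULL
    );
""")
conn.execute("INSERT INTO station_object (name, doc) VALUES (?, ?)",
             ("Node 2 power panel", "wiring-diagram-042.pdf"))
conn.execute("INSERT INTO annotation (object_id, author, note) VALUES (1, ?, ?)",
             ("trainer", "Verify breaker state before procedure step 3."))
# Retrieve every note attached to an object, as a procedure generator might.
rows = conn.execute("""
    SELECT o.name, a.note
    FROM station_object o JOIN annotation a ON a.object_id = o.id
""").fetchall()
```

A join like the one above is the kind of query a training-procedure generator would issue to pull user-attached notes alongside the objects they describe.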

  18. Retos de la bioética en la medicina del siglo XXI Challenges of bioethics in twenty-first century medicine

    Directory of Open Access Journals (Sweden)

    Jorge Alberto Álvarez-Díaz


    Full Text Available In order to propose possible challenges for bioethics in twenty-first century medicine, it is necessary to consider that there were challenges in the past (at the origin of the new discipline called bioethics); that the challenges have been modified by scientific, biomedical, and humanistic advances; and that the challenges that may arise in the future will be, in different ways, a result of this historical evolution. The major future challenges are: the unsolved problems of justice, equity, and poverty; the challenges posed by the introduction of new technologies, with the paradigm of nanomedicine; and the challenges driven by advances in the neurosciences, with the paradigm of neuroethics.

  19. Thomas Piketty: Capital in the Twenty-First Century (Le Capital au XXIe siècle; English translation by Arthur Goldhammer)

    Directory of Open Access Journals (Sweden)

    Gylfi Magnússon


    Full Text Available The reviewer's assessment includes, among other points, the following: The work is not intended to be the last word on the subject, but rather a foundation for further discussion and research. In that it has succeeded. Capital in the Twenty-First Century has already sparked substantial debate and will doubtless be discussed for years to come. Indeed, it is almost required reading for anyone intending to write about macroeconomics and the role of the public sector, however much they agree or disagree with the author.

  20. Addressing the main challenges of energy security in the twenty-first century – Contributions of the conferences on Sustainable Development of Energy, Water and Environment Systems

    International Nuclear Information System (INIS)

    Markovska, Natasa; Duić, Neven; Mathiesen, Brian Vad; Guzović, Zvonimir; Piacentino, Antonio; Schlör, Holger; Lund, Henrik


    Climate change and fossil fuel reserve depletion both pose challenges for energy security and for wellbeing in general. The top ten challenges include: decarbonising the world economy; enhancing energy efficiency and energy savings in buildings; advancing energy technologies; moving towards energy systems based on variable renewables; electrifying transport and some industrial processes; liberalizing and extending the energy markets; integrating energy sectors into Smart Energy Systems; making cities and communities smart; diversifying the energy sources; and building more biorefineries. Presenting the contributions of selected conference papers published in the special issues of leading scientific journals (including all the papers from the current Energy special issue), this review demonstrates the capacity of the Conferences on Sustainable Development of Energy, Water and Environment Systems to generate knowledge that could serve as the centrepiece of a pertinent response to those challenges. - Highlights: • Top ten challenges of energy security in the twenty-first century identified. • Selected SDEWES contributions analysed against the identified challenges. • The role of SDEWES as a knowledge generator towards addressing the identified challenges credibly demonstrated.


    International Nuclear Information System (INIS)

    Olshevsky, Vyacheslav; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni; Deca, Jan; Divin, Andrey; Peng, Ivy Bo; Markidis, Stefano


    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely to be created in space plasmas: in all our simulations except the lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere cases, the number of spiral nulls prevails over the number of radial nulls by a factor of 3-9. We show that magnetic nulls often do not indicate regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines, reminiscent of Z-pinches, efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data.

  2. Space-Charge Simulation of Integrable Rapid Cycling Synchrotron

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Jeffery [Fermilab; Valishev, Alexander [Fermilab


    Integrable optics is an innovation in particle accelerator design that enables strong nonlinear focusing without generating parametric resonances. We use a Synergia space-charge simulation to investigate the application of integrable optics to a high-intensity hadron ring that could replace the Fermilab Booster. We find that incorporating integrability into the design suppresses the beam halo generated by a mismatched KV beam. Our integrable rapid cycling synchrotron (iRCS) design includes other features of modern ring design such as low momentum compaction factor and harmonically canceling sextupoles. Experimental tests of high-intensity beams in integrable lattices will take place over the next several years at the Fermilab Integrable Optics Test Accelerator (IOTA) and the University of Maryland Electron Ring (UMER).

  3. Analysis of the Thermo-Elastic Response of Space Reflectors to Simulated Space Environment (United States)

    Allegri, G.; Ivagnes, M. M.; Marchetti, M.; Poscente, F.


    The evaluation of space environment effects on materials and structures is a key matter in developing a proper design for long-duration missions: since a large proportion of the satellites operating in the Earth orbital environment are employed for telecommunications, the development of space antennas and reflectors featuring high dimensional stability under space environment interactions represents a major challenge for designers. The structural layout of state-of-the-art space antennas and reflectors is very complex, since several different sensitive elements and materials are employed: particular care must be taken in evaluating the actual geometrical configuration of reflectors operating in the space environment, since very limited distortions of the designed layout can produce severe effects on the quality of the signal both received and transmitted, especially for antennas operating at high frequencies. The effects of thermal loads due to direct sunlight exposure and to Earth and Moon albedo can easily be taken into account using the standard methods of structural analysis; on the other hand, thermal cycling and exposure to the vacuum environment produce a long-term damage accumulation which affects the whole structure. The typical effects of such exposure are the outgassing of polymeric materials and the contamination of the exposed surfaces, which can significantly affect the thermo-mechanical properties of the materials themselves and, therefore, the global structural response. The main aim of the present paper is to evaluate the synergistic effects of thermal cycling and of exposure to a high-vacuum environment on an innovative antenna developed by Alenia Spazio S.p.A.: to this purpose, both experimental and numerical research activities have been carried out. A complete prototype of the antenna has been exposed to the space environment simulated by the SAS facility: the latter consists of a high-vacuum chamber, equipped with

  4. Thermal System Upgrade of the Space Environment Simulation Test Chamber (United States)

    Desai, Ashok B.


    The paper deals with the refurbishing and upgrade of the thermal system for the existing thermal vacuum test facility, the Space Environment Simulator, at NASA's Goddard Space Flight Center. The chamber is the largest such facility at the center. This upgrade is the third phase of the long-range upgrade of the chamber that has been underway for the last few years. The first phase dealt with its vacuum system; the second phase involved the GHe subsystem. The paper describes the design philosophy options considered for the thermal system; the approaches taken and methodology applied in evaluating the remaining "life" in the chamber shrouds and related equipment by conducting special tests and studies; the feasibility and extent of automation, using computer interfaces and Programmable Logic Controllers in the control system; and finally, matching the old components to the new ones in an integrated, highly reliable, and cost-effective thermal system for the facility. This is a multi-year project that has just started, and the paper deals mainly with the plans and approaches to implement the project successfully within schedule and cost.

  5. Future projections of synoptic weather types over the Arabian Peninsula during the twenty-first century using an ensemble of CMIP5 models

    KAUST Repository

    El Kenawy, Ahmed M.


    An assessment of future change in synoptic conditions over the Arabian Peninsula throughout the twenty-first century was performed using 20 climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. We employed the mean sea level pressure (SLP) data from model output together with NCEP/NCAR reanalysis data and compared the relevant circulation types produced by the Lamb classification scheme for the base period 1975-2000. Overall, model results illustrated good agreement with the reanalysis, albeit with a tendency to underestimate cyclonic (C) and southeasterly (SE) patterns and to overestimate anticyclones and directional flows. We also investigated future projections for each circulation type during the rainy season (December-May) using three Representative Concentration Pathways (RCPs), comprising RCP2.6, RCP4.5, and RCP8.5. Overall, two scenarios (RCP4.5 and RCP8.5) revealed a statistically significant increase in weather types favoring above-normal rainfall in the region (e.g., C and E-types). In contrast, weather types associated with lower amounts of rainfall (e.g., anticyclones) are projected to decrease in winter but increase in spring. For all scenarios, there was consistent agreement on the sign of change (i.e., positive/negative) for the most frequent patterns (e.g., C, SE, E and A-types), whereas the sign was uncertain for less recurrent types (e.g., N, NW, SE, and W). The projected changes in weather type frequencies in the region can be viewed not only as indicators of change in rainfall response but may also be used to inform impact studies pertinent to water resource planning and management, extreme weather analysis, and agricultural production.

  6. Thomas Piketty’s Book “Capital in the Twenty-First Century”, Karl Marx and the Political Economy of the Internet

    Directory of Open Access Journals (Sweden)

    Christian Fuchs


    Full Text Available Thomas Piketty’s book Capital in the Twenty-First Century has resulted in a sustained political and academic debate about capitalism in the 21st century. This article discusses the relevance of the book in the context of Karl Marx’s works and the political economy of the Internet. It identifies three common reactions to Piketty’s book: (1) dignification; (2) denigration of the work’s integrity; (3) denial of any parallel to Marx. I argue that none of these reactions helps the task of creating a New Left that is urgently needed in a situation of sustained capitalist crisis. Marxists will certainly view Piketty’s analysis of capitalism and his political suggestions critically. I argue that they should nevertheless not dismiss them but, like Marx and Engels, aim to radicalise reform suggestions. In relation to the Internet, this paper discusses especially how insights from Piketty’s book can inform the discussion of tax avoidance by transnational Internet companies such as Google, Facebook and Amazon. For establishing an alternative, non-commercial, non-capitalist Internet, one can draw insights about institutional reforms and progressive capital taxation from Piketty that can be radicalised in order to ground radical-reformist Internet politics. “The daily struggle for reforms, for the amelioration of the condition of the workers within the framework of the existing social order, and for democratic institutions, offers to the social democracy the only means of engaging in the proletarian class war and working in the direction of the final goal - the conquest of political power and the suppression of wage labor. Between social reforms and revolution there exists for the social democracy an indissoluble tie. The struggle for reforms is its means; the social revolution, its aim” (Rosa Luxemburg 1899, 41).

  8. Application of the Intervention Mapping Framework to Develop an Integrated Twenty-first Century Core Curriculum—Part Three: Curriculum Implementation and Evaluation

    Directory of Open Access Journals (Sweden)

    Jaime A. Corvin


    Full Text Available Public health professionals have been challenged to radically reform public health training to meet the evolving demands of twenty-first century public health. Such a transformation requires a systems thinking approach with an interdisciplinary focus on problem solving, leadership, management and teamwork, technology and information, budgeting and finance, and communication. This article presents processes for implementing and evaluating a revised public health curriculum and outlines lessons learned from this initiative. To date, more than 200 students have participated in the initial pilot testing of this program. A rigorous process and outcome evaluation plan was developed and employed, and its results were used to enhance the resulting curriculum. Specifically, all instructional materials were evaluated both by the students who received them and by the faculty who presented them. With each successive pilot delivered, both enrollment and faculty involvement have increased. Through this process, the value of committed faculty, the importance of engaging learners in the evaluation of an education program, and the need to implement a curriculum that is carefully evaluated and evidence-informed have emerged. We credit our successful transformation of the Master of Public Health core to the challenge provided by the Framing the Future task force, the commitment of our College of Public Health leadership, the engagement of our faculty, and the time we allowed for the process to unfold. Ultimately, we believe this transformed curriculum will produce better trained public health professionals: interdisciplinary practitioners who can see public health challenges in new and different ways.

  9. Why 'class' is too soft a category to capture the explosiveness of social inequality at the beginning of the twenty-first century. (United States)

    Beck, Ulrich


    We can distinguish four positions on the continuing, or perhaps even increasing, relevance of the category of class at the beginning of the twenty-first century, depending on the extent to which they accord central importance to (1) the reproduction or (2) the transformation of social classes with regard to (3) the distribution of goods without bads or (4) the distribution of goods and bads. One could say that Dean Curran introduces the concept of 'risk-class' to radicalize the class distribution of risk, charting who will be able to occupy areas less exposed to risk and who will have little choice but to occupy areas exposed to the brunt of the risk society. As he mentions, it is important to note that this social structuring of the distribution of bads will be affected not only by class but also by other forms of social structuration of disadvantage, such as gender and race. In order to demonstrate that the distribution of bads is currently exacerbating class differences in life chances, however, Curran concentrates exclusively on phenomena of individual risks. In the process, he overlooks the problem of systemic risks in relation to the state, science, new corporate roles, management, the mass media, law, mobile capital and social movements; at the same time, his conceptual frame of reference does not really thematize the interdependence between individual and systemic risks. Those who reduce the problematic of risk to that of the life chances of individuals are unable to grasp the conflicting social and political logics of risk and class conflicts. Or, to put it pointedly: 'class' is too soft a category to capture the explosiveness of social inequality in world risk society. © London School of Economics and Political Science 2013.

  10. Space Geodetic Technique Co-location in Space: Simulation Results for the GRASP Mission (United States)

    Kuzmicz-Cieslak, M.; Pavlis, E. C.


    The Global Geodetic Observing System (GGOS) places very stringent requirements on the accuracy and stability of future realizations of the International Terrestrial Reference Frame (ITRF): an origin definition at 1 mm or better at epoch and a temporal stability on the order of 0.1 mm/y, with similar numbers for the scale (0.1 ppb) and orientation components. These goals were derived from the requirements of the Earth science problems that are currently the international community's highest priority. No single geodetic positioning technique can achieve this goal alone. This is due in part to the non-observability of certain attributes from a single technique. Another limitation is imposed by the extent and uniformity of the tracking network and the schedule of observational availability and number of suitable targets. The final limitation derives from the difficulty of "tying" the reference points of each technique at the same site to an accuracy that will support the GGOS goals. The future GGOS network will decisively address the ground-segment and, to a certain extent, the space-segment requirements. The JPL-proposed multi-technique mission GRASP (Geodetic Reference Antenna in Space) attempts to resolve the accurate tie between techniques by co-locating them in space, onboard a well-designed spacecraft equipped with GNSS receivers, an SLR retroreflector array, a VLBI beacon and a DORIS system. Using the anticipated performance of all four techniques at the time the GGOS network is completed (ca. 2020), we generated a number of simulated data sets for the development of a TRF. Our simulation studies examine the degree to which GRASP can improve the inter-technique "tie" issue compared to the classical approach, and the likely modus operandi for such a mission. The success of the examined scenarios is judged by the quality of the origin and scale definition of the resulting TRF.
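    Origin and scale agreement between two reference-frame realizations is conventionally quantified by fitting a similarity (Helmert) transformation between them. The sketch below fits only the translation and scale terms of that transformation on synthetic coordinates; the station positions, offsets, and the rotation-free simplification are illustrative assumptions, not the GRASP simulation setup:

```python
import numpy as np

def fit_origin_scale(x_ref, x_est):
    """Least-squares fit of translation t (origin offset) and scale s in
    x_est = (1 + s) * x_ref + t, a rotation-free subset of the 7-parameter
    Helmert transformation used to compare reference-frame realizations."""
    n = x_ref.shape[0]
    A = np.zeros((3 * n, 4))
    A[0::3, 0] = 1.0                      # tx acts on x components
    A[1::3, 1] = 1.0                      # ty acts on y components
    A[2::3, 2] = 1.0                      # tz acts on z components
    A[:, 3] = x_ref.reshape(-1)           # scale acts on all coordinates
    b = (x_est - x_ref).reshape(-1)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p[:3], p[3]                    # (translation, scale)

rng = np.random.default_rng(0)
x_ref = rng.uniform(-6.4e6, 6.4e6, size=(10, 3))  # hypothetical station coordinates (m)
t_true = np.array([0.004, -0.002, 0.001])         # 4, -2, 1 mm origin offset
s_true = 1e-10                                    # 0.1 ppb scale offset
x_est = (1.0 + s_true) * x_ref + t_true

t_hat, s_hat = fit_origin_scale(x_ref, x_est)
```

    In a noise-free setting the fit recovers the mm-level origin offset and the 0.1 ppb scale offset exactly; with realistic per-technique noise, the scatter of these estimates is what a simulation study like the one above would report.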

  11. Research and development at the Marshall Space Flight Center Neutral Buoyancy Simulator (United States)

    Kulpa, Vygantas P.


    The Neutral Buoyancy Simulator (NBS), a facility designed to imitate zero-gravity conditions, was used to test the Experimental Assembly of Structures in Extravehicular Activity (EASE) and the Assembly Concept for Construction of Erectable Space Structures (ACCESS). Neutral Buoyancy Simulator applications and operations; early space structure research; development of the EASE/ACCESS experiments; and improvement of NBS simulation are summarized.

  12. Thermally Induced Vibrations of the Hubble Space Telescope's Solar Array 3 in a Test Simulated Space Environment (United States)

    Early, Derrick A.; Haile, William B.; Turczyn, Mark T.; Griffin, Thomas J. (Technical Monitor)


    NASA Goddard Space Flight Center and the European Space Agency (ESA) conducted a disturbance verification test on a flight Solar Array 3 (SA3) for the Hubble Space Telescope using the ESA Large Space Simulator (LSS) in Noordwijk, the Netherlands. The LSS cyclically illuminated the SA3 to simulate orbital temperature changes in a vacuum environment. Data acquisition systems measured signals from force transducers and accelerometers resulting from thermally induced vibrations of the SA3. The LSS with its seismic-mass boundary provided an excellent background environment for this test. This paper discusses the analysis performed on the measured transient SA3 responses and provides a summary of the results.

  13. Simulating Space Radiation-Induced Breast Tumor Incidence Using Automata. (United States)

    Heuskin, A C; Osseiran, A I; Tang, J; Costes, S V


    Estimating cancer risk from space radiation has been an ongoing challenge for decades, primarily because most of the reported epidemiological data on radiation-induced risks are derived from studies of atomic bomb survivors, who were exposed to an acute dose of gamma rays rather than chronic high-LET cosmic radiation. In this study, we introduce a formalism using cellular automata to model the long-term effects of ionizing radiation in the human breast for different radiation qualities. We first validated and tuned parameters for an automata-based two-stage clonal expansion model simulating the age dependence of spontaneous breast cancer incidence in an unexposed U.S. population. We then tested the impact of radiation perturbation in the model by modifying parameters to reflect both targeted and nontargeted radiation effects. Targeted effects (TE) reflect the immediate impact of radiation on a cell's DNA, with classic end points being gene mutations and cell death. They are well known and are directly derived from experimental data. In contrast, nontargeted effects (NTE) are persistent and affect both damaged and undamaged cells, are nonlinear with dose, and are not well characterized in the literature. In this study, we introduced TE in our model and compared predictions against epidemiologic data from the atomic bomb survivor cohort. TE alone are not sufficient to induce enough cancer. NTE independent of dose and lasting ∼100 days postirradiation need to be added to accurately predict the dose dependence of breast cancer induced by gamma rays. Finally, by integrating experimental relative biological effectiveness (RBE) for TE and keeping NTE (i.e., radiation-induced genomic instability) constant with dose and LET, the model predicts that the RBE for breast cancer induced by cosmic radiation would be maximal at 220 keV/μm. This approach lays the groundwork for further investigation into the impact of chronic low-dose exposure, inter-individual variation and more complex space radiation
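    The two-stage clonal expansion idea can be sketched in a few lines: normal cells become initiated, the initiated pool expands clonally, and initiated cells convert to malignant, with the first malignant conversion counted as cancer. The yearly discretization and every parameter below are purely illustrative, not the paper's calibrated automata model:

```python
import random

def tsce_incidence(n_trials, years=80, n_cells=10_000,
                   mu1=2e-5, mu2=1e-4, growth=0.08, seed=1):
    """Minimal two-stage clonal expansion (TSCE) sketch on yearly steps:
    normal cells become 'initiated' at rate mu1 per cell-year, the initiated
    pool grows clonally at net rate `growth`, and each initiated cell
    converts to malignant at rate mu2 per year.  Returns the cumulative
    fraction of simulated subjects with a malignant conversion by each age."""
    rng = random.Random(seed)
    ages_at_cancer = []
    for _ in range(n_trials):
        initiated = 0.0
        age_cancer = None
        for year in range(1, years + 1):
            initiated += n_cells * mu1                 # newly initiated cells
            initiated *= (1.0 + growth)                # clonal expansion
            p_cancer = 1.0 - (1.0 - mu2) ** initiated  # yearly conversion hazard
            if rng.random() < p_cancer:
                age_cancer = year
                break
        ages_at_cancer.append(age_cancer)
    return [sum(1 for a in ages_at_cancer if a is not None and a <= y) / n_trials
            for y in range(1, years + 1)]

cum_incidence = tsce_incidence(n_trials=500)
```

    Because the hazard grows with the expanding initiated pool, cumulative incidence rises steeply at late ages, which is the qualitative age dependence such a model is tuned to reproduce before radiation perturbations are added.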


    Directory of Open Access Journals (Sweden)

    Antônio Carlos dos Santos


    Full Text Available Much of the criticism of globalization stems from the direction it is taking. Although globalization is a dynamic, ongoing process, its advance has been unbalanced, generating political, economic and social instability in several regions of the planet. This theoretical paper presents the lack of social globalization as one of the factors that has unbalanced the dynamics of the globalization process. On the economic side, globalization is proceeding at an accelerated pace and already reaches the most distant points on Earth, while on the social side it is absent in some regions and, in others, proceeds slowly and attracts little interest. The benefits of economic globalization are worth little without social globalization. That is the challenge of the twenty-first century.

  15. Lights, camera, action research: The effects of didactic digital movie making on students' twenty-first century learning skills and science content in the middle school classroom (United States)

    Ochsner, Karl

    Students are moving away from content consumption toward content production. Short movies are uploaded onto video social networking sites and shared around the world. Unfortunately, they usually contain little to no educational value, lack a narrative, and are rarely created in the science classroom. According to the new Arizona technology standards and ISTE NET*S, along with the framework of the Partnership for 21st Century Learning Standards, our society demands that students not only learn curriculum but also think critically, solve problems effectively, and become adept at communicating and collaborating. Didactic digital movie making in the science classroom may be one way to implement these twenty-first century learning skills. An action research study using a mixed-methods approach to data collection was used to investigate whether didactic moviemaking can help eighth grade students learn physical science content while incorporating the 21st century learning skills of collaboration, communication, problem solving and critical thinking through their group production. Over a five-week period, students researched lessons, wrote scripts, acted, video recorded and edited a didactic movie that contained a narrative plot to teach a science strand from the Arizona State Standards in physical science. A pretest/posttest science content test and KWL chart were given before and after the innovation to measure content learned by the students. Students then took a 21st Century Learning Skills Student Survey to measure how much they perceived that communication, collaboration, problem solving and critical thinking were taking place during the production. An open-ended survey and a focus group of four students were used for qualitative analysis. Three science teachers used a project evaluation rubric to measure science content and production values from the movies. Triangulating the science content test, KWL chart, open-ended questions and the project evaluation rubric, it

  16. Continuum Vlasov Simulation in Four Phase-space Dimensions (United States)

    Cohen, B. I.; Banks, J. W.; Berger, R. L.; Hittinger, J. A.; Brunner, S.


    In the VALHALLA project, we are developing scalable algorithms for the continuum solution of the Vlasov-Maxwell equations in two spatial and two velocity dimensions. We use fourth-order temporal and spatial discretizations of the conservative form of the equations and a finite-volume representation to enable adaptive mesh refinement and nonlinear oscillation control [1]. The code has been implemented with and without adaptive mesh refinement, and with electromagnetic and electrostatic field solvers. A goal is to study the efficacy of continuum Vlasov simulations in four phase-space dimensions for laser-plasma interactions. We have verified the code in examples such as the two-stream instability, the weak beam-plasma instability, Landau damping, electron plasma waves with electron trapping and nonlinear frequency shifts [2] extended from 1D to 2D propagation, and light wave propagation. We will report progress on code development, computational methods, and physics applications. This work was performed under the auspices of the U.S. DOE by LLNL under contract no. DE-AC52-07NA27344. This work was funded by the Lab. Dir. Res. and Dev. Prog. at LLNL under project tracking code 08-ERD-031. [1] J.W. Banks and J.A.F. Hittinger, to appear in IEEE Trans. Plas. Sci. (Sept. 2010). [2] G.J. Morales and T.M. O'Neil, Phys. Rev. Lett. 28, 417 (1972); R.L. Dewar, Phys. Fluids 15, 712 (1972).
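    The spatial-transport substep of a split continuum Vlasov solve can be illustrated with a conservative finite-volume update, where each velocity-grid row of f(x, v) is advected at its own speed. The sketch uses first-order upwind fluxes for brevity (the project itself uses fourth-order discretizations); the conservative flux-difference form keeps total phase-space mass fixed to rounding error:

```python
import numpy as np

def advect(f, c, dt, dx):
    """One conservative finite-volume step of f_t + c f_x = 0 on a periodic
    grid, first-order upwind: f_i -= (dt/dx) * (F_{i+1/2} - F_{i-1/2})."""
    if c >= 0:
        flux = c * f                      # upwind interface flux F_{i+1/2}
    else:
        flux = c * np.roll(f, -1)
    return f - (dt / dx) * (flux - np.roll(flux, 1))

# Split-step transport of a 1D1V phase-space density f(v, x)
nx, nv = 64, 8
x = np.linspace(0.0, 1.0, nx, endpoint=False)
v = np.linspace(-1.0, 1.0, nv)
f = np.exp(-100.0 * (x[None, :] - 0.5) ** 2) * np.ones((nv, 1))
dx, dt = 1.0 / nx, 0.005                  # CFL number |v|max*dt/dx = 0.32

mass0 = f.sum() * dx
for _ in range(100):
    for j in range(nv):                   # advect each velocity row at speed v[j]
        f[j] = advect(f[j], v[j], dt, dx)
mass1 = f.sum() * dx
```

    A full solver alternates this x-transport with a v-space acceleration step driven by the self-consistent field; the same conservative update applies along v.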

  17. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator (United States)

    Curran, R. T.; Hornfeck, W. A.


    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.
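    An interpretive simulator of the kind specified here is, at its core, a fetch-decode-execute loop over the target machine's instruction set. The sketch below interprets a tiny hypothetical instruction set, not the actual SUMC ISA:

```python
def run(program, max_steps=1000):
    """Minimal interpretive-simulator sketch: fetch-decode-execute over a
    tiny hypothetical instruction set.  Registers R0..R3; each instruction
    is an (opcode, operands...) tuple."""
    regs = [0, 0, 0, 0]
    pc = 0
    for _ in range(max_steps):
        if pc >= len(program):
            break
        op, *args = program[pc]          # fetch + decode
        pc += 1
        if op == "LOADI":                # LOADI r, imm : regs[r] = imm
            regs[args[0]] = args[1]
        elif op == "ADD":                # ADD r, a, b : regs[r] = regs[a] + regs[b]
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "JNZ":                # JNZ r, target : branch if regs[r] != 0
            if regs[args[0]] != 0:
                pc = args[1]
        elif op == "HALT":
            break
        else:
            raise ValueError(f"unknown opcode {op}")
    return regs

# Sum 1..5: accumulator in R0, countdown in R1, constant -1 in R2
prog = [
    ("LOADI", 0, 0), ("LOADI", 1, 5), ("LOADI", 2, -1),
    ("ADD", 0, 0, 1),        # R0 += R1
    ("ADD", 1, 1, 2),        # R1 -= 1
    ("JNZ", 1, 3),           # loop while R1 != 0
    ("HALT",),
]
regs = run(prog)
# R0 ends at 15 (= 1+2+3+4+5)
```

    A configurable simulator in the sense of the requirements above would parameterize the register file, word size, and opcode table rather than hard-coding them.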

  18. Regional variation of carbonaceous aerosols from space and simulations (United States)

    Mukai, Sonoyo; Sano, Itaru; Nakata, Makiko; Kokhanovsky, Alexander


    effect on carbonaceous aerosols. The selected data observed by ADEOS-2/GLI and POLDER in 2003 are then treated using the Vector form Method of Successive Order of Scattering (VMSOS) for radiative transfer simulations in a semi-infinite atmosphere [2]. Finally, the retrieved optical properties of the carbonaceous aerosols are compared with numerical model simulations from SPRINTARS. Despite the limited number of case studies, it has been shown that NUV-channel data are effective for retrieval of carbonaceous aerosol properties. We therefore have to address this issue not only for detection of biomass burning plumes but also for the retrieval itself. In that case, synthetic analysis based on multi-channel and/or polarization measurements becomes practical, and the proposed procedure and results are available for feasibility studies of upcoming space missions. [1] Sano, I., Y. Okada, M. Mukai and S. Mukai, "Retrieval algorithm based on combined use of POLDER and GLI data for biomass aerosols," J. RSSJ, vol. 29, no. 1, pp. 54-59, doi:10.11440/rssj.29.54, 2009. [2] Mukai, S., M. Nakata, M. Yasumoto, I. Sano and A. Kokhanovsky, "A study of aerosol pollution episode due to agriculture biomass burning in the east-central China using satellite data," Front. Environ. Sci., vol. 3:57, doi: 10.3389/fenvs.2015.00057, 2015.

  19. Bridging the climate-induced water gap in the twenty-first century: adaptation support based on water supply, demand, adaptation and financing. (United States)

    Straatsma, Menno; Droogers, Peter; Brandsma, Jaïrus; Buytaert, Wouter; Karssenberg, Derek; Van Beek, Rens; Wada, Yoshihide; Sutanudjaja, Edwin; Vitolo, Claudia; Schmitz, Oliver; Meijer, Karen; Van Aalst, Maaike; Bierkens, Marc


    Water scarcity affects large parts of the world. Over the course of the twenty-first century, water demand is likely to increase due to population growth and associated food production, and increased economic activity, while water supply is projected to decrease in many regions due to climate change. Despite recent studies that analyze the effect of climate change on water scarcity, e.g. using climate projections under the representative concentration pathways (RCPs) of the fifth assessment report of the IPCC (AR5), decision support for closing the water gap between now and 2100 does not exist at a meaningful scale and with global coverage. In this study, we aimed (i) to assess the joint impact of climatic and socio-economic change on water scarcity, (ii) to integrate impact and potential adaptation in one workflow, (iii) to prioritize adaptation options to counteract water scarcity based on their financial, regional socio-economic and environmental implications, and (iv) to deliver all this information in an integrated, user-friendly, web-based service. To combine global coverage with local relevance, we aggregated all results for 1604 water provinces (food producing units) delineated in this study, which is five times smaller than previous food producing units. Water supply was computed using the PCR-GLOBWB hydrological and water resources model, parameterized at 5 arcminutes for the whole globe, excluding Antarctica and Greenland. We ran PCR-GLOBWB with daily forcing derived from five different GCMs from CMIP5 (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M) that were bias-corrected using observation-based WATCH data for 1960-1999. For each of the models, all four RCPs (RCP 2.6, 4.5, 6.0, and 8.5) were run, producing an ensemble of 20 future projections. The blue water supply was aggregated per month and per water province. Industrial, domestic and irrigation water demands were computed for a limited number of
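    Per water province, the gap between demand and each ensemble member's supply can be summarized as in the sketch below; the supply and demand numbers are hypothetical, not PCR-GLOBWB output:

```python
def ensemble_gap_stats(supply_runs, demand):
    """For one water province: the unmet demand (gap) under each GCM x RCP
    run, plus the ensemble mean and min-max range.  Volumes are in
    arbitrary units; negative gaps (surplus) are clipped to zero."""
    gaps = [max(demand - s, 0.0) for s in supply_runs]
    return {"gaps": gaps,
            "mean": sum(gaps) / len(gaps),
            "range": (min(gaps), max(gaps))}

# Hypothetical annual blue-water supply for one province under a
# 20-member ensemble (5 GCMs x 4 RCPs), against a fixed demand of 100
supply_runs = [104, 98, 91, 87, 101, 96, 90, 84, 99, 95,
               88, 82, 97, 93, 86, 80, 100, 94, 85, 78]
stats = ensemble_gap_stats(supply_runs, demand=100.0)
```

    Repeating this per month and per province, and letting demand evolve with the socio-economic scenario, yields the water-gap fields that adaptation options are then ranked against.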

  20. 3D space combat simulation game with artificial intelligence


    Pernička, Václav


    The goal of this thesis is to design and implement a 3D space shooter with artificial intelligence. The thesis includes a theoretical analysis of space shooters, types of artificial intelligence, and assumptions important for developing in 3D space. The game also includes a simple artificially intelligent player.

  1. Twenty-First Century Educational Theory and the Challenges of Modern Education: Appealing to the Heritage of the General Teaching Theory of the Secondary Educational Curriculum and the Learning Process (United States)

    Klarin, Mikhail V.


    The article presents an analysis of educational theory in light of the challenges confronting education in the twenty-first century. The author examines how our ideas about the methods for managing the transmission of culture, the subject of education, and the consequences of these changes for the theory of education have changed. The author…

  2. Desert Cyanobacteria under simulated space and Martian conditions (United States)

    Billi, D.; Ghelardini, P.; Onofri, S.; Cockell, C. S.; Rabbow, E.; Horneck, G.


    The environment in space and on planets such as Mars can be lethal to living organisms, and high levels of tolerance to desiccation, cold and radiation are needed for survival: rock-inhabiting cyanobacteria belonging to the genus Chroococcidiopsis can fulfil these requirements [1]. These cyanobacteria constantly appear in the most extreme and dry habitats on Earth, including the McMurdo Dry Valleys (Antarctica) and the Atacama Desert (Chile), which are considered the closest terrestrial analogs of two Martian environmental extremes: cold and aridity. In their natural environment, these cyanobacteria occupy the last refuges for life inside porous rocks or at stone-soil interfaces, where they survive in a dry, dormant state for prolonged periods. How desert strains of Chroococcidiopsis can dry without dying is only partially understood, even though experimental evidence supports the existence of an interplay between mechanisms that avoid (or limit) DNA damage and mechanisms that repair it: i) desert strains of Chroococcidiopsis mend genome fragmentation induced by ionizing radiation [2]; ii) desiccation survivors protect their genome from complete fragmentation; iii) in the dry state they survive an unattenuated Martian UV flux better than Bacillus subtilis spores [3], and even though they die following atmospheric entry after having orbited the Earth for 16 days [4], they survive simulated shock pressures up to 10 GPa [5]. Recently, additional experiments were carried out at the German Aerospace Center (DLR) in Cologne (Germany) in order to identify suitable biomarkers for investigating the survival of Chroococcidiopsis cells present in lichen-dominated communities, in view of their direct, long-term space exposure on the International Space Station (ISS) in the framework of the LIchens and Fungi Experiments (LIFE, EXPOSE-EuTEF, ESA). 
Multilayers of dried cells of strains CCMEE 134 (Beacon Valley, Antarctica) and CCMEE 123 (coastal desert, Chile), shielded by

  3. Development of automation and robotics for space via computer graphic simulation methods (United States)

    Fernandez, Ken


    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
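    A kernel common to such simulators is forward kinematics: mapping joint angles to an end-effector pose. A minimal planar sketch follows (illustrative only, not the cited simulator's modeling language):

```python
import math

def forward_kinematics(lengths, angles):
    """Planar forward kinematics for a serial arm: returns the end-effector
    (x, y) from per-link lengths and relative joint angles (radians) by
    accumulating the absolute orientation link by link."""
    x = y = 0.0
    theta = 0.0
    for L, a in zip(lengths, angles):
        theta += a
        x += L * math.cos(theta)
        y += L * math.sin(theta)
    return x, y

# Two-link arm, both joints at 90 degrees: straight up, then back along -x
x, y = forward_kinematics([1.0, 1.0], [math.pi / 2, math.pi / 2])
# End effector lands at (-1, 1)
```

    Inverse kinematics, dynamics, and control layers of a full simulator build on repeated evaluations of exactly this kind of chain.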

  4. A Path Space Extension for Robust Light Transport Simulation

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Pantaleoni, Jacopo; Jensen, Henrik Wann


    We present a new sampling space for light transport paths that makes it possible to describe Monte Carlo path integration and photon density estimation in the same framework. A key contribution of our paper is the introduction of vertex perturbations, which extends the space of paths with loosely...

  5. Global Cropland Area Database (GCAD) derived from Remote Sensing in Support of Food Security in the Twenty-first Century: Current Achievements and Future Possibilities (United States)

    Teluguntla, Pardhasaradhi G.; Thenkabail, Prasad S.; Xiong, Jun N.; Gumma, Murali Krishna; Giri, Chandra; Milesi, Cristina; Ozdogan, Mutlu; Congalton, Russ; Tilton, James; Sankey, Temuulen Tsagaan; Massey, Richard; Phalke, Aparna; Yadav, Kamini


    to biofuels (Bindraban et al., 2009), limited water resources for irrigation expansion (Turral et al., 2009), limits on agricultural intensification, loss of croplands to urbanization (Khan and Hanjra, 2008), increasing meat consumption (and associated demands on land and water) (Vinnari and Tapio, 2009), environmental infeasibility of cropland expansion (Gordon et al., 2009), and a changing climate have all put pressure on our continued ability to sustain global food security in the twenty-first century. So, how does the world continue to meet its food and nutrition needs? Solutions may come from bio-technology and precision farming; however, developments in these fields are not currently moving at rates that will ensure global food security over the next few decades. Further, careful consideration of the possible harmful effects of bio-technology is needed. We should not be looking back 30-50 years from now the way we now look back at the many mistakes made during the green revolution, when the focus was only on getting more yield per unit area. Little thought was given to the serious damage done to our natural environments, water resources, and human health by detrimental factors such as the uncontrolled use of herbicides, pesticides and nutrients, drastic groundwater mining, and the salinization of fertile soils due to over-irrigation. Currently, there is talk of a "second green revolution" or even an "evergreen revolution", but clear ideas on what these terms actually mean are still debated and evolving. One of the biggest issues not given adequate focus is the use of large quantities of water for food production. Indeed, an overwhelming proportion (60-90%) of all human water use in India goes to producing food (Falkenmark & Rockström, 2006). But such intensive water use for food production is no longer tenable due to increasing pressure for water use alternatives such as increasing urbanization

  6. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems (United States)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth


    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents a unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks
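    The integration hinges on turning orbital geometry into link events a network simulator can consume. The toy sketch below samples an idealized planar circular orbit, applies an elevation mask at a single ground station, and emits contact windows that could drive link up/down events; the planar geometry and all parameters are illustrative assumptions, not the paper's toolchain:

```python
import math

def visibility_windows(period_s, step_s, min_elev_deg=10.0,
                       alt_km=550.0, re_km=6371.0):
    """Toy bridge between orbital and network models: sample a circular
    planar orbit, mark when the satellite is above the ground station's
    elevation mask, and return contiguous (start, end) contact windows
    usable as link up/down events in a packet-level simulator."""
    r = re_km + alt_km
    gs = (re_km, 0.0)                      # ground station on the +x axis
    windows, start = [], None
    n = int(period_s / step_s)
    for i in range(n + 1):
        t = i * step_s
        ang = 2.0 * math.pi * t / period_s
        sat = (r * math.cos(ang), r * math.sin(ang))
        dx, dy = sat[0] - gs[0], sat[1] - gs[1]
        rng_km = math.hypot(dx, dy)
        # Elevation = angle of the line of sight above the local horizon
        elev = math.degrees(math.asin((dx * gs[0] + dy * gs[1])
                                      / (rng_km * re_km)))
        visible = elev >= min_elev_deg
        if visible and start is None:
            start = t
        elif not visible and start is not None:
            windows.append((start, t))
            start = None
    if start is not None:
        windows.append((start, period_s))
    return windows

windows = visibility_windows(period_s=5700.0, step_s=10.0)
```

    In a unified toolchain, a high-fidelity orbital analysis tool replaces this geometry and the network simulator schedules its link state changes at the window boundaries.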

  7. Simulating Nonlinear Dynamics of Deployable Space Structures, Phase I (United States)

    National Aeronautics and Space Administration — To support NASA's vital interest in developing much larger solar array structures over the next 20 years, MotionPort LLC's Phase I SBIR project will strengthen...

  8. Issues in visual support to real-time space system simulation solved in the Systems Engineering Simulator (United States)

    Yuen, Vincent K.


    The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible, windows and CCTVs, with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.

  9. Proceedings of the Twenty-First NASA Propagation Experimenters Meeting (NAPEX XXI) and the Advanced Communications Technology Satellite (ACTS) Propagation Studies Miniworkshop (United States)

    Golshan, Nasser (Editor)


    The NASA Propagation Experimenters (NAPEX) meeting is convened each year to discuss studies supported by the NASA Propagation Program. Representatives from the satellite communications industry, academia and government who have an interest in space-ground radio wave propagation are invited to NAPEX meetings for discussions and exchange of information. The reports delivered at this meeting by program managers and investigators present recent activities and future plans. This forum provides an opportunity for peer discussion of work in progress, timely dissemination of propagation results, and close interaction with the satellite communications industry.

  10. Private ground infrastructures for space exploration missions simulations (United States)

    Souchier, Alain


    The Mars Society, a private non-profit organization devoted to promoting the exploration of the red planet, decided to implement simulated Mars habitats at two locations on Earth: in northern Canada on the rim of a meteorite crater (2000), and in the Utah desert in the US, the site of a past Jurassic sea (2001). These habitats were built with large similarities to the habitats actually planned for the first Mars exploration missions. Participation is open to anyone, whether proposing experiments or wishing only to take part as a crew member. Participants come from different organizations: the Mars Society, universities, and experimenters working with NASA or ESA. The general philosophy of the work conducted is not to do innovative scientific work in the field, but to learn how scientific work is affected or modified by the simulation conditions. Outside activities are conducted in simulated spacesuits that limit the experimenter's abilities. Experiments on technology and procedures are also conducted, as well as studies of crew psychology and behaviour.

  11. Space Weathering Evolution on Airless Bodies - Laboratory Simulations with Olivine

    Czech Academy of Sciences Publication Activity Database

    Kohout, Tomáš; Čuda, J.; Bradley, T.; Britt, D.; Filip, J.; Tuček, J.; Malina, O.; Kašlík, J.; Šišková, K.; Zbořil, R.


    Roč. 45, č. 9 (2013), s. 25-26 ISSN 0002-7537. [Annual meeting of the Division for Planetary Sciences of the American Astronomical Society /45./. 06.10.2013-11.10.2013, Denver] Institutional support: RVO:67985831 Keywords : space weathering * asteroid * Moon * olivine Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  12. Space-Charge-Limited Emission Models for Particle Simulation (United States)

    Verboncoeur, J. P.; Cartwright, K. L.; Murphy, T.


    Space-charge-limited (SCL) emission of electrons from various materials is a common method of generating the high current beams required to drive high power microwave (HPM) sources. In the SCL emission process, sufficient space charge is extracted from a surface, often of complicated geometry, to drive the electric field normal to the surface close to zero. The emitted current is highly dominated by space charge effects as well as ambient fields near the surface. In this work, we consider computational models for the macroscopic SCL emission process including application of Gauss's law and the Child-Langmuir law for space-charge-limited emission. Models are described for ideal conductors, lossy conductors, and dielectrics. Also considered is the discretization of these models, and the implications for the emission physics. Previous work on primary and dual-cell emission models [Watrous et al., Phys. Plasmas 8, 289-296 (2001)] is reexamined, and aspects of the performance, including fidelity and noise properties, are improved. Models for one-dimensional diodes are considered, as well as multidimensional emitting surfaces, which include corners and transverse fields.
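
    The Child-Langmuir law cited above can be evaluated directly. As an illustrative sketch (not code from the paper; the voltage and gap values are assumed), the planar-diode form J = (4ε0/9)·√(2e/m)·V^(3/2)/d² gives the space-charge-limited current density:

    ```python
    import math

    EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
    E_CHARGE = 1.602176634e-19  # elementary charge, C
    M_E = 9.1093837015e-31      # electron mass, kg

    def child_langmuir_j(voltage_v, gap_m):
        """Space-charge-limited current density (A/m^2) for an ideal planar
        vacuum diode: J = (4*eps0/9) * sqrt(2e/m) * V^(3/2) / d^2."""
        return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
            * voltage_v ** 1.5 / gap_m ** 2

    # assumed example: 1-cm planar gap at 100 kV -> roughly 7.4e5 A/m^2
    j = child_langmuir_j(100e3, 0.01)
    ```

    Note the V^(3/2)/d² scaling: halving the gap quadruples the limiting current density, which is why emission from corners and small features needs the multidimensional treatment discussed in the abstract.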

  13. Simulation analysis of photometric data for attitude estimation of unresolved space objects (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang


    Acquiring attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, from ground-based optical observations is a challenge for space surveillance. In this paper, a method is proposed to estimate the attitude state of a space object (SO) through simulation analysis of photometric data in different attitude states. A shape model of the object was established and the parameters of the BRDF model were determined, yielding a photometric model of the space object. The photometric data of space objects in different states were then analyzed by simulation, and the regular characteristics of the photometric curves were summarized. The simulation results show that the photometric characteristics uniquely constrain attitude inversion, providing a new approach to space object identification.
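
    As a hedged illustration of how attitude imprints on a photometric curve, the sketch below uses a toy Lambertian flat plate (not the BRDF model of the paper; the geometry, spin period, and sun/observer directions are all invented): brightness is proportional to the product of the sun-incidence and observer-view cosines, clipped at zero.

    ```python
    import math

    def plate_lightcurve(n_samples, sun_dir_deg, obs_dir_deg, spin_period_s, dt_s):
        """Toy light curve of a flat Lambertian plate spinning about an axis
        perpendicular to the sun/observer plane; all directions are in-plane
        angles. Brightness ~ max(0, cos(sun angle)) * max(0, cos(view angle))."""
        curve = []
        for k in range(n_samples):
            phase_deg = 360.0 * (k * dt_s) / spin_period_s  # facet normal angle
            cos_sun = math.cos(math.radians(phase_deg - sun_dir_deg))
            cos_obs = math.cos(math.radians(phase_deg - obs_dir_deg))
            curve.append(max(0.0, cos_sun) * max(0.0, cos_obs))
        return curve

    # assumed geometry: sun at 0 deg, observer at 30 deg, 60 s spin period
    lc = plate_lightcurve(360, sun_dir_deg=0.0, obs_dir_deg=30.0,
                          spin_period_s=60.0, dt_s=60.0 / 360.0)
    ```

    Even this toy model shows the attitude-dependent features the abstract relies on: the glint peaks midway between the sun and observer directions and the curve is dark whenever the facet faces away from either.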

  14. Proceedings of the Twenty-First NASA Propagation Experiments Meeting (NAPEX XXI) and the Advanced Communications Technology Satellite (ACTS) Propagation Studies Miniworkshop (United States)

    Golshan, Nasser (Editor)


    The NASA Propagation Experimenters (NAPEX) meeting is convened each year to discuss studies supported by the NASA Propagation Program. Representatives from the satellite communications industry, academia and government who have an interest in space-ground radio wave propagation are invited to NAPEX meetings for discussions and exchange of information. The reports delivered at this meeting by program managers and investigators present recent activities and future plans. This forum provides an opportunity for peer discussion of work in progress, timely dissemination of propagation results, and close interaction with the satellite communications industry. NAPEX XXI took place in El Segundo, California on June 11-12, 1997 and consisted of three sessions. Session I, entitled "ACTS Propagation Study Results & Outcome," covered the results of 20 station-years of Ka-band radio-wave propagation experiments. Session II, "Ka-band Propagation Studies and Models," provided the latest developments in modeling and analysis of experimental results about radio wave propagation phenomena for design of Ka-band satellite communications systems. Session III, "Propagation Research Topics," covered a diverse range of propagation topics of interest to the space community, including overviews of handbooks and databases on radio wave propagation. The ACTS Propagation Studies miniworkshop was held on June 13, 1997 and consisted of a technical session in the morning and a plenary session in the afternoon. The morning session covered updates on the status of the ACTS Project & Propagation Program, engineering support for ACTS Propagation Terminals, and the Data Center. The plenary session made specific recommendations for the future direction of the program.

  15. Language Simulations: The Blending Space for Writing and Critical Thinking (United States)

    Kovalik, Doina L.; Kovalik, Ludovic M.


    This article describes a language simulation involving six distinct phases: an in-class quick response, a card game, individual research, a classroom debate, a debriefing session, and an argumentative essay. An analysis of student artifacts--quick-response writings and final essays, respectively, both addressing the definition of liberty in a…

  16. The General-Use Nodal Network Solver (GUNNS) Modeling Package for Space Vehicle Flow System Simulation (United States)

    Harvey, Jason; Moore, Michael


    The General-Use Nodal Network Solver (GUNNS) is a modeling software package that combines nodal analysis and the hydraulic-electric analogy to simulate fluid, electrical, and thermal flow systems. GUNNS is developed by L-3 Communications under the TS21 (Training Systems for the 21st Century) project for NASA Johnson Space Center (JSC), primarily for use in space vehicle training simulators at JSC. It has sufficient compactness and fidelity to model the fluid, electrical, and thermal aspects of space vehicles in real-time simulations running on commodity workstations, for vehicle crew and flight controller training. It has a reusable and flexible component and system design, and a Graphical User Interface (GUI), providing capability for rapid GUI-based simulator development, ease of maintenance, and associated cost savings. GUNNS is optimized for NASA's Trick simulation environment, but can be run independently of Trick.
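
    As a rough illustration of the nodal analysis and hydraulic-electric analogy that GUNNS builds on (the sketch below is generic, not GUNNS code; the link format, node numbering, and example network are invented), potentials at the nodes of a conductance network can be found by assembling and solving the nodal equations G·v = i:

    ```python
    def solve_nodal(num_nodes, links, injections):
        """Nodal analysis with node 0 as the reference (potential 0).
        links: (node_a, node_b, conductance) tuples; injections: {node: flow}.
        In the hydraulic-electric analogy, potential stands for pressure or
        voltage and flow for volumetric or electric current.
        Returns potentials of nodes 1..num_nodes-1."""
        n = num_nodes - 1  # unknowns, reference node excluded
        G = [[0.0] * n for _ in range(n)]
        b = [injections.get(i + 1, 0.0) for i in range(n)]
        for a, c, g in links:
            for node, other in ((a, c), (c, a)):
                if node == 0:
                    continue
                G[node - 1][node - 1] += g
                if other != 0:
                    G[node - 1][other - 1] -= g
        # Gaussian elimination (no pivoting; G is diagonally dominant)
        for col in range(n):
            for row in range(col + 1, n):
                f = G[row][col] / G[col][col]
                for k in range(col, n):
                    G[row][k] -= f * G[col][k]
                b[row] -= f * b[col]
        v = [0.0] * n
        for row in reversed(range(n)):
            s = b[row] - sum(G[row][k] * v[k] for k in range(row + 1, n))
            v[row] = s / G[row][row]
        return v

    # three-node loop; a source pushes 1.0 unit of flow into node 1
    potentials = solve_nodal(3, [(0, 1, 2.0), (1, 2, 1.0), (2, 0, 1.0)], {1: 1.0})
    ```

    The same assembly-and-solve step works unchanged whether the conductances represent fluid, electrical, or thermal links, which is the core of the analogy.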

  17. Navigating the Problem Space: The Medium of Simulation Games in the Teaching of History (United States)

    McCall, Jeremiah


    Simulation games can play a critical role in enabling students to navigate the problem spaces of the past while simultaneously critiquing the models designers offer to represent those problem spaces. There is much to be gained through their use. This includes rich opportunities for students to engage the past as independent historians; to consider…

  18. Jumbo Space Environment Simulation and Spacecraft Charging Chamber Characterization (United States)


    probes for Jumbo. Both probes are produced by Trek Inc. Trek probe model 370 is capable of -3 to 3 kV and has an extremely fast 50 µs/kV response to...changing surface potentials. Trek probe 341B is capable of -20 to 20 kV with a 200 µs/kV response time. During our charging experiments the probe sits...

  19. Simulation of space charge effects and transition crossing in the Fermilab Booster

    International Nuclear Information System (INIS)

    Lucas, P.; MacLachlan, J.


    The longitudinal phase space program ESME, modified for space charge and wall impedance effects, has been used to simulate transition crossing in the Fermilab Booster. The simulations yield results in reasonable quantitative agreement with measured parameters. They further indicate that a transition jump scheme currently under construction will significantly reduce emittance growth, while attempts to alter machine impedance are less obviously beneficial. In addition to presenting results, this paper points out a serious difficulty in the space charge calculation related to statistical fluctuations: false indications of emittance growth can appear if care is not taken to minimize this problem.
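
    The statistical-fluctuation difficulty noted above is generic to macroparticle codes, not specific to ESME. A minimal sketch (toy model, uniform bunch, invented parameters) shows how sampling noise in a binned line density, which feeds directly into the computed space-charge voltage, falls off only as 1/√N with the number of macroparticles:

    ```python
    import random

    def line_density_noise(num_particles, num_bins, seed=1):
        """Relative rms fluctuation of the binned line density of a uniform
        bunch of macroparticles. This sampling noise enters the space-charge
        kick and, if too large, can mimic emittance growth."""
        rng = random.Random(seed)
        counts = [0] * num_bins
        for _ in range(num_particles):
            counts[rng.randrange(num_bins)] += 1
        mean = num_particles / num_bins
        var = sum((c - mean) ** 2 for c in counts) / num_bins
        return var ** 0.5 / mean

    noise_small = line_density_noise(10_000, 100)     # ~ 1/sqrt(100)  = 0.1
    noise_large = line_density_noise(1_000_000, 100)  # ~ 1/sqrt(10000) = 0.01
    ```

    Increasing the macroparticle count by 100× only reduces the density noise by about 10×, which is why smoothing or careful binning is needed to avoid false emittance growth.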

  20. Behavior of ionic conducting IPN actuators in simulated space conditions (United States)

    Fannir, Adelyne; Plesse, Cédric; Nguyen, Giao T. M.; Laurent, Elisabeth; Cadiergues, Laurent; Vidal, Frédéric


    The presentation focuses on the performance of flexible all-polymer electroactive actuators under space-hazardous environmental factors in laboratory conditions. These bending actuators are based on high molecular weight nitrile butadiene rubber (NBR), a poly(ethylene oxide) (PEO) derivative, and poly(3,4-ethylenedioxythiophene) (PEDOT). The electroactive PEDOT is embedded within the PEO/NBR membrane, which is subsequently swollen with an ionic liquid as electrolyte. Actuators were submitted to thermal cycling between -25 °C and 60 °C under vacuum (2.4 × 10⁻⁸ mbar) and to ionizing gamma radiation at 210 rad/h for 100 h. Actuators were characterized before and after this simulated space-environment ageing. In particular, the viscoelastic properties and mechanical resistance of the materials were determined by dynamic mechanical analysis and tensile tests, and the evolution of actuation properties such as strain and output force was characterized as well. Long-term vacuum exposure, freezing temperatures, and gamma radiation do not significantly affect the thermomechanical properties of the conducting IPN actuators; only a slight decrease in actuation performance has been observed.

  1. Simulation and analysis of tape spring for deployed space structures (United States)

    Chang, Wei; Cao, DongJing; Lian, MinLong


    The tape spring is an open cylindrical shell structure whose mechanical properties are significantly affected by changes in its geometrical parameters, yet few studies have examined this influence. The bending process of a single tape spring was simulated in simulation software. The variations of critical moment, unfolding moment, and maximum strain energy during bending were investigated, and the effects of the cross-section's subtended angle, thickness, and length on the driving capability of the simple tape spring were studied through these parameters. Results show that, during bending of a single tape spring, the driving capability and disturbance-resisting capacity grow with increasing section angle, decrease with increasing length, and grow with increasing thickness. The research provides a reference for improving the kinematic accuracy and reliability of deployable structures.

  2. Rare event simulation in finite-infinite dimensional space

    International Nuclear Information System (INIS)

    Au, Siu-Kui; Patelli, Edoardo


    Modern engineering systems are becoming increasingly complex. Assessing their risk by simulation is intimately related to the efficient generation of rare failure events. Subset Simulation is an advanced Monte Carlo method for risk assessment and has been applied in different disciplines. Pivotal to its success is the efficient generation of conditional failure samples, which is generally non-trivial. Conventionally an independent-component Markov Chain Monte Carlo (MCMC) algorithm is used, which is applicable to high dimensional problems (i.e., a large number of random variables) without suffering from the ‘curse of dimensionality’. Experience suggests that the algorithm may perform even better for high dimensional problems. Motivated by this, for any given problem we construct an equivalent problem where each random variable is represented by an arbitrary (hence possibly infinite) number of ‘hidden’ variables. We study analytically the limiting behavior of the algorithm as the number of hidden variables increases indefinitely. This leads to a new algorithm that is more generic and offers greater flexibility and control. It coincides with an algorithm recently suggested by independent researchers, in which a joint Gaussian distribution is imposed between the current sample and the candidate. The present work provides theoretical reasoning and insights into the algorithm.
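
    The joint-Gaussian proposal mentioned at the end of the abstract can be sketched in a few lines. The following is a generic illustration under assumed parameters (the correlation rho and the toy failure region are invented), not the authors' implementation:

    ```python
    import math
    import random

    random.seed(0)

    def conditional_step(x, in_failure, rho=0.8):
        """One MCMC step with the joint-Gaussian proposal used for conditional
        sampling in Subset Simulation: candidate_i = rho*x_i + sqrt(1-rho^2)*xi_i
        with xi_i ~ N(0,1). The pair (x, candidate) is jointly Gaussian with
        correlation rho, so the standard normal distribution is left invariant;
        the candidate is kept only if it stays in the conditional failure region."""
        s = math.sqrt(1.0 - rho * rho)
        candidate = [rho * xi + s * random.gauss(0.0, 1.0) for xi in x]
        return candidate if in_failure(candidate) else x

    # hypothetical failure region {x : x[0] > 1}; start inside and walk around
    sample = [1.5, 0.0]
    for _ in range(200):
        sample = conditional_step(sample, lambda y: y[0] > 1.0)
    ```

    The single knob rho trades off step size against acceptance rate, which is the "greater flexibility and control" the abstract refers to: rho near 1 gives small, frequently accepted moves, while small rho gives bold, rarely accepted ones.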


    International Nuclear Information System (INIS)



    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effects is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It is becoming necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.


    Energy Technology Data Exchange (ETDEWEB)



    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effects is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It is becoming necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.

  5. Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser (United States)

    Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu


    Cleaning space debris with lasers is an active topic in space security research, and impulse characteristics are the basis of laser debris removal. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris under multi-pulse irradiation is established based on the area matrix method. The calculation of impulse and impulsive moment under multi-pulse irradiation is given, and the accumulation of total impulse over the pulse train is analyzed. Taking a typical non-planar debris object (a cube) as an example, the impulse characteristics of space debris irradiated by a multi-pulse laser are simulated and analyzed, and the effects of initial angular velocity, spot size and pulse frequency on the impulse characteristics are investigated.
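
    A minimal sketch of how total impulse accumulates over a pulse train: this is a generic momentum-coupling estimate, not the authors' area-matrix method, and the coupling coefficient, pulse energy, and illuminated fraction are assumed values for illustration.

    ```python
    def total_impulse(pulse_energy_j, coupling_ns_per_j, num_pulses,
                      illuminated_fraction=1.0):
        """Total impulse (N*s) delivered by a train of identical laser pulses,
        using a momentum-coupling coefficient C_m (N*s per joule of delivered
        energy). Coupling coefficients of order 1e-5 N*s/J are often quoted
        for metallic targets (assumed here, not taken from the paper)."""
        return pulse_energy_j * coupling_ns_per_j * illuminated_fraction * num_pulses

    # assumed example: 50 pulses of 100 J with C_m = 1e-5 N*s/J
    imp = total_impulse(100.0, 1.0e-5, num_pulses=50)
    ```

    For a rotating, irregular target the illuminated fraction and the lever arm of each pulse change from shot to shot, which is exactly what the paper's area-matrix method accounts for.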

  6. Electrical behaviour of a silicone elastomer under simulated space environment

    International Nuclear Information System (INIS)

    Roggero, A; Dantras, E; Paulmier, T; Rejsek-Riba, V; Tonon, C; Dagras, S; Balcon, N; Payan, D


    The electrical behavior of a space-used silicone elastomer was characterized using surface potential decay and dynamic dielectric spectroscopy techniques. In both cases, the dielectric manifestation of the glass transition (dipole orientation) and a charge transport phenomenon were observed. An unexpected linear increase of the surface potential with temperature was observed around Tg in thermally stimulated potential decay experiments, due to molecular mobility limiting dipolar orientation on the one hand, and 3D thermal expansion reducing the material's capacitance on the other. At higher temperatures, the charge transport process, believed to be thermally activated electron hopping with an activation energy of about 0.4 eV, was studied with and without the silica and iron oxide fillers present in the commercial material. These fillers were found to play a preponderant role in the low-frequency electrical conductivity of this silicone elastomer, probably through a Maxwell–Wagner–Sillars relaxation phenomenon. (paper)

  7. 3D Simulations of Space Charge Effects in Particle Beams

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A


    For the first time, it is possible to calculate the complicated three-dimensional proton accelerator structures at the Paul Scherrer Institut (PSI). Under consideration are external and self effects, arising from guiding and space-charge forces. This thesis has as its theme the design, implementation and validation of a tracking program for charged particles in accelerator structures. This work forms part of the discipline of Computational Science and Engineering (CSE), more specifically computational accelerator modelling. The physical model is based on the collisionless Vlasov-Maxwell theory, justified by the low density (∼10⁹ protons/cm³) of the beam and of the residual gas. The probability of large angle scattering between the protons and the residual gas is then sufficiently low, as can be estimated by considering the mean free path and the total distance a particle travels in the accelerator structure. (author)

  8. Monte Carlo simulations of Microdosimetry for Space Research at FAIR

    International Nuclear Information System (INIS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus


    The exposure to high charge and energy (HZE) particles is one of the major concerns for humans during their missions in space. As radiation effects essentially depend on the charge, mass and energy of cosmic-ray particles, the radiation quality has to be investigated, e.g. by means of microdosimetry measurements on board a spacecraft. We benchmark the electromagnetic models of the Geant4 toolkit against microdosimetry data obtained with a walled Tissue Equivalent Proportional Counter (TEPC) in beams of HZE particles. Our MCHIT model is able to reproduce in general the response functions and microdosimetry variables for nuclear beams from He to Fe with energies of 80–400 MeV per nucleon.

  9. 3D Simulations of Space Charge Effects in Particle Beams

    International Nuclear Information System (INIS)

    Adelmann, A.


    For the first time, it is possible to calculate the complicated three-dimensional proton accelerator structures at the Paul Scherrer Institut (PSI). Under consideration are external and self effects, arising from guiding and space-charge forces. This thesis has as its theme the design, implementation and validation of a tracking program for charged particles in accelerator structures. This work forms part of the discipline of Computational Science and Engineering (CSE), more specifically computational accelerator modelling. The physical model is based on the collisionless Vlasov-Maxwell theory, justified by the low density (∼10⁹ protons/cm³) of the beam and of the residual gas. The probability of large angle scattering between the protons and the residual gas is then sufficiently low, as can be estimated by considering the mean free path and the total distance a particle travels in the accelerator structure. (author)

  10. Altitude simulation facility for testing large space motors (United States)

    Katz, U.; Lustig, J.; Cohen, Y.; Malkin, I.


    This work describes the design of an altitude simulation facility for testing the AKM motor installed in the 'Ofeq' satellite launcher. The facility, which is controlled by a computer, consists of a diffuser and a single-stage ejector fed with preheated air. The calculations of performance and dimensions of the gas extraction system were conducted according to a one-dimensional analysis. Tests were carried out on a small-scale model of the facility in order to examine the design concept, then the full-scale facility was constructed and operated. There was good agreement among the results obtained from the small-scale facility, from the full-scale facility, and from calculations.

  11. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL) (United States)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.


    NASA's planned Lunar missions will involve multiple NASA centers, each with a specific role and specialization. In this vision, the Constellation Program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs and control centers interacting with each other over a broadband network to perform test and verification of mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems of various parameter sets can be simulated.
Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In

  12. Performing the comic side of bodily abjection: A study of twenty-first century female stand-up comedy in a multi-cultural and multi-racial Britain


    Blunden, Pamela


    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This thesis is a socio-cultural study of the development of female stand-up comedy in the first decade of the twenty-first century within a multi-racial and multi-cultural Britain. It also engages with the theory and practice of performance and asks the question: ‘In what ways can it be said that female stand-up comics perform the comic side of bodily abjection?’ This question is applied to t...

  13. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.


    There is growing concern over possible severe societal consequences of adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to, or larger than, that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  14. Twenty-First-Century Aerial Mining (United States)


    blockade (fig. 3).19 It has two parallel inbound and outbound shipping channels, each 1,200 feet wide with a dredged depth averaging 40 feet. East...Sicily is a large island, Operation Husky required a staggering logistical effort. Even had substantial losses occurred, Allied forces possessed...partially dependent on maritime logistics for trade and support to military operations, the renewed capability to deploy mines while maintaining

  15. Twenty-First Century Pathologists' Advocacy. (United States)

    Allen, Timothy Craig


    Pathologists' advocacy plays a central role in the establishment of continuously improving patient care quality and patient safety, and in the maintenance and progress of pathology as a profession. Pathology advocacy's primary goal is the betterment of patient safety and quality medical care; however, payment is a necessary and appropriate component of both, and has a central role in advocacy. Now is the time to become involved in pathology advocacy; the Medicare Access and Children's Health Insurance Program (CHIP) Reauthorization Act of 2015 (MACRA) and the Protecting Access to Medicare Act of 2014 (PAMA) are 2 of the most consequential pieces of legislation impacting the pathology and laboratory industry in the last 20 years. Another current issue of far-reaching impact for pathologists is balance billing, yet many pathologists have little or no understanding of it. Pathologists at all stages of their careers, and in every professional setting, need to participate. Academic pathologists have a special obligation: if not to become directly involved in advocacy, then at least to maintain a broad and current understanding of these issues and of pathologists' need and responsibility to engage in advocacy efforts, in order to teach residents the place and value of advocacy as an inseparable and indispensable component of their professional responsibilities.

  16. Departmentalization and Twenty-First Century Skills (United States)

    Watts, Toy Coles


    The purpose of this study was to investigate the relationship between school organizational style and student outcomes. The research questions that guided this study were, "Is there a difference in mathematical performance of fourth graders who receive departmentalized instruction as compared to fourth grade students who receive…

  17. Educating for the Twenty-First Century (United States)

    Ramaley, Judith A.


    In his first inaugural speech, President Obama declared that "our schools fail too many" and an essential component of laying "a new foundation for growth" will be "to transform our schools and colleges and universities to meet the demands of a new age." Concerns about our nation's position in the global education race have led to a focus on…

  18. The politics of space mining - An account of a simulation game (United States)

    Paikowsky, Deganit; Tzezana, Roey


    Celestial bodies such as the Moon and asteroids contain materials and precious metals that are valuable for human activity on Earth and beyond. Space mining has mainly been relegated to the realm of science fiction and has not been treated seriously by the international community. Private industry is beginning to mobilize toward space mining, and success on this front would have a major impact on all nations. We present in this paper a review of current space mining ventures and of the international legislation that could stand in their way, or aid them in their mission. Following that, we present the results of a role-playing simulation in which the roles of several important nations were played by students of international relations. The results of the simulation are used as a basis for forecasting the potential initial responses of the world's nations to a successful space mining operation in the future.

  19. Benchmark of Space Charge Simulations and Comparison with Experimental Results for High Intensity, Low Energy Accelerators

    CERN Document Server

    Cousineau, Sarah M


    Space charge effects are a major contributor to beam halo and emittance growth leading to beam loss in high intensity, low energy accelerators. As future accelerators strive towards unprecedented levels of beam intensity and beam loss control, a more comprehensive understanding of space charge effects is required. A wealth of simulation tools has been developed for modeling beams in linacs and rings, and with the growing availability of high-speed computing systems, computationally expensive problems that were inconceivable a decade ago are now being handled with relative ease. This has opened the field for realistic simulations of space charge effects, including detailed benchmarks with experimental data. A great deal of effort is being focused in this direction, and several recent benchmark studies have produced remarkably successful results. This paper reviews the achievements in space charge benchmarking in the last few years, and discusses the challenges that remain.

  20. Designing a Distributed Space Systems Simulation in Accordance with the Simulation Interoperability Standards Organization (SISO) (United States)

    Cowen, Benjamin


    Simulations are essential for engineering design. These virtual environments provide characteristic data to scientists and engineers so they can understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code that models a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution, to develop source code for communication between federates, and to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability, and participants communicated throughout the semester to work out this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot, and Mobile ISRU Plant) as well as for transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit; the materials it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will engineer its own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper focuses on the Lunar Rover federate.
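
    The federate interaction pattern can be illustrated with a toy in-process message bus standing in for the HLA runtime infrastructure. This is a hedged sketch only; none of the names below are the TrickHLA API, and a real federation exchanges interactions through the RTI, not a Python queue:

```python
import queue
import threading

# Toy sketch of a federate interaction (not the HLA/TrickHLA API):
# the Lunar Shuttle "federate" publishes cargo-delivery interactions
# and the Lunar Rover "federate" subscribes and reacts to them.
bus = queue.Queue()  # stands in for the federation's runtime infrastructure


def shuttle():
    # Publish one interaction per delivered item, then signal shutdown.
    for load in ("fuel", "cargo"):
        bus.put(("CargoDelivered", load))
    bus.put(("Shutdown", None))


def rover(received):
    # Receive interactions until the shutdown sentinel arrives.
    while True:
        kind, payload = bus.get()
        if kind == "Shutdown":
            break
        received.append(payload)


received = []
t1 = threading.Thread(target=shuttle)
t2 = threading.Thread(target=rover, args=(received,))
t1.start(); t2.start(); t1.join(); t2.join()
print(received)  # ['fuel', 'cargo']
```

    The FIFO queue guarantees the rover sees interactions in publication order, which is the property a time-managed federation provides far more generally.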

  1. Application of the Intervention Mapping Framework to Develop an Integrated Twenty-first Century Core Curriculum-Part Two: Translation of MPH Core Competencies into an Integrated Theory-Based Core Curriculum. (United States)

    Corvin, Jaime A; DeBate, Rita; Wolfe-Quintero, Kate; Petersen, Donna J


    In the twenty-first century, the dynamics of health and health care are changing, necessitating a commitment to revising traditional public health curricula to better meet present day challenges. This article describes how the College of Public Health at the University of South Florida utilized the Intervention Mapping framework to translate revised core competencies into an integrated, theory-driven core curriculum to meet the training needs of the twenty-first century public health scholar and practitioner. This process resulted in the development of four sequenced courses: History and Systems of Public Health and Population Assessment I delivered in the first semester and Population Assessment II and Translation to Practice delivered in the second semester. While the transformation process, moving from traditional public health core content to an integrated and innovative curriculum, is a challenging and daunting task, Intervention Mapping provides the ideal framework for guiding this process. Intervention mapping walks the curriculum developers from the broad goals and objectives to the finite details of a lesson plan. Throughout this process, critical lessons were learned, including the importance of being open to new ideologies and frameworks and the critical need to involve key-stakeholders in every step of the decision-making process to ensure the sustainability of the resulting integrated and theory-based curriculum. Ultimately, as a stronger curriculum emerged, the developers and instructors themselves were changed, fostering a stronger public health workforce from within.

  2. Application of the Intervention Mapping Framework to Develop an Integrated Twenty-First Century Core Curriculum-Part 1: Mobilizing the Community to Revise the Masters of Public Health Core Competencies. (United States)

    DeBate, Rita; Corvin, Jaime A; Wolfe-Quintero, Kate; Petersen, Donna J


    Twenty-first century health challenges have significantly altered the expanding role and functions of public health professionals. Guided by a call from the Association of Schools and Programs of Public Health's (ASPPH) and the Framing the Future: The Second 100 Years of Education for Public Health report to adopt new and innovative approaches to prepare public health leaders, the University of South Florida College of Public Health aimed to self-assess the current Masters of Public Health (MPH) core curriculum with regard to preparing students to meet twenty-first century public health challenges. This paper describes how Intervention Mapping was employed as a framework to increase readiness and mobilize the COPH community for curricular change. Intervention Mapping provides an ideal framework, allowing organizations to access capacity, specify goals, and guide the change process from curriculum development to implementation and evaluation of competency-driven programs. The steps outlined in this paper resulted in a final set of revised MPH core competencies that are interdisciplinary in nature and fulfill the emergent needs to address changing trends in both public health education and challenges in population health approaches. Ultimately, the competencies developed through this process were agreed upon by the entire College of Public Health faculty, signaling one college's readiness for change, while providing the impetus to revolutionize the delivery of public health education at the University of South Florida.

  3. Theory and Simulation of the Physics of Space Charge Dominated Beams

    International Nuclear Information System (INIS)

    Haber, Irving


    This report describes modeling of intense electron and ion beams in the space charge dominated regime. Space charge collective modes play an important role in the transport of intense beams over long distances. These modes were first observed in particle-in-cell simulations. The work presented here is closely tied to the University of Maryland Electron Ring (UMER) experiment and has application to accelerators for heavy ion beam fusion

  4. Monte Carlo simulation of a medical linear accelerator for generation of phase spaces

    International Nuclear Information System (INIS)

    Oliveira, Alex C.H.; Santana, Marcelo G.; Lima, Fernando R.A.; Vieira, Jose W.


    Radiotherapy uses various techniques and equipment for the local treatment of cancer. The equipment most often used in radiotherapy for patient irradiation is the linear accelerator (Linac), which produces beams of X-rays in the range of 5-30 MeV. Among the many algorithms developed over recent years for the evaluation of dose distributions in radiotherapy planning, those based on Monte Carlo (MC) methods have proven very promising in terms of accuracy by providing more realistic results. MC methods allow simulating the transport of ionizing radiation in complex configurations, such as detectors, Linacs, and phantoms. MC simulations for applications in radiotherapy are divided into two parts. In the first, the production of the radiation beam by the Linac is simulated and the phase space is generated. The phase space contains information such as the energy, position, and direction of millions of particles (photons, electrons, positrons). In the second part, the transport of particles (sampled from the phase space) through a given configuration of the irradiation field is simulated to assess the dose distribution in the patient (or phantom). The objective of this work is to create a computational model of a 6 MeV Linac using the MC code Geant4 for the generation of phase spaces. From the phase space, information was obtained to assess beam quality (photon and electron spectra and the two-dimensional distribution of energy) and to analyze the physical processes involved in producing the beam. (author)
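
    The second part of the workflow, sampling particles from a stored phase space, can be sketched in a few lines. The record layout and the spectrum binning below are illustrative assumptions, not the IAEA phase-space file format or the authors' Geant4 code:

```python
import random
from dataclasses import dataclass


@dataclass
class PhaseSpaceParticle:
    # Hypothetical record layout; real phase-space files store
    # similar fields in a packed binary format.
    kind: str      # "photon", "electron", or "positron"
    energy: float  # MeV
    x: float       # cm, position in the scoring plane
    y: float
    u: float       # direction cosines
    v: float


def sample_photon_spectrum(particles, bins=10, e_max=6.0):
    """Histogram the normalized photon energy spectrum of a phase space."""
    counts = [0] * bins
    photons = [p for p in particles if p.kind == "photon"]
    for p in photons:
        i = min(int(p.energy / e_max * bins), bins - 1)
        counts[i] += 1
    total = len(photons) or 1
    return [c / total for c in counts]


# Toy phase space: bremsstrahlung-like spectrum below 6 MeV
random.seed(1)
ps = [PhaseSpaceParticle("photon", min(random.expovariate(1.0), 5.99),
                         random.gauss(0, 2), random.gauss(0, 2), 0.0, 0.0)
      for _ in range(10_000)]
spectrum = sample_photon_spectrum(ps)
print(round(sum(spectrum), 6))  # 1.0 (normalized)
```

    A dose calculation would then transport each sampled particle through the phantom geometry; here only the beam-quality bookkeeping is shown.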

  5. DataSpaces: An Interaction and Coordination Framework for Coupled Simulation Workflows

    International Nuclear Information System (INIS)

    Docan, Ciprian; Klasky, Scott A.; Parashar, Manish


    Emerging high-performance distributed computing environments are enabling new end-to-end formulations in science and engineering that involve multiple interacting processes and data-intensive application workflows. For example, current fusion simulation efforts are exploring coupled models and codes that simultaneously simulate separate application processes, such as the core and the edge turbulence, and run on different high performance computing resources. These components need to interact, at runtime, with each other and with services for data monitoring, data analysis and visualization, and data archiving. As a result, they require efficient support for dynamic and flexible couplings and interactions, which remains a challenge. This paper presents DataSpaces, a flexible interaction and coordination substrate that addresses this challenge. DataSpaces essentially implements a semantically specialized virtual shared space abstraction that can be associatively accessed by all components and services in the application workflow. It enables live data to be extracted from running simulation components, indexes this data online, and then allows it to be monitored, queried and accessed by other components and services via the space using semantically meaningful operators. The underlying data transport is asynchronous, low-overhead and largely memory-to-memory. The design, implementation, and experimental evaluation of DataSpaces using a coupled fusion simulation workflow is presented.
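
    The shared-space abstraction can be sketched as an associative put/get store. This toy is purely in-process and keeps none of DataSpaces' distributed, asynchronous, memory-to-memory transport; the method names are illustrative, not the DataSpaces API:

```python
import threading


class DataSpace:
    """Minimal in-memory sketch of a shared-space abstraction:
    producers put named, versioned, indexed data into the space and
    consumers get it associatively by name, version, and index range."""

    def __init__(self):
        self._lock = threading.Lock()
        self._store = {}  # (name, version) -> {index: value}

    def put(self, name, version, index, value):
        with self._lock:
            self._store.setdefault((name, version), {})[index] = value

    def get(self, name, version, lo, hi):
        # Associative query: all entries whose index falls in [lo, hi]
        with self._lock:
            region = self._store.get((name, version), {})
            return {i: v for i, v in region.items() if lo <= i <= hi}


space = DataSpace()
for i in range(8):
    space.put("turbulence", 0, i, i * i)   # producer (e.g. core code)
print(space.get("turbulence", 0, 2, 4))   # consumer query -> {2: 4, 3: 9, 4: 16}
```

    The essential design point survives even in this sketch: producer and consumer never call each other directly; they rendezvous only through the space.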

  6. Design space development for the extraction process of Danhong injection using a Monte Carlo simulation method.

    Directory of Open Access Journals (Sweden)

    Xingchu Gong

    A design space approach was applied to optimize the extraction process of Danhong injection. Dry matter yield and the yields of five active ingredients were selected as process critical quality attributes (CQAs. Extraction number, extraction time, and the mass ratio of water and material (W/M ratio were selected as critical process parameters (CPPs. Quadratic models between CPPs and CQAs were developed with determination coefficients higher than 0.94. Active ingredient yields and dry matter yield increased as the extraction number increased. Monte Carlo simulation with models established using a stepwise regression method was applied to calculate the probability-based design space. Step length showed little effect on the calculation results. Higher simulation number led to results with lower dispersion. Data generated in a Monte Carlo simulation following a normal distribution led to a design space with a smaller size. An optimized calculation condition was obtained with 10,000 simulation times, 0.01 calculation step length, a significance level value of 0.35 for adding or removing terms in a stepwise regression, and a normal distribution for data generation. The design space with a probability higher than 0.95 to attain the CQA criteria was calculated and verified successfully. Normal operating ranges of 8.2-10 g/g of W/M ratio, 1.25-1.63 h of extraction time, and two extractions were recommended. The optimized calculation conditions can conveniently be used in design space development for other pharmaceutical processes.
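
    The probability-based design-space calculation can be sketched in a few lines. The quadratic model coefficients, the CQA criterion, and the noise level below are illustrative placeholders, not the paper's fitted regression values:

```python
import random


def yield_model(extraction_time, wm_ratio, extractions):
    # Hypothetical quadratic response model for one CQA (e.g. dry
    # matter yield); the real coefficients come from the stepwise
    # regression fitted to designed experiments.
    return (20 + 8 * extractions + 6 * extraction_time
            - 1.5 * extraction_time ** 2 + 0.9 * wm_ratio
            - 0.03 * wm_ratio ** 2)


def attainment_probability(extraction_time, wm_ratio, extractions,
                           criterion=40.0, n_sim=10_000, noise_sd=2.0):
    """Monte Carlo estimate of the probability that the CQA meets its
    criterion, with prediction uncertainty modeled as Gaussian noise."""
    hits = 0
    for _ in range(n_sim):
        simulated = (yield_model(extraction_time, wm_ratio, extractions)
                     + random.gauss(0, noise_sd))
        if simulated >= criterion:
            hits += 1
    return hits / n_sim


random.seed(0)
p = attainment_probability(1.5, 9.0, 2)
print(p)  # operating points with p > 0.95 belong to the design space
```

    Sweeping this probability over a grid of CPP values, and keeping the region where every CQA exceeds its threshold with probability above 0.95, yields the probability-based design space described in the abstract.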

  7. 26th Space Simulation Conference Proceedings. Environmental Testing: The Path Forward (United States)

    Packard, Edward A.


    Topics covered include: A Multifunctional Space Environment Simulation Facility for Accelerated Spacecraft Materials Testing; Exposure of Spacecraft Surface Coatings in a Simulated GEO Radiation Environment; Gravity-Offloading System for Large-Displacement Ground Testing of Spacecraft Mechanisms; Microscopic Shutters Controlled by cRIO in Sounding Rocket; Application of a Physics-Based Stabilization Criterion to Flight System Thermal Testing; Upgrade of a Thermal Vacuum Chamber for 20 Kelvin Operations; A New Approach to Improve the Uniformity of Solar Simulator; A Perfect Space Simulation Storm; A Planetary Environmental Simulator/Test Facility; Collimation Mirror Segment Refurbishment inside ESA s Large Space; Space Simulation of the CBERS 3 and 4 Satellite Thermal Model in the New Brazilian 6x8m Thermal Vacuum Chamber; The Certification of Environmental Chambers for Testing Flight Hardware; Space Systems Environmental Test Facility Database (SSETFD), Website Development Status; Wallops Flight Facility: Current and Future Test Capabilities for Suborbital and Orbital Projects; Force Limited Vibration Testing of JWST NIRSpec Instrument Using Strain Gages; Investigation of Acoustic Field Uniformity in Direct Field Acoustic Testing; Recent Developments in Direct Field Acoustic Testing; Assembly, Integration and Test Centre in Malaysia: Integration between Building Construction Works and Equipment Installation; Complex Ground Support Equipment for Satellite Thermal Vacuum Test; Effect of Charging Electron Exposure on 1064nm Transmission through Bare Sapphire Optics and SiO2 over HfO2 AR-Coated Sapphire Optics; Environmental Testing Activities and Capabilities for Turkish Space Industry; Integrated Circuit Reliability Simulation in Space Environments; Micrometeoroid Impacts and Optical Scatter in Space Environment; Overcoming Unintended Consequences of Ambient Pressure Thermal Cycling Environmental Tests; Performance and Functionality Improvements to Next Generation

  8. Modeling and Simulation of DC Power Electronics Systems Using Harmonic State Space (HSS) Method

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Bak, Claus Leth


    For efficiency and simplicity, dc-based power electronics systems are widely used in a variety of applications such as electric vehicles, ships, aircraft, and homes. In these systems there can be many dynamic interactions between loads and other dc-dc converters. Although models based on state-space averaging and generalized averaging exist, they have limitations in reproducing the results of non-linear time domain simulations. This paper presents a modeling and simulation method for a large dc power electronic system using Harmonic State Space (HSS) modeling. Through this method, the required computation time and CPU memory for large dc power electronics systems can be reduced, while the achieved results match those of the non-linear time domain simulation but with a faster simulation time, which is beneficial in a large network.
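
    In outline (notation assumed here, not taken from the paper), HSS lifts a linear time-periodic model $\dot{x}(t)=A(t)x(t)+B(t)u(t)$ into a time-invariant model on the Fourier coefficients $X_k$ of $x(t)=\sum_k X_k e^{jk\omega_0 t}$:

$$
\dot{X} = (\mathcal{A} - \mathcal{N})\,X + \mathcal{B}\,U, \qquad
Y = \mathcal{C}\,X + \mathcal{D}\,U,
$$

    where $\mathcal{A}, \mathcal{B}, \mathcal{C}, \mathcal{D}$ are block-Toeplitz matrices built from the Fourier coefficients of $A(t), B(t), C(t), D(t)$, and $\mathcal{N}=\operatorname{blkdiag}(\ldots,-j\omega_0 I,\,0,\,j\omega_0 I,\ldots)$ accounts for the rotation of each harmonic. Because the lifted model is time-invariant, harmonic couplings between converters and loads can be analyzed without running the full non-linear time-domain simulation.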

  9. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (DSES) (United States)

    Phillips, Robert; Dexter, Dan; Hasan, David; Crues, Edwin Z.


    This document describes the federate initialization process that was developed at the NASA Johnson Space Center with the H-II Transfer Vehicle Flight Controller Trainer (HTV FCT) simulations and refined in the Distributed Space Exploration Simulation (DSES). These simulations use the High Level Architecture (HLA) IEEE 1516 standard to provide communication and coordination between the distributed parts of the simulation. The purpose of this paper is to describe a generic initialization sequence that can be used to create a federate that can: (1) properly initialize all HLA objects, object instances, interactions, and time management; (2) check for the presence of all federates; (3) coordinate startup with other federates; and (4) robustly initialize and share initial object instance data with other federates.
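
    The four-step sequence can be mimicked with a barrier standing in for an HLA synchronization point and a shared dictionary standing in for published object-instance attributes. This sketch uses no HLA library; all names are illustrative:

```python
import threading


class Federation:
    """Toy stand-in for an HLA federation: a barrier plays the role of
    a registered synchronization point, and a shared dict plays the
    role of published initial object-instance attributes."""

    def __init__(self, expected):
        self.barrier = threading.Barrier(expected)
        self.initial_data = {}
        self.lock = threading.Lock()


def run_federate(name, fed, payload):
    # Step 1: initialize local objects and publish initial data.
    with fed.lock:
        fed.initial_data[name] = payload
    # Steps 2-3: wait until all expected federates have joined; the
    # barrier models "check presence + coordinate startup".
    fed.barrier.wait()
    # Step 4: every federate now sees everyone's initial data.
    with fed.lock:
        return dict(fed.initial_data)


fed = Federation(expected=3)
results = {}
threads = [threading.Thread(
               target=lambda n=n: results.update(
                   {n: run_federate(n, fed, f"{n}-state")}))
           for n in ("rover", "shuttle", "depot")]
for t in threads: t.start()
for t in threads: t.join()
print(sorted(results["rover"]))  # ['depot', 'rover', 'shuttle']
```

    The key property the real sequence also guarantees: no federate proceeds past startup until every expected federate has published its initial data, so all participants begin with a consistent view.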

  10. An IBM PC-based math model for space station solar array simulation (United States)

    Emanuel, E. M.


    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
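
    A five-parameter linear cell model lends itself to a compact sketch. The piecewise-linear I-V shape and the cell numbers below are assumptions for illustration; the report's exact formulation is not reproduced here:

```python
def linear_cell_current(v, isc, voc, vmp, imp):
    """Piecewise-linear I-V curve through (0, Isc), (Vmp, Imp), and
    (Voc, 0) -- one plausible reading of a linear dc cell model built
    from the five input parameters named in the abstract."""
    if v <= 0:
        return isc
    if v <= vmp:
        # near-constant-current segment from short circuit to max power
        return isc + (imp - isc) * v / vmp
    if v < voc:
        # steep segment from max power down to open circuit
        return imp * (voc - v) / (voc - vmp)
    return 0.0


# Representative silicon-cell numbers (illustrative only)
isc, voc, vmp, imp = 2.6, 0.55, 0.45, 2.4
powers = [(v, v * linear_cell_current(v, isc, voc, vmp, imp))
          for v in [0.1 * voc * k for k in range(11)]]
v_best, p_best = max(powers, key=lambda t: t[1])
print(round(p_best, 3))  # peaks near Vmp * Imp = 1.08 W
```

    An array model then scales the single-cell curve by the series/parallel cell counts and applies the orbit-inclination-dependent illumination factor.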

  11. An alternative phase-space distribution to sample initial conditions for classical dynamics simulations

    International Nuclear Information System (INIS)

    Garcia-Vela, A.


    A new quantum-type phase-space distribution is proposed in order to sample initial conditions for classical trajectory simulations. The phase-space distribution is obtained as the modulus of a quantum phase-space state of the system, defined as the direct product of the coordinate and momentum representations of the quantum initial state. The distribution is tested by sampling initial conditions which reproduce the initial state of the Ar-HCl cluster prepared by ultraviolet excitation, and by simulating the photodissociation dynamics by classical trajectories. The results are compared with those of a wave packet calculation, and with a classical simulation using an initial phase-space distribution recently suggested. A better agreement is found between the classical and the quantum predictions with the present phase-space distribution, as compared with the previous one. This improvement is attributed to the fact that the phase-space distribution propagated classically in this work resembles more closely the shape of the wave packet propagated quantum mechanically
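
    For a Gaussian wave packet, the proposed sampling reduces to drawing positions and momenta independently from the coordinate-space and momentum-space densities of the initial state. The units and packet parameters below are illustrative assumptions, not the Ar-HCl values from the paper:

```python
import random

HBAR = 1.0  # atomic-like units (illustrative)


def sample_initial_conditions(n, x0, p0, sigma_x):
    """Draw (x, p) pairs from the product of |psi(x)|^2 and
    |psi~(p)|^2 for a minimum-uncertainty Gaussian packet, for which
    the widths satisfy sigma_x * sigma_p = hbar / 2."""
    sigma_p = HBAR / (2.0 * sigma_x)
    return [(random.gauss(x0, sigma_x), random.gauss(p0, sigma_p))
            for _ in range(n)]


random.seed(2)
conds = sample_initial_conditions(100_000, x0=0.0, p0=5.0, sigma_x=0.2)
mean_p = sum(p for _, p in conds) / len(conds)
var_x = sum(x * x for x, _ in conds) / len(conds)
print(round(mean_p, 2), round(var_x, 2))  # ~ 5.0 and ~ 0.04
```

    Each sampled pair then serves as the initial condition of one classical trajectory; the ensemble of trajectories approximates the quantum dynamics of the packet.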

  12. Space charge and magnet error simulations for the SNS accumulator ring

    International Nuclear Information System (INIS)

    Beebe-Wang, J.; Fedotov, A.V.; Wei, J.; Machida, S.


    The effects of space charge forces and magnet errors in the beam of the Spallation Neutron Source (SNS) accumulator ring are investigated. In this paper, the focus is on the emittance growth and halo/tail formation in the beam due to space charge with and without magnet errors. The beam properties of different particle distributions resulting from various injection painting schemes are investigated. Different working points in the design of SNS accumulator ring lattice are compared. The simulations in close-to-resonance condition in the presence of space charge and magnet errors are presented. (author)

  13. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness (United States)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC completely fulfills, or improves or augments, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated

  14. Numerical simulation of electromagnetic waves in Schwarzschild space-time by finite difference time domain method and Green function method (United States)

    Jia, Shouqing; La, Dongsheng; Ma, Xuelian


    The finite difference time domain (FDTD) algorithm and Green function algorithm are implemented into the numerical simulation of electromagnetic waves in Schwarzschild space-time. FDTD method in curved space-time is developed by filling the flat space-time with an equivalent medium. Green function in curved space-time is obtained by solving transport equations. Simulation results validate both the FDTD code and Green function code. The methods developed in this paper offer a tool to solve electromagnetic scattering problems.
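
    The equivalent-medium construction can be made concrete. In isotropic coordinates (radial coordinate $\rho$, Schwarzschild radius $r_s$), the spatial part of the Schwarzschild metric is conformally flat, and a flat-space Maxwell solver can absorb the curvature into an effective refractive index. The following is the standard transformation-optics form; the paper's exact parametrization may differ:

$$
n(\rho) = \varepsilon_r = \mu_r
        = \frac{\left(1 + \dfrac{r_s}{4\rho}\right)^{3}}{1 - \dfrac{r_s}{4\rho}},
\qquad r_s = \frac{2GM}{c^{2}},
$$

    so that $n \to 1$ far from the mass and the ordinary FDTD update equations apply with spatially varying $\varepsilon$ and $\mu$.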

  15. Novel simulation method of space charge effects in electron optical systems including emission of electrons

    Czech Academy of Sciences Publication Activity Database

    Zelinka, Jiří; Oral, Martin; Radlička, Tomáš


    Roč. 184, JAN (2018), s. 66-76 ISSN 0304-3991 R&D Projects: GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : space charge * self-consistent simulation * aberration polynomial * electron emission Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 2.843, year: 2016

  16. Being an "Agent Provocateur": Utilising Online Spaces for Teacher Professional Development in Virtual Simulation Games (United States)

    deNoyelles, Aimee; Raider-Roth, Miriam


    This article details the results of an action research study which investigated how teachers used online learning community spaces to develop and support their teaching and learning of the Jewish Court of All Time (JCAT), a web-mediated, character-playing, simulation game that engages participants with social, historical and cultural curricula.…

  17. Using Monte Carlo Simulation To Improve Cargo Mass Estimates For International Space Station Commercial Resupply Flights (United States)


    The challenges of ISS resupply and the importance of mass properties in spacecraft and mission design are reviewed. This study addresses those challenges by developing Monte Carlo simulations based on over 13 years of as-flown ISS resupply…


    International Nuclear Information System (INIS)

    Staff, Jan E.; Niebergal, Brian P.; Ouyed, Rachid; Pudritz, Ralph E.; Cai, Kai


    We perform state-of-the-art, three-dimensional, time-dependent simulations of magnetized disk winds, carried out to simulation scales of 60 AU, in order to confront optical Hubble Space Telescope observations of protostellar jets. We 'observe' the optical forbidden line emission produced by shocks within our simulated jets and compare these with actual observations. Our simulations reproduce the rich structure of time-varying jets, including jet rotation far from the source, an inner (up to 400 km s⁻¹) and outer (less than 100 km s⁻¹) component of the jet, and jet widths of up to 20 AU in agreement with observed jets. These simulations when compared with the data are able to constrain disk wind models. In particular, models featuring a disk magnetic field with a modest radial spatial variation across the disk are favored.

  19. The daylighting dashboard - A simulation-based design analysis for daylit spaces

    Energy Technology Data Exchange (ETDEWEB)

    Reinhart, Christoph F. [Harvard University, Graduate School of Design, 48 Quincy Street, Cambridge, MA 02138 (United States); Wienold, Jan [Fraunhofer Institute for Solar Energy Systems, Heidenhofstrasse 2, 79110 Freiburg (Germany)


    This paper presents a vision of how state-of-the-art computer-based analysis techniques can be effectively used during the design of daylit spaces. Following a review of recent advances in dynamic daylight computation capabilities, climate-based daylighting metrics, occupant behavior and glare analysis, a fully integrated design analysis method is introduced that simultaneously considers annual daylight availability, visual comfort and energy use: Annual daylight glare probability profiles are combined with an occupant behavior model in order to determine annual shading profiles and visual comfort conditions throughout a space. The shading profiles are then used to calculate daylight autonomy plots, energy loads, operational energy costs and greenhouse gas emissions. The paper then shows how simulation results for a sidelit space can be visually presented to simulation non-experts using the concept of a daylighting dashboard. The paper ends with a discussion of how the daylighting dashboard could be practically implemented using technologies that are available today. (author)

  20. Surgical Space Suits Increase Particle and Microbiological Emission Rates in a Simulated Surgical Environment. (United States)

    Vijaysegaran, Praveen; Knibbs, Luke D; Morawska, Lidia; Crawford, Ross W


    The role of space suits in the prevention of orthopedic prosthetic joint infection remains unclear. Recent evidence suggests that space suits may in fact contribute to increased infection rates, with bioaerosol emissions from space suits identified as a potential cause. This study aimed to compare the particle and microbiological emission rates (PER and MER) of space suits and standard surgical clothing. A comparison of emission rates between space suits and standard surgical clothing was performed in a simulated surgical environment during 5 separate experiments. Particle counts were analyzed with 2 separate particle counters capable of detecting particles between 0.1 and 20 μm. An Andersen impactor was used to sample bacteria, with culture counts performed at 24 and 48 hours. Four experiments consistently showed statistically significant increases in both PER and MER when space suits are used compared with standard surgical clothing. One experiment showed inconsistent results, with a trend toward increases in both PER and MER when space suits are used compared with standard surgical clothing. Space suits cause increased PER and MER compared with standard surgical clothing. This finding provides mechanistic evidence to support the increased prosthetic joint infection rates observed in clinical studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Numerical simulation of a cabin ventilation subsystem in a space station oriented real-time system

    Directory of Open Access Journals (Sweden)

    Zezheng QIU


    An environment control and life support system (ECLSS) is an important system in a space station. The ECLSS is a typical complex system, and real-time simulation technology can help to accelerate its research process by using distributed hardware in a loop simulation system. An implicit fixed time step numerical integration method is recommended for a real-time simulation system with time-varying parameters. However, its computational efficiency is too low to satisfy the real-time data interaction, especially for the complex ECLSS system running on a PC cluster. The instability problem of an explicit method strongly limits its application in the ECLSS real-time simulation although it has a high computational efficiency. This paper proposes an improved numerical simulation method to overcome the instability problem based on the explicit Euler method. A temperature and humidity control subsystem (THCS) is firstly established, and its numerical stability is analyzed by using the eigenvalue estimation theory. Furthermore, an adaptive operator is proposed to avoid the potential instability problem. The stability and accuracy of the proposed method are investigated carefully. Simulation results show that this proposed method can provide a good way for some complex time-variant systems to run their real-time simulation on a PC cluster. Keywords: Numerical integration method, Real-time simulation, Stability, THCS, Time-variant system
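
    The stability constraint that motivates the adaptive operator can be illustrated with a toy stiff system: explicit Euler on $\dot{x}=Ax$ is stable only when the step $h$ satisfies $|1+h\lambda|\le 1$ for every eigenvalue $\lambda$. Below, a crude power-iteration estimate of the dominant eigenvalue caps the step; this generic bound stands in for the paper's adaptive operator, whose exact form differs:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]


def dominant_eig_mag(A, iters=50):
    """Power iteration estimate of the largest eigenvalue magnitude."""
    v = [1.0] * len(A)
    lam = 1.0
    for _ in range(iters):
        w = matvec(A, v)
        lam = max(abs(c) for c in w)
        v = [c / lam for c in w]
    return lam


def euler_stable(A, x0, t_end):
    # Cap the step safely inside the explicit-Euler stability region.
    h = 1.8 / dominant_eig_mag(A)
    x, t = list(x0), 0.0
    while t < t_end:
        dx = matvec(A, x)
        x = [xi + h * di for xi, di in zip(x, dx)]
        t += h
    return x


# Stiff two-state system with well-separated time constants
# (a loose caricature of fast temperature / slow humidity dynamics)
A = [[-100.0, 1.0], [1.0, -0.5]]
x = euler_stable(A, [1.0, 1.0], t_end=2.0)
print(all(abs(c) < 10 for c in x))  # solution stays bounded: True
```

    With a fixed step larger than $2/|\lambda_{\max}|$ the same integration would diverge, which is exactly the instability the proposed adaptive operator is designed to avoid.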

  2. Integrated visualization of simulation results and experimental devices in virtual-reality space

    International Nuclear Information System (INIS)

    Ohtani, Hiroaki; Ishiguro, Seiji; Shohji, Mamoru; Kageyama, Akira; Tamura, Yuichi


    We succeeded in integrating the visualization of both simulation results and experimental device data in virtual-reality (VR) space using the CAVE system. Simulation results are shown using the Virtual LHD software, which can show magnetic field lines, particle trajectories, and isosurfaces of plasma pressure of the Large Helical Device (LHD) based on data from the magnetohydrodynamics equilibrium simulation. A three-dimensional mouse, or wand, interactively determines the initial position and pitch angle of a drift particle or the starting point of a magnetic field line in the VR space. The trajectory of a particle and the stream-line of the magnetic field are calculated using the Runge-Kutta-Huta integration method on the basis of the results obtained after pointing the initial condition. The LHD vessel is objectively visualized based on CAD data. By using these results and data, the simulated LHD plasma can be interactively drawn in the objective description of the LHD experimental vessel. Through this integrated visualization, it is possible to grasp the three-dimensional positional relationship between the device and the plasma in the VR space, opening a new path for future research. (author)
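
    Field-line tracing of this kind can be sketched with a classic fourth-order Runge-Kutta step along the unit tangent of the field (the paper uses the Runge-Kutta-Huta variant; classic RK4 is shown here). The circular test field is illustrative:

```python
import math


def field_line_step(b_field, pos, ds):
    """Advance one RK4 step of arc length ds along the field line,
    i.e. integrate dr/ds = B(r) / |B(r)|."""
    def tangent(p):
        bx, by, bz = b_field(p)
        norm = math.sqrt(bx * bx + by * by + bz * bz)
        return (bx / norm, by / norm, bz / norm)

    def shift(p, k, c):
        return tuple(pi + c * ki for pi, ki in zip(p, k))

    k1 = tangent(pos)
    k2 = tangent(shift(pos, k1, ds / 2))
    k3 = tangent(shift(pos, k2, ds / 2))
    k4 = tangent(shift(pos, k3, ds))
    return tuple(p + ds / 6 * (a + 2 * b + 2 * c + d)
                 for p, a, b, c, d in zip(pos, k1, k2, k3, k4))


# Test field B = (-y, x, 0): its field lines are circles about the z axis
def b(p):
    return (-p[1], p[0], 0.0)


pos = (1.0, 0.0, 0.0)
for _ in range(1000):                       # one full turn in 1000 steps
    pos = field_line_step(b, pos, 2 * math.pi / 1000)
print(round(math.hypot(pos[0], pos[1]), 3))  # radius stays 1.0
```

    In the VR application the wand supplies `pos` as the starting point, and the traced polyline is rendered inside the CAD model of the vessel.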

  3. Simulations of the observation of clouds and aerosols with the Experimental Lidar in Space Equipment system. (United States)

    Liu, Z; Voelger, P; Sugimoto, N


    We carried out a simulation study for the observation of clouds and aerosols with the Japanese Experimental Lidar in Space Equipment (ELISE), which is a two-wavelength backscatter lidar with three detection channels. The National Space Development Agency of Japan plans to launch the ELISE on the Mission Demonstration Satellite 2 (MDS-2). In the simulations, the lidar return signals for the ELISE are calculated for an artificial, two-dimensional atmospheric model including different types of clouds and aerosols. The signal detection processes are simulated realistically by inclusion of various sources of noise. The lidar signals that are generated are then used as input for simulations of data analysis with inversion algorithms to investigate retrieval of the optical properties of clouds and aerosols. The results demonstrate that the ELISE can provide global data on the structures and optical properties of clouds and aerosols. We also conducted an analysis of the effects of cloud inhomogeneity on retrievals from averaged lidar profiles. We show that the effects are significant for space lidar observations of optically thick broken clouds.

  4. Simulation of Cascaded Longitudinal-Space-Charge Amplifier at the Fermilab Accelerator Science & Technology (FAST) Facility

    Energy Technology Data Exchange (ETDEWEB)

    Halavanau, A. [Northern Illinois U.]; Piot, P. [Northern Illinois U.]


    Cascaded Longitudinal Space Charge Amplifiers (LSCAs) have been proposed as a mechanism to generate density modulation over a broad spectral range. The scheme was recently demonstrated in the optical regime and confirmed the production of broadband optical radiation. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at the Fermilab Accelerator Science & Technology (FAST) facility for producing broadband ultraviolet radiation. Our studies are carried out using elegant with its included tree-based gridless space-charge algorithm.

  5. Energy content of stormtime ring current from phase space mapping simulations

    International Nuclear Information System (INIS)

    Chen, M.W.; Schulz, M.; Lyons, L.R.


    The authors perform a model study to account for the increase in the energy content of the trapped-particle population that occurs during the main phase of major geomagnetic storms. They consider stormtime particle transport in the equatorial region of the magnetosphere. They start with a phase space distribution of the ring current before the storm, created by a steady-state transport model. They then use a previously developed guiding-center particle simulation to map the stormtime ring current in phase space, following Liouville's theorem. This model is able to account for the ten- to twentyfold increase in the energy content of magnetospheric ions during the storm.

  6. Validated simulator for space debris removal with nets and other flexible tethers applications (United States)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil


    In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all phases of the debris capturing process: net launch, flight, and wrapping around the target. It handles coupled simulation of rigid- and flexible-body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows simulation of flexible threads or wires with elasticity and damping in stretching, bending, and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented, and programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; due to the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, flown on a parabolic flight, was a downscaled version of the Envisat capturing process. The prepacked net was launched towards the satellite model, expanded, hit the model, and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics with the respective simulations and thus to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. The validation results show that the model reflects the physics of the phenomenon accurately enough that it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases, and the methodology behind the validation. Results are presented and

  7. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology. (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M


    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking the Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
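
    The stochastic simulation algorithm that the abstract builds on can be illustrated in its simplest, non-spatial form (this is a generic textbook sketch, not ML-Space code): draw an exponential waiting time from the total propensity, then fire an event.

```python
import math
import random

# Hedged sketch: a minimal Gillespie stochastic simulation of the
# degradation reaction A -> 0 with rate constant k. ML-Space's hybrid
# spatial machinery is far richer; this shows only the core SSA loop.

def gillespie_decay(n0, k, t_end, seed=42):
    """Simulate A -> 0; return (times, counts) trajectory."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    times, counts = [0.0], [n0]
    while n > 0:
        a = k * n                                # total propensity
        dt = -math.log(1.0 - rng.random()) / a   # exponential waiting time
        t += dt
        if t > t_end:
            break
        n -= 1                                   # the only possible event
        times.append(t)
        counts.append(n)
    return times, counts

times, counts = gillespie_decay(1000, k=1.0, t_end=1.0)
# After one mean lifetime, roughly n0/e (about 368) molecules remain.
print(counts[-1])
```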

  8. Simulation of space radiation effects on polyimide film materials for high temperature applications. Final report

    International Nuclear Information System (INIS)

    Fogdall, L.B.; Cannaday, S.S.


    Space environment effects on candidate materials for the solar sail film are determined. Polymers, including metallized polyimides that might be suitable solar radiation receivers, were exposed to combined proton and solar electromagnetic radiation. Each test sample was weighted to simulate the tension on the polymer when it is stretched into a near-planar shape while receiving solar radiation. Exposure rates up to 16 times that expected in Earth orbit were employed to simulate near-sun solar sailing conditions. Sample appearance, elongation, and shrinkage were monitored, noted, and documented in situ. Thermosetting polyimides showed less degradation or visual change in appearance than thermoplastics.

  9. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions (United States)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas


    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule, while enabling the aircraft to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest that experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  10. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex


    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to that of the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Our ultimate goal is therefore to explore the level of geoelectric fields that can be induced by an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  11. PATH: a lumped-element beam-transport simulation program with space charge

    International Nuclear Information System (INIS)

    Farrell, J.A.


    PATH is a group of computer programs for simulating charged-particle beam-transport systems. It was developed for evaluating the effects of some aberrations without a time-consuming integration of trajectories through the system. The beam-transport portion of PATH is derived from the well-known program DECAY TURTLE. PATH contains all features available in DECAY TURTLE (including the input format) plus additional features such as a more flexible random-ray generator, longitudinal phase space, some additional beamline elements, and space-charge routines. One of the programs also provides a simulation of an Alvarez linear accelerator. The programs, originally written for a CDC 7600 computer system, are also available on a VAX-VMS system. All of the programs are interactive, with input prompting for ease of use.

  12. Simulation of space protons influence on silicon semiconductor devices using gamma-neutron irradiation

    International Nuclear Information System (INIS)

    Zhukov, Y.N.; Zinchenko, V.F.; Ulimov, V.N.


    In this study the authors focus on the problems of simulating space proton energy spectra in laboratory gamma-neutron radiation tests of semiconductor devices (SDs). A correct simulation of radiation effects requires taking into account and evaluating the substantial differences in the formation of primary defects in SDs in the space environment and under laboratory testing. These differences concern: 1) displacement defects, 2) ionization defects, and 3) radiation intensity. The study shows that: - the energy dependence of nonionizing energy loss (NIEL) is sufficiently universal to predict the degradation of SD parameters associated with displacement defects, and - MOS devices that are sensitive to ionization defects showed the same variation of parameters under conditions of equal ionization density generated by proton and gamma radiation. (A.C.)

  13. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    International Nuclear Information System (INIS)

    Fletcher, C.D.


    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. The ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes. 6 refs., 17 figs., 1 tab

  14. Status of CEA design and simulation studies of 200 KWe turboelectric space power system

    International Nuclear Information System (INIS)

    Carre, F.; Proust, E.; Gervaise, F.; Schwartz, J.P.; Tilliette, Z.; Vrillon, B.


    This paper presents the updated design features of the reference 200 kWe turboelectric space generator studied in France and comments on some of the alternative options to be analyzed in the near future, concerning the reactor and the conversion system in particular. Also presented are the major conclusions of the simulation studies that have been performed to analyze the overall behavior of the reference generator during start-up and accidental transients.

  15. Free-Space Squeezing Assists Perfectly Matched Layers in Simulations on a Tight Domain

    DEFF Research Database (Denmark)

    Shyroki, Dzmitry; Ivinskaya, Aliaksandra; Lavrinenko, Andrei


    ... outside the object, as in simulations of eigenmodes or scattering at a wavelength comparable to or larger than the object itself. Here, we show how, in addition to applying the perfectly matched layers (PMLs), outer free space can be squeezed to avoid cutting the evanescent field tails by the PMLs ... or computational domain borders. Adding the squeeze-transform layers to the standard PMLs requires no changes to the finite-difference algorithms ...

  16. A simulation model for reliability evaluation of Space Station power systems (United States)

    Singh, C.; Patton, A. D.; Kumar, Mudit; Wagner, H.


    A detailed simulation model for the hybrid Space Station power system is presented which allows photovoltaic and solar dynamic power sources to be mixed in varying proportions. The model considers the dependence of reliability and storage characteristics during the sun and eclipse periods, and makes it possible to model the charging and discharging of the energy storage modules in a relatively accurate manner on a continuous basis.

  17. Computer graphics testbed to simulate and test vision systems for space applications (United States)

    Cheatham, John B.


    Artificial intelligence concepts are applied to robotics. Artificial neural networks, expert systems, and laser imaging techniques for autonomous space robots are being studied. A computer graphics laser range finder simulator developed by Wu has been used by Weiland and Norwood to study the use of artificial neural networks for path planning and obstacle avoidance. Interest is expressed in applications of CLIPS, NETS, and fuzzy control to robot navigation.

  18. Combining annual daylight simulation with photobiology data to assess the relative circadian efficacy of interior spaces

    Energy Technology Data Exchange (ETDEWEB)

    Pechacek, C.S.; Andersen, M. [Massachusetts Inst. of Technology, Cambridge, MA (United States). Dept. of Architecture, Building Technology; Lockley, S.W. [Harvard Medical School, Boston, MA (United States). Div. of Sleep Medicine, Brigham and Women's Hospital


    This paper addressed the issue of hospital design and the role of daylight in patient health care. It presented a new approach for integrating empirical data and findings from photobiology into the performance assessment of a space, thus combining visual and health-related criteria. Previous studies have reported significant health care outcomes in daylit environments, although the mechanism and photoreceptor systems controlling these effects remain unknown. This study went beyond previous work on windows to describe the characteristics of daylight that may promote human health by providing daylighting appropriate for the synchronization of circadian rhythms, and then to make specific daylighting recommendations grounded in biological findings. In particular, the study investigated the use of daylight autonomy (DA) to simulate the probabilistic and temporal potential of daylight to meet human health needs. Results from photobiology research were used to define threshold values for lighting, which were then used as goals for the simulations. These goals included the spectrum, intensity, and timing of light at the human eye. The study investigated the variability of key architectural decisions in hospital room design to determine their influence on achieving these goals. The simulations showed how choices in building orientation, window size, user-window position, and interior finishes affect the circadian efficacy of a space: design decisions can improve or degrade the health potential of the space considered. While the findings of this research were specific to hospitals, the results can be applied to other building types such as office buildings and residences. 33 refs., 7 figs.
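
    Daylight autonomy, the metric used in the study, is conventionally the fraction of occupied hours in which daylight alone meets or exceeds a target illuminance at a sensor point. A minimal sketch with invented hourly data (the 300 lux threshold is a common choice, not necessarily the paper's):

```python
# Hedged sketch of the daylight autonomy (DA) metric: the fraction of
# occupied hours where simulated daylight illuminance meets a threshold.
# The hourly values below are made up for illustration.

def daylight_autonomy(illuminance_lux, occupied, threshold_lux=300.0):
    """illuminance_lux and occupied are parallel hourly lists."""
    hits = sum(1 for lux, occ in zip(illuminance_lux, occupied)
               if occ and lux >= threshold_lux)
    total = sum(1 for occ in occupied if occ)
    return hits / total if total else 0.0

# Toy example: 8 occupied hours, 5 of which reach 300 lux.
lux = [0, 50, 320, 500, 610, 450, 310, 280, 120, 40]
occ = [False, False, True, True, True, True, True, True, True, True]
print(daylight_autonomy(lux, occ))  # → 0.625
```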

  19. Real-time graphics for the Space Station Freedom cupola, developed in the Systems Engineering Simulator (United States)

    Red, Michael T.; Hess, Philip W.


    Among the Lyndon B. Johnson Space Center's responsibilities for Space Station Freedom is the cupola. Attached to the resource node, the cupola is a windowed structure that will serve as the space station's secondary control center. From the cupola, operations involving the mobile service center and the orbital maneuvering vehicle will be conducted. The Systems Engineering Simulator (SES), located in building 16, activated a real-time man-in-the-loop cupola simulator in November 1987. The SES cupola is an engineering tool with the flexibility to evolve in both hardware and software as the final cupola design matures. Two workstations are simulated with closed-circuit television monitors, rotational and translational hand controllers, programmable display pushbuttons, and a graphics display with trackball and keyboard. The displays and controls of the SES cupola are driven by a Silicon Graphics Integrated Raster Imaging System (IRIS) 4D/70 GT computer. Through the use of an interactive display builder program, SES cupola display pages consisting of two-dimensional and three-dimensional graphics are constructed. These display pages interact with the SES via the IRIS real-time graphics interface. The focus is on the real-time graphics interface applications software developed on the IRIS.

  20. Numerical Simulation and Optimization of Hole Spacing for Cement Grouting in Rocks

    Directory of Open Access Journals (Sweden)

    Ping Fu


    Full Text Available The fine fissures of V-diabase were the main stratigraphic factor affecting the effectiveness of the foundation grout curtain at the Dagang Mountain Hydropower Station. Specialized in situ grouting tests were therefore conducted to determine reasonable hole spacing and other parameters. Considering the time variation of the rheological parameters of the grout, the variation of the grouting pressure gradient, and the evolution law of the fracture opening, numerical simulations were performed on the diffusion process of cement grout in the fissures of the rock mass. The distribution of permeability after grouting was obtained on the basis of the analysis results, and the grouting hole spacing was discussed based on a reliability analysis. A finer optimization precision of 0.1 m could be adopted, compared with the 0.5 m accuracy that is commonly used. The results could provide a useful reference for choosing reasonable grouting hole spacing in similar projects.

  1. Numerical simulation and experimental research for the natural convection in an annular space in LMFBR

    International Nuclear Information System (INIS)

    Wang Zhou; Luo Rui; Yang Xianyong; Liang Taofeng


    In a pool fast reactor, the roof structure is penetrated by a number of pumps and heat exchanger units, forming annular spaces of various sizes. Natural convection of argon gas occurs in the pool sky and in the small annular gaps between those components and the roof containment due to thermosiphonic effects. The natural convection is studied experimentally and numerically to predict the temperature distributions inside the annular space and its surrounding structure. Numerical simulation is performed using the LVEL turbulence model and extending the computational domain to the entire pool sky. The predicted results are in fair agreement with the experimental data. In comparison with the commonly used k-ε model, the LVEL model has better accuracy for turbulent flow in a gap space.

  2. “Space, the Final Frontier”: How Good are Agent-Based Models at Simulating Individuals and Space in Cities?

    Directory of Open Access Journals (Sweden)

    Alison Heppenstall


    Full Text Available Cities are complex systems comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions between individuals and how, through such interactions, macro structures emerge in both the social and physical environment of cities. However, this paradigm has been hindered by limits on computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data, etc.) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.

  3. Simulation of the Plasma Meniscus with and without Space Charge using Triode Extraction System

    International Nuclear Information System (INIS)

    Abdel Rahman, M.M.; El-Khabeary, H.


    In this work, simulation of the singly charged argon ion trajectories for a variable plasma meniscus is studied, with and without space charge, for the triode extraction system by using the SIMION 3D (Simulation of Ion Optics in Three Dimensions) version 7 personal computer program. The influence of the acceleration voltage applied to the acceleration electrode of the triode extraction system on the shape of the plasma meniscus has been determined. The plasma electrode is set at +5000 volt and the acceleration voltage applied to the acceleration electrode is varied from -5000 volt to +5000 volt. In most of the concave and convex plasma shapes, ion beam emittance can be calculated by using separate standard deviations of positions and elevation angles. Ion beam emittance as a function of the curvature of the plasma meniscus for different plasma shapes (flat, concave, and convex) without space charge, at acceleration voltages varied from -5000 volt to +5000 volt applied to the acceleration electrode of the triode extraction system, has been investigated. The influence of the extraction gap on ion beam emittance for a concave plasma shape of 3.75 mm without space charge, at an acceleration voltage V_acc = -2000 volt applied to the acceleration electrode of the triode extraction system, has been determined. Also, the influence of space charge on ion beam emittance for a variable plasma meniscus at an acceleration voltage V_acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been studied.

  4. Simulation of the plasma meniscus with and without space charge using triode extraction system

    International Nuclear Information System (INIS)

    Rahman, M.M. Abdel; El-Khabeary, H.


    In this work, simulation of the singly charged argon ion trajectories for a variable plasma meniscus is studied, with and without space charge, for the triode extraction system by using the SIMION 3D (Simulation of Ion Optics in Three Dimensions) version 7 personal computer program. The influence of the acceleration voltage applied to the acceleration electrode of the triode extraction system on the shape of the plasma meniscus has been determined. The plasma electrode is set at +5000 volt and the acceleration voltage applied to the acceleration electrode is varied from -5000 volt to +5000 volt. In most of the concave and convex plasma shapes, ion beam emittance can be calculated by using separate standard deviations of positions and elevation angles. Ion beam emittance as a function of the curvature of the plasma meniscus for different plasma shapes (flat, concave, and convex) without space charge, at acceleration voltages varied from -5000 volt to +5000 volt applied to the acceleration electrode of the triode extraction system, has been investigated. The influence of the extraction gap on ion beam emittance for a concave plasma shape of 3.75 mm without space charge, at an acceleration voltage V_acc = -2000 volt applied to the acceleration electrode of the triode extraction system, has been determined. Also, the influence of space charge on ion beam emittance for a variable plasma meniscus at an acceleration voltage V_acc = -2000 volt applied to the acceleration electrode of the triode extraction system has been studied. (author)

  5. Monte Carlo simulations for the space radiation superconducting shield project (SR2S). (United States)

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R


    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCRs) and solar particle events (SPEs). Exposure to space radiation could lead to both acute and late effects in crew members, and well-defined countermeasures do not currently exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project (Space Radiation Superconducting Shield), a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. The spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared with that reported in previous studies, in which only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  6. Time-Accurate Unsteady Pressure Loads Simulated for the Space Launch System at Wind Tunnel Conditions (United States)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, William L.; Glass, Christopher E.; Streett, Craig L.; Schuster, David M.


    A transonic flow field about a Space Launch System (SLS) configuration was simulated with the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics (CFD) code at wind tunnel conditions. Unsteady, time-accurate computations were performed using second-order Delayed Detached Eddy Simulation (DDES) for up to 1.5 physical seconds. The surface pressure time history was collected at 619 locations, 169 of which matched locations on a 2.5-percent wind tunnel model that was tested in the 11 ft x 11 ft test section of the NASA Ames Research Center's Unitary Plan Wind Tunnel. Comparisons between computation and experiment showed that the peak surface pressure RMS level occurs behind the forward attach hardware, and good agreement in frequency and power was obtained in this region. Computational domain, grid resolution, and time step sensitivity studies were performed, including an investigation of pseudo-time sub-iteration convergence. Using these sensitivity studies and comparisons with experimental data, a set of best practices to date has been established for FUN3D simulations for SLS launch vehicle analysis. To the authors' knowledge, this is the first time DDES has been used in a systematic approach to establish the simulation time needed to analyze unsteady pressure loads on a space launch vehicle such as the NASA SLS.

  7. A simulation of the laser interferometer space antenna data stream from galactic white dwarf binaries

    International Nuclear Information System (INIS)

    Benacquista, M J; DeGoes, J; Lunder, D


    Gravitational radiation from the galactic population of white dwarf binaries is expected to produce a background signal in the Laser Interferometer Space Antenna (LISA) frequency band. At frequencies below 1 mHz, this signal is expected to be confusion limited and has been approximated as Gaussian noise. At frequencies above about 5 mHz, the signal will consist of separable individual sources. We have produced a simulation of the LISA data stream from a population of 90,000 galactic binaries in the frequency range between 1 and 5 mHz. This signal is compared with the simulated signal from globular cluster populations of binaries. Notable features of the simulation as well as potential data analysis schemes for extracting information are presented
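
    A data stream of this kind can be mocked up as a superposition of monochromatic sources plus Gaussian noise. All parameter distributions below are invented for illustration and are not the authors' binary population model.

```python
import numpy as np

# Hedged illustration: sum many sinusoids with random frequencies in the
# 1-5 mHz band (as in the paper), random amplitudes and phases, then add
# white instrument noise. Sample rate and noise level are assumptions.

rng = np.random.default_rng(0)
n_sources, fs, duration = 1000, 0.1, 1.0e5   # 0.1 Hz sampling, 1e5 s span
t = np.arange(0, duration, 1.0 / fs)         # 10,000 samples

freqs = rng.uniform(1e-3, 5e-3, n_sources)   # 1-5 mHz band
amps = rng.exponential(1.0, n_sources)       # invented amplitude spread
phases = rng.uniform(0.0, 2 * np.pi, n_sources)

signal = sum(a * np.sin(2 * np.pi * f * t + p)
             for a, f, p in zip(amps, freqs, phases))
noise = rng.normal(0.0, 0.5, t.size)
stream = signal + noise

print(stream.shape)  # one simulated time series covering the band
```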

  8. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties (United States)

    Wells, J. R.; Kim, J. B.


    Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from the published literature, often from a single published value, and the parameters are then "tuned" using somewhat arbitrary trial-and-error methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from the published literature and, where those were not available, by using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for a spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
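
    A generic simulated-annealing search of the sort described can be sketched as follows. The objective here is a toy quadratic, not a BIOMAP run, and the linear cooling schedule and Gaussian step size are assumptions.

```python
import math
import random

# Hedged sketch: simulated annealing over a bounded parameter vector.
# Downhill moves are always accepted; uphill moves are accepted with a
# Boltzmann probability that shrinks as the temperature cools.

def anneal(objective, bounds, n_iter=5000, t0=1.0, seed=1):
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    cost = objective(x)
    best, best_cost = list(x), cost
    for i in range(n_iter):
        temp = t0 * (1.0 - i / n_iter)          # linear cooling schedule
        j = rng.randrange(len(x))               # perturb one parameter
        lo, hi = bounds[j]
        cand = list(x)
        cand[j] = min(hi, max(lo, x[j] + rng.gauss(0.0, 0.1 * (hi - lo))))
        c = objective(cand)
        if c < cost or (temp > 0 and rng.random() < math.exp(-(c - cost) / temp)):
            x, cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
    return best, best_cost

# Toy objective with its minimum (0) at (0.3, 0.7) inside the bounds.
obj = lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2
best, best_cost = anneal(obj, [(0.0, 1.0), (0.0, 1.0)])
print(best_cost)  # should be close to 0
```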

  9. Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach (United States)

    Nelson, Bron; Yang Yang, Fan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Henze, Chris; Karacalioglu, Arif Goktug; O'Toole, Conor; Swenson, Jason; Stupl, Jan


    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time-steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space
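    The need for variable, second-level time steps around conjunctions can be illustrated with a toy coarse-then-fine screen. This sketch assumes two coplanar circular orbits and an analytic propagator, unlike the high-precision propagation described above:

```python
import math

MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def pos(a, theta0, t):
    """Position (km) on a circular, coplanar orbit of radius a at time t (s)."""
    n = math.sqrt(MU / a ** 3)        # mean motion [rad/s]
    th = theta0 + n * t
    return a * math.cos(th), a * math.sin(th)

def dist(a1, th1, a2, th2, t):
    x1, y1 = pos(a1, th1, t)
    x2, y2 = pos(a2, th2, t)
    return math.hypot(x1 - x2, y1 - y2)

def closest_approach(a1, th1, a2, th2, horizon=6000.0, coarse=60.0, fine=1.0):
    """Coarse 60 s scan over the horizon, then a 1 s refinement pass."""
    t_best = min((i * coarse for i in range(int(horizon / coarse) + 1)),
                 key=lambda t: dist(a1, th1, a2, th2, t))
    lo = max(0.0, t_best - coarse)
    n_fine = int(2 * coarse / fine)
    t_ref = min((lo + i * fine for i in range(n_fine + 1)),
                key=lambda t: dist(a1, th1, a2, th2, t))
    return t_ref, dist(a1, th1, a2, th2, t_ref)

# two objects in nearly identical ~7000 km orbits, slightly out of phase
t_ca, d_ca = closest_approach(7000.0, 0.0, 7005.0, 0.004)
```

    A real screen would propagate fully perturbed three-dimensional orbits; the point here is only the two-resolution search, which keeps one-second time steps affordable by applying them only near candidate close approaches.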

  10. State space relay modeling and simulation using the ElectroMagnetic Transients Program and its transient analysis of control systems capability

    International Nuclear Information System (INIS)

    Domijan, A.D. Jr.; Emami, M.V.


    This paper reports on a simulation of a MHO distance relay developed to study the effects of its operation under various system conditions. The simulation is accomplished using a state space approach and a modeling technique based on the ElectroMagnetic Transients Program and its Transient Analysis of Control Systems capability. Furthermore, the simulation results are compared with those obtained in another independent study, as a control, to validate the results. A data code for the practical utilization of this simulation is given.

  11. Time Accurate Unsteady Pressure Loads Simulated for the Space Launch System at a Wind Tunnel Condition (United States)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, Bil; Streett, Craig L.; Glass, Christopher E.; Schuster, David M.


    Using the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics code, an unsteady, time-accurate flow field about a Space Launch System configuration was simulated at a transonic wind tunnel condition (Mach = 0.9). Delayed detached eddy simulation combined with Reynolds-averaged Navier-Stokes and a Spalart-Allmaras turbulence model was employed for the simulation. A second-order accurate time evolution scheme was used to simulate the flow field, with simulated times ranging from a minimum of 0.2 seconds to as much as 1.4 seconds. Data were collected at 480 pressure tap locations, 139 of which matched those on a 3% scale wind tunnel model tested in the Transonic Dynamics Tunnel (TDT) facility at NASA Langley Research Center. Comparisons between computation and experiment showed agreement within 5% in terms of location for peak RMS levels, and within 20% for frequency and magnitude of power spectral densities. Grid resolution and time step sensitivity studies were performed to identify methods for improved accuracy in comparisons to wind tunnel data. With limited computational resources, accurate trends for reduced vibratory loads on the vehicle were observed. Exploratory methods, such as minimizing computed errors based on CFL number and sub-iterations, evaluating the frequency content of the unsteady pressures, and evaluating oscillatory shock structures, were used in this study to enhance computational efficiency and solution accuracy. These techniques enabled the development of a set of best practices for the evaluation of future flight vehicle designs in terms of vibratory loads.

  12. Analysis of Waves in Space Plasma (WISP) near field simulation and experiment (United States)

    Richie, James E.


    The WISP payload, scheduled for a 1995 Space Transportation System (shuttle) flight, will include a large power transmitter on board operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results. An investigation of the accuracy of the modeling approach is also included. The report begins with a description of the WISP experiment, followed by a description of the model used to simulate the cargo bay. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. A discussion of the methods used to verify the accuracy of the model illustrates appropriate approaches for obtaining this information. Finally, suggestions for future work are provided.

  13. Optimal design of a composite space shield based on numerical simulations

    International Nuclear Information System (INIS)

    Son, Byung Jin; Yoo, Jeong Hoon; Lee, Min Hyung


    In this study, an optimal design of a stuffed Whipple shield is proposed using numerical simulations and a new penetration criterion. The target model was selected based on the shield model used in the Columbus module of the International Space Station. Because experimental results can be obtained only in the low-velocity region below 7 km/s, the ballistic limit curve (BLC) in the high-velocity region above 7 km/s must be derived by numerical simulation. AUTODYN-2D, a commercial hydrocode package, was used for the nonlinear transient analysis of the hypervelocity impact. The smoothed particle hydrodynamics (SPH) method was applied to the projectile and bumper modeling to represent the debris cloud generated after the impact. The numerical simulation model and the selected material properties were validated through a quantitative comparison between numerical and experimental results. A new criterion to determine whether penetration occurs is proposed from kinetic energy analysis by numerical simulation in the velocity region over 7 km/s. A parameter optimization process was performed to improve the protection ability at a specific condition through the design of experiments (DOE) method and response surface methodology (RSM). The performance of the proposed optimal design was numerically verified.
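    The DOE/RSM step can be illustrated with a one-factor quadratic response surface. The sample points below are hypothetical, not data from the shield simulations; the fit solves the 3x3 normal equations directly:

```python
# Minimal response-surface sketch: fit y = b0 + b1*x + b2*x^2 to DOE samples
# by least squares, then locate the stationary point of the fitted surface.

def fit_quadratic(xs, ys):
    """Least-squares quadratic fit via Gaussian elimination on X^T X b = X^T y."""
    X = [[1.0, x, x * x] for x in xs]
    # assemble the normal equations
    A = [[sum(X[k][i] * X[k][j] for k in range(len(xs))) for j in range(3)]
         for i in range(3)]
    b = [sum(X[k][i] * ys[k] for k in range(len(xs))) for i in range(3)]
    # forward elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

# hypothetical DOE points: design variable (e.g. a thickness in mm) vs. response
xs = [1.0, 1.5, 2.0, 2.5, 3.0]
ys = [9.1, 6.2, 4.9, 4.8, 6.1]
b0, b1, b2 = fit_quadratic(xs, ys)
x_opt = -b1 / (2 * b2)   # stationary point of the fitted surface
```

    In a full RSM study, the surface would be multivariate and the DOE points chosen by a formal design (e.g. central composite), but the fit-then-optimize pattern is the same.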

  14. Simulating Coupling Complexity in Space Plasmas: First Results from a new code (United States)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.


    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing three simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work; for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will enormously advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address three Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal

  15. Three dimensional simulations of space charge dominated heavy ion beams with applications to inertial fusion energy

    International Nuclear Information System (INIS)

    Grote, D.P.


    Heavy ion fusion requires the injection, transport, and acceleration of high-current beams. Detailed simulation of such beams requires fully self-consistent space charge fields and three dimensions. WARP3D, developed for this purpose, is a particle-in-cell plasma simulation code optimized to work within the framework of an accelerator's lattice of accelerating, focusing, and bending elements. The code has been used to study several test problems and for the simulation and design of experiments. Two applications are drift compression experiments on the MBE-4 facility at LBL and the design of the electrostatic quadrupole injector for the proposed ILSE facility. With aggressive drift compression on MBE-4, anomalous emittance growth was observed. Simulations carried out to examine possible causes showed that essentially all of the emittance growth is a result of external forces on the beam and not of internal beam space-charge fields. The dominant external forces are the dodecapole component of the focusing fields, the image forces on the surrounding pipe and conductors, and the octopole fields that result from the structure of the quadrupole focusing elements. The goal of the design of the electrostatic quadrupole injector is to produce a beam of as low an emittance as possible. The simulations show that the dominant effects that increase the emittance are the nonlinear octopole fields and the energy effect (fields in the axial direction that are off-axis). Injectors were designed that minimize the beam envelope in order to reduce the effect of the nonlinear fields, and alterations to the quadrupole structure that reduce the nonlinear fields further were examined. Comparisons with a scaled experiment resulted in very good agreement.

  16. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand


    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight, and potential upgrades to JWST WFS&C, will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented-aperture telescopes. Beyond JWST, we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  17. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R


    This NASA-funded project is developing a Monte Carlo-based computer simulation of the radiation environment in space. With funding only in place since the end of May 2000, the study is still in an early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages: the radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end product will be a Monte Carlo-based code that will complement existing analytic codes such as BRYNTRN/HZETRN, presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  18. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin


    Living in reduced gravitational environments for a prolonged duration, such as on a flyby mission to Mars or an extended stay at the International Space Station, affects the human body, in particular the spine. As the spine adapts to spaceflight, morphological and physiological changes compromise the mechanical integrity of the spinal column, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine was developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adapted spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites in crew members are predicted for a typical 9-day mission.

  19. Simulation of the space debris environment in LEO using a simplified approach (United States)

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico


    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach, so in each scenario different objects are fragmented and contribute to a different version of the space debris environment. The results of the single Monte-Carlo runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, the average of all MC runs together with its standard deviation is generally used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS), a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources; natural decay and post-mission disposal measures are the only sink mechanisms. This approach reduces the computational cost tremendously, at the price of a few simplifications. The model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are further divided into diameter bins. As an extension to a previously presented model, eccentricity has additionally been taken into account with 67 eccentricity bins. While the set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper, parameters have been derived so that the model is able to reflect the results of the numerical MC
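    A minimal sketch of such a source-sink model illustrates the explicit Euler integration over altitude shells. All rates, shell counts, and initial populations below are hypothetical placeholders; the actual model additionally bins objects by diameter and eccentricity and uses calibrated parameters:

```python
# Toy source-sink debris model: 5 altitude shells, two object classes
# (intact bodies, fragments); every rate here is a hypothetical placeholder.
N_SHELLS = 5
launch = [0.0, 2.0, 5.0, 3.0, 1.0]            # intact launches per year per shell
decay_rate = [0.30, 0.15, 0.08, 0.04, 0.02]   # 1/yr; decay feeds the shell below
collision_k = 1e-6                            # collisions / (object * object * yr)
frags_per_collision = 100.0

def derivatives(intact, frags):
    """Right-hand side of the source-sink equations for each shell."""
    di, df = [0.0] * N_SHELLS, [0.0] * N_SHELLS
    for s in range(N_SHELLS):
        total = intact[s] + frags[s]
        coll = collision_k * intact[s] * total        # collision rate in shell s
        di[s] += launch[s] - decay_rate[s] * intact[s] - coll
        df[s] += frags_per_collision * coll - decay_rate[s] * frags[s]
        if s + 1 < N_SHELLS:                          # inflow by decay from above
            di[s] += decay_rate[s + 1] * intact[s + 1]
            df[s] += decay_rate[s + 1] * frags[s + 1]
    return di, df

def euler(intact, frags, dt, years):
    """Explicit Euler integration of the shell populations over a time span."""
    for _ in range(int(round(years / dt))):
        di, df = derivatives(intact, frags)
        intact = [x + dt * d for x, d in zip(intact, di)]
        frags = [x + dt * d for x, d in zip(frags, df)]
    return intact, frags

i1, f1 = euler([50.0] * N_SHELLS, [200.0] * N_SHELLS, dt=0.1, years=100.0)
```

    Because each shell only tracks population counts rather than individual orbits, a century-long run costs a few thousand arithmetic updates instead of the propagation of thousands of objects.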

  20. Simulations of VLBI observations of a geodetic satellite providing co-location in space (United States)

    Anderson, James M.; Beyerle, Georg; Glaser, Susanne; Liu, Li; Männel, Benjamin; Nilsson, Tobias; Heinkelmann, Robert; Schuh, Harald


    We performed Monte Carlo simulations of very-long-baseline interferometry (VLBI) observations of Earth-orbiting satellites incorporating co-located space-geodetic instruments in order to study how well the VLBI frame and the spacecraft frame can be tied using such measurements. We simulated VLBI observations of the spacecraft, time-of-flight (TOF) measurements using a time-encoded signal in the spacecraft transmission (similar in concept to precise point positioning), and differential VLBI (D-VLBI) observations using angularly nearby quasar calibrators, and compared their relative performance. We used the proposed European Geodetic Reference Antenna in Space (E-GRASP) mission as an initial test case for our software. We found that the standard VLBI technique is limited, in part, by the present lack of knowledge of the absolute offset of VLBI time from Coordinated Universal Time at the level of microseconds. TOF measurements are better able to overcome this problem and provide frame ties with uncertainties in translation and scale nearly a factor of three smaller than those yielded by VLBI measurements. If the absolute time offset issue can be resolved by external means, the VLBI results can be significantly improved and come close to providing 1 mm accuracy in the frame tie parameters. D-VLBI observations with optimum performance assumptions provide roughly a factor of two higher uncertainties for the E-GRASP orbit. We additionally simulated how station and spacecraft position offsets affect the frame tie performance.

  1. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan


    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  2. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)


    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  3. Simulation of space-charge effects in an ungated GEM-based TPC

    Energy Technology Data Exchange (ETDEWEB)

    Böhmer, F.V.; Ball, M.; Dørheim, S.; Höppner, C.; Ketzer, B.; Konorov, I.; Neubert, S.; Paul, S.; Rauch, J.; Vandenbroucke, M.


    A fundamental limit to the application of Time Projection Chambers (TPCs) in high-rate experiments is the accumulation of slowly drifting ions in the active gas volume, which compromises the homogeneity of the drift field and hence the detector resolution. Conventionally, this problem is overcome by the use of ion-gating structures. This method, however, introduces large dead times and restricts trigger rates to a few hundred per second. The ion gate can be eliminated from the setup by the use of Gas Electron Multiplier (GEM) foils for gas amplification, which intrinsically suppress the backflow of ions. This makes the continuous operation of a TPC at high rates feasible. In this work, Monte Carlo simulations of the buildup of ion space charge in a GEM-based TPC and the correction of the resulting drift distortions are discussed, based on realistic numbers for the ion backflow in a triple-GEM amplification stack. A TPC in the future P̄ANDA experiment at FAIR serves as an example for the experimental environment. The simulations show that space charge densities up to 65 fC cm⁻³ are reached, leading to electron drift distortions of up to 10 mm. The application of a laser calibration system to correct these distortions is investigated. Based on full simulations of the detector physics and response, we show that it is possible to correct for the drift distortions and to maintain the good momentum resolution of the GEM-TPC.

  4. Simulation of space-charge effects in an ungated GEM-based TPC

    International Nuclear Information System (INIS)

    Böhmer, F.V.; Ball, M.; Dørheim, S.; Höppner, C.; Ketzer, B.; Konorov, I.; Neubert, S.; Paul, S.; Rauch, J.; Vandenbroucke, M.


    A fundamental limit to the application of Time Projection Chambers (TPCs) in high-rate experiments is the accumulation of slowly drifting ions in the active gas volume, which compromises the homogeneity of the drift field and hence the detector resolution. Conventionally, this problem is overcome by the use of ion-gating structures. This method, however, introduces large dead times and restricts trigger rates to a few hundred per second. The ion gate can be eliminated from the setup by the use of Gas Electron Multiplier (GEM) foils for gas amplification, which intrinsically suppress the backflow of ions. This makes the continuous operation of a TPC at high rates feasible. In this work, Monte Carlo simulations of the buildup of ion space charge in a GEM-based TPC and the correction of the resulting drift distortions are discussed, based on realistic numbers for the ion backflow in a triple-GEM amplification stack. A TPC in the future P̄ANDA experiment at FAIR serves as an example for the experimental environment. The simulations show that space charge densities up to 65 fC cm⁻³ are reached, leading to electron drift distortions of up to 10 mm. The application of a laser calibration system to correct these distortions is investigated. Based on full simulations of the detector physics and response, we show that it is possible to correct for the drift distortions and to maintain the good momentum resolution of the GEM-TPC.

  5. Effects of repeated simulated removal activities on feral swine movements and space use (United States)

    Fischer, Justin W.; McMurtry, Dan; Blass, Chad R.; Walter, W. David; Beringer, Jeff; VerCauteren, Kurt C.


    Abundance and distribution of feral swine (Sus scrofa) in the USA have increased dramatically during the last 30 years. Effective measures are needed to control and eradicate feral swine populations without displacing animals over wider areas. Our objective was to investigate the effects of repeated simulated removal activities on feral swine movements and space use. We analyzed location data from 21 feral swine that we fitted with Global Positioning System harnesses in southern Missouri, USA. Various removal activities were applied over time to eight feral swine before lethal removal, including being trapped and released, chased with dogs, chased by a hunter, and chased with a helicopter. We found that core space-use areas were reduced following the first removal activity, whereas overall space-use areas and diurnal movement distances increased following the second removal activity. Mean geographic centroid shifts did not differ between pre- and post-periods for either the first or second removal activity. Our information on feral swine movements and space use precipitated by human removal activities, such as hunting, trapping, and chasing with dogs, helps fill a knowledge void and will aid wildlife managers. Strategies to optimize management are needed to reduce feral swine populations while preventing enlarged home ranges and displacement of individuals, which could lead to increased disease transmission risk and human-feral swine conflict in adjacent areas.

  6. Laboratory simulation of the formation of an ionospheric depletion using the Keda Space Plasma EXperiment (KSPEX)

    Directory of Open Access Journals (Sweden)

    Pengcheng Yu


    In this work, the formation of an ionospheric depletion was simulated in a controlled laboratory plasma. The experiment was performed by releasing the chemical substance sulfur hexafluoride (SF6) into a pure argon discharge plasma. Results indicate that the plasma parameters change significantly after the release: the electron density is nearly depleted due to the sulfur hexafluoride-electron attachment reaction, and the electron temperature and space potential increase as a consequence of the decreased electron density. Compared to traditional active release experiments, the laboratory scheme offers higher efficiency, a high repetition rate, and simpler measurement of the varying plasma parameters after the chemical release. It can therefore effectively bridge theoretical work and real space observations.

  7. Contamination Control Assessment of the World's Largest Space Environment Simulation Chamber (United States)

    Snyder, Aaron; Henry, Michael W.; Grisnik, Stanley P.; Sinclair, Stephen M.


    The Space Power Facility's thermal vacuum test chamber is the largest chamber in the world capable of providing an environment for space simulation. To improve performance and meet the stringent requirements of a wide customer base, significant modifications were made to the vacuum chamber. These include major changes to the vacuum system and numerous enhancements to the chamber's unique polar crane, with a goal of providing high cleanliness levels. The significance of these changes and modifications is discussed in this paper. In addition, the composition and arrangement of the pumping system and its impact on molecular back-streaming are discussed in detail. Molecular contamination measurements obtained with a TQCM and witness wafers during two recent integrated system tests of the chamber are presented and discussed. Finally, concluding remarks are presented.

  8. SPACE code simulation of ATLAS DVI line break accident test (SB DVI 08 Test)

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Sang Gyu [KHNP, Daejeon (Korea, Republic of)


    APR1400 has adopted new safety design features: four mechanically independent DVI (Direct Vessel Injection) systems and fluidic devices in the safety injection tanks (SITs). Hence, the DVI line break accident has to be evaluated as one of the small break LOCA (SBLOCA) scenarios to ensure the safety of APR1400. KAERI has performed a DVI line break test (SB DVI 08) using the ATLAS (Advanced Thermal Hydraulic Test Loop for Accident Simulation) facility, an integral effect test facility for APR1400. The test result shows that the core collapsed water level decreased before loop seal clearance, so that core uncovery occurred. At this time, the peak cladding temperature (PCT) increased rapidly even though emergency core cooling (ECC) water was injected from the safety injection pump (SIP). This test result is useful for supporting safety analysis using a thermal hydraulic safety analysis code and increases the understanding of SBLOCA phenomena in APR1400. An SBLOCA evaluation methodology for APR1400 is now being developed using the SPACE code. The objective of this methodology development is to establish a conservative evaluation methodology in accordance with Appendix K of 10 CFR 50. The ATLAS SB DVI 08 test was selected for the evaluation of the SBLOCA methodology using the SPACE code. Before applying the conservative models and correlations, a benchmark calculation of the test is performed with best estimate models and correlations to verify the SPACE code capability. This paper deals with the benchmark calculation results of the ATLAS SB DVI 08 test. Calculated results for the major hydraulic variables are compared with measured data. Finally, this paper assesses the performance of the SPACE code in simulating the integral effect test of an SBLOCA.

  9. Frequency Domain Modeling and Simulation of DC Power Electronic Systems Using Harmonic State Space Method

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Blaabjerg, Frede


    For efficiency and simplicity, dc power electronic systems are widely used in a variety of applications such as electric vehicles, ships, aircraft and also in homes. In these systems, there can be a number of dynamic interactions and frequency couplings between network components… Converters with different switching frequencies, together with harmonics from ac-dc converters, make harmonics and frequency coupling both problems of the ac system and challenges of the dc system. This paper presents a modeling and simulation method for a large dc power electronic system by using Harmonic State Space (HSS) modeling…
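
    The frequency-domain idea behind such modeling can be illustrated, in a much-reduced form, by evaluating the transfer function of an averaged state-space converter model. The buck-converter values below (Vin, L, C, R) are hypothetical, and the full HSS method additionally couples harmonics of the switching frequency, which this sketch omits.

```python
import numpy as np

# Averaged small-signal model of a dc buck converter, duty cycle -> output voltage.
# Component values are hypothetical, chosen only for illustration.
Vin, L, C, R = 24.0, 100e-6, 470e-6, 5.0

A = np.array([[0.0,     -1.0 / L],
              [1.0 / C, -1.0 / (R * C)]])
B = np.array([[Vin / L],
              [0.0]])
Cout = np.array([[0.0, 1.0]])

def freq_response(w):
    """H(jw) = Cout (jwI - A)^-1 B for the averaged state-space model."""
    return (Cout @ np.linalg.solve(1j * w * np.eye(2) - A, B))[0, 0]

# The dc gain of the ideal duty-to-output transfer function equals Vin.
print(abs(freq_response(1e-9)))  # ~24.0
```

    A full HSS model would replace the scalar frequency sweep with a block-Toeplitz system coupling the harmonics, but the evaluation of H(jw) proceeds in the same way.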

  10. A space simulation test chamber development for the investigation of radiometric properties of materials (United States)

    Enlow, D. L.


    The design, fabrication, and preliminary utilization of a thermal vacuum space simulation facility are discussed. The facility was required to perform studies on the thermal radiation properties of materials. A test chamber was designed to provide high pumping speed, low pressure, a low photon level radiation background (via high emissivity, coated, finned cryopanels), internal heat sources for rapid warmup, and rotary and linear motion of the irradiated materials specimen. The radiation detection system consists of two wideband infrared photoconductive detectors, their cryogenic coolers, a cryogenic-cooled blackbody source, and a cryogenic-cooled optical radiation modulator.

  11. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh


    This paper presents a toolbox for space environment modelling in SIMULINK that facilitates the development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, the Earth gravity field, the Earth magnetic field and eclipse. The structure and facilities within the toolbox are described and exemplified using a student satellite case (AAUSAT-II). The validity of the developed models is confirmed by comparing the simulation results with realistic data obtained from the Danish…
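
    As a flavor of what such environment sub-models contain, the sketch below evaluates the magnitude of a centred-dipole geomagnetic field, a common simplification in ADCS toolboxes (a real toolbox would likely use a higher-order IGRF model). The constants are standard textbook values.

```python
import math

RE = 6_371_000.0   # Earth mean radius [m]
B0 = 3.12e-5       # equatorial surface field of the dipole model [T]

def dipole_field_magnitude(altitude_m, mag_latitude_rad):
    """|B| of the centred-dipole geomagnetic model at a given altitude
    and magnetic latitude (a common ADCS toolbox simplification)."""
    r = RE + altitude_m
    return B0 * (RE / r) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(mag_latitude_rad) ** 2)

# Field seen by a LEO satellite at 700 km over the magnetic equator:
print(dipole_field_magnitude(700e3, 0.0))  # ~2.3e-5 T
```

    The 1/r^3 fall-off and the factor-of-two pole-to-equator variation are the features a magnetometer-based ADCS must account for.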

  12. Design and simulation of a planar micro-optic free-space receiver (United States)

    Nadler, Brett R.; Hallas, Justin M.; Karp, Jason H.; Ford, Joseph E.


    We propose a compact directional optical receiver for free-space communications, where a microlens array and micro-optic structures selectively couple light from a narrow incidence angle into a thin slab waveguide and then to an edge-mounted detector. A small lateral translation of the lenslet array controls the coupled input angle, enabling the receiver to select the transmitter source direction. We present the optical design and simulation of a 10 mm × 10 mm aperture receiver using a 30 μm thick silicon waveguide able to couple up to 2.5 Gbps modulated input to a 10 mm × 30 μm wide detector.
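
    The angle-selection mechanism can be sketched with a paraxial thin-lens relation: a lateral shift Δx of the lenslet array steers the accepted source direction to roughly arctan(Δx/f). The lenslet focal length below is a hypothetical value, not taken from the paper.

```python
import math

f_lenslet = 500e-6  # hypothetical lenslet focal length [m]

def selected_angle_deg(lateral_shift_m):
    """Input angle steered onto the coupling feature when the lenslet
    array is translated laterally (paraxial thin-lens approximation)."""
    return math.degrees(math.atan2(lateral_shift_m, f_lenslet))

# A 50 um lateral translation selects roughly a 5.7 degree source direction:
print(selected_angle_deg(50e-6))
```

    The micrometre-scale shifts needed for degree-scale steering are what make the approach compatible with compact actuators.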

  13. Solar concentrator panel and gore testing in the JPL 25-foot space simulator (United States)

    Dennison, E. W.; Argoud, M. J.


    The optical imaging characteristics of parabolic solar concentrator panels (or gores) have been measured using the optical beam of the JPL 25-foot space simulator. The simulator optical beam has been characterized, and the virtual source position and size have been determined. These data were used to define the optical test geometry. The point source image size and focal length have been determined for several panels. A flux distribution of a typical solar concentrator has been estimated from these data. Aperture photographs of the panels were used to determine the magnitude and characteristics of the reflecting surface errors. This measurement technique has proven to be highly successful at determining the optical characteristics of solar concentrator panels.

  14. Effect of empty buckets on coupled bunch instability in RHIC Booster: Longitudinal phase-space simulation

    International Nuclear Information System (INIS)

    Bogacz, S.A.; Griffin, J.E.; Khiari, F.Z.


    Excitation of large amplitude coherent dipole bunch oscillations by beam-induced voltages in spurious narrow resonances is simulated using a longitudinal phase-space tracking code (ESME). Simulation of the developing instability in a high intensity proton beam driven by a spurious parasitic resonance of the rf cavities allows one to estimate the final longitudinal emittance of the beam at the end of the cycle, which puts serious limitations on machine performance. The growth of the coupled bunch modes is significantly enhanced if a gap of missing bunches is present, an inherent feature of high intensity proton machines. A strong transient excitation of the parasitic resonance by the Fourier components of the beam spectrum resulting from the presence of the gap is suggested as a possible mechanism of this enhancement. 10 refs., 4 figs., 1 tab.
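
    The kind of turn-by-turn longitudinal tracking that codes like ESME perform can be sketched with the standard two-step rf kick / phase slip map for a stationary bucket. All machine parameters below are illustrative placeholders, not RHIC Booster values.

```python
import math

# One-turn map of the kind iterated by longitudinal phase-space tracking codes,
# here for a stationary rf bucket (synchronous phase = 0).
h = 1            # rf harmonic number
eta = -0.5       # phase-slip factor
beta2E = 1.0e9   # beta^2 * total energy [eV]
eV = 1.0e5       # peak rf voltage [eV]

def track(phi, dE, turns):
    traj = []
    for _ in range(turns):
        dE += eV * math.sin(phi)                    # rf kick
        phi += 2 * math.pi * h * eta * dE / beta2E  # phase slip per turn
        traj.append((phi, dE))
    return traj

traj = track(0.3, 0.0, 2000)
# A particle inside the bucket performs a bounded synchrotron oscillation.
print(max(abs(dE) for _, dE in traj))
```

    A coupled bunch instability study iterates many such bunches simultaneously and adds the beam-induced voltage of a resonator to the rf kick each turn.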

  15. Robotic Design Choice Overview using Co-simulation and Design Space Exploration

    DEFF Research Database (Denmark)

    Christiansen, Martin Peter; Larsen, Peter Gorm; Nyholm Jørgensen, Rasmus


    Rapid robotic system development has created a demand for multi-disciplinary methods and tools to explore and compare design alternatives. In this paper, we present a collaborative modelling technique that combines discrete-event models of controller software with continuous-time models of physical robot components. The proposed co-modelling method utilises the Vienna Development Method (VDM) and Matlab for discrete-event modelling and 20-sim for continuous-time modelling. The model-based development of a mobile robot mink feeding system is used to illustrate the collaborative modelling method. Simulations are used to evaluate the robot model output response in relation to operational demands. An example of a load carrying challenge in relation to the feeding robot is presented and a design space is defined with candidate solutions in both the mechanical and software domains. Simulation results…

  16. Tetrahedral-Mesh Simulation of Turbulent Flows with the Space-Time Conservative Schemes (United States)

    Chang, Chau-Lyan; Venkatachari, Balaji; Cheng, Gary C.


    Direct numerical simulations of turbulent flows are predominantly carried out using structured, hexahedral meshes despite decades of development in unstructured mesh methods. Tetrahedral meshes offer ease of mesh generation around complex geometries and the potential of an orientation-free grid that would provide unbiased small-scale dissipation and more accurate intermediate-scale solutions. However, due to the lack of consistent multi-dimensional numerical formulations in conventional schemes for triangular and tetrahedral meshes at the cell interfaces, numerical issues exist when flow discontinuities or stagnation regions are present. The space-time conservative conservation element solution element (CESE) method - due to its Riemann-solver-free shock capturing capabilities, non-dissipative baseline schemes, and flux conservation in time as well as space - has the potential to more accurately simulate turbulent flows using unstructured tetrahedral meshes. To pave the way towards accurate simulation of shock/turbulent boundary-layer interaction, a series of wave and shock interaction benchmark problems of increasing complexity are computed in this paper with triangular/tetrahedral meshes. Preliminary computations for the normal shock/turbulence interactions are carried out with a mesh that is relatively coarse by direct numerical simulation standards, in order to assess other effects such as boundary conditions and the necessity of a buffer domain. The results indicate that qualitative agreement with previous studies can be obtained for flows where strong shocks co-exist with unsteady waves displaying a broad range of scales, with a relatively compact computational domain and less stringent requirements for grid clustering near the shock. With the space-time conservation properties, stable solutions without spurious wave reflections can be obtained without a need for buffer domains near the outflow/farfield boundaries. Computational results for the…

  17. Use of Parallel Micro-Platform for the Simulation the Space Exploration (United States)

    Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen

    The purpose of this work is to create a parallel micro-platform that simulates the movements of space exploration in 3D. One of the innovations in this design is the application of a lever mechanism for transmission of the movement. The development of such a robot is a challenging task, very different from that of industrial manipulators, due to a totally different set of target requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model was developed on the Unigraphics computer-aided design platform, with which the geometric modelling of each component and of the final assembly (CAD) was performed, files for the computer-aided manufacture (CAM) of each piece were generated, and the kinematic simulation of the system was carried out, evaluating different driving schemes. The MATLAB aerospace toolbox was used, and an adaptive control module was created to simulate the system.

  18. Effects of simulated space environmental parameters on six commercially available composite materials

    International Nuclear Information System (INIS)

    Funk, J.G.; Sykes, G.F. Jr.


    The effects of simulated space environmental parameters on microdamage induced by the environment in a series of commercially available graphite-fiber-reinforced composite materials were determined. Composites with both thermoset and thermoplastic resin systems were studied. Low-Earth-orbit (LEO) exposures were simulated by thermal cycling; geosynchronous-orbit (GEO) exposures were simulated by electron irradiation plus thermal cycling. The thermal cycling temperature range was -250 F to either 200 F or 150 F. The upper limits of the thermal cycles were different to ensure that an individual composite material was not cycled above its glass transition temperature. Material response was characterized through assessment of the induced microcracking and its influence on mechanical property changes at both room temperature and -250 F. Microdamage was induced in both thermoset and thermoplastic advanced composite materials exposed to the simulated LEO environment. However, a 350 F cure single-phase toughened epoxy composite was not damaged during exposure to the LEO environment. The simulated GEO environment produced microdamage in all materials tested.

  19. Effects of incentives on psychosocial performances in simulated space-dwelling groups (United States)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Gasior, Eric D.; Spence, Kevin R.; Emurian, Henry H.

    Prior research with individually isolated 3-person crews in a distributed, interactive, planetary exploration simulation examined the effects of communication constraints and crew configuration changes on crew performance and psychosocial self-report measures. The present report extends these findings to a model of performance maintenance that operationalizes conditions under which disruptive affective responses by crew participants might be anticipated to emerge. Experiments evaluated the effects of changes in incentive conditions on crew performance and self-report measures in simulated space-dwelling groups. Crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crew performance effectiveness was unaffected by either positive or negative incentive conditions, while self-report measures were differentially affected—negative incentive conditions produced pronounced increases in negative self-report ratings and decreases in positive self-report ratings, while positive incentive conditions produced increased positive self-report ratings only. Thus, incentive conditions associated with simulated spaceflight missions can significantly affect psychosocial adaptation without compromising task performance effectiveness in trained and experienced crews.

  20. Vapor Space Corrosion Testing Simulating The Environment Of Hanford Double Shell Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Gray, J. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Garcia-Diaz, B. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Murphy, T. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hicks, K. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


    As part of an integrated program to better understand corrosion in the high level waste tanks, Hanford has been investigating corrosion at the liquid/air interface (LAI) and at higher areas in the tank vapor space. This research evaluated localized corrosion in the vapor space over Hanford double shell tank simulants to assess the impact of ammonia and the new minimum nitrite concentration limits, which are part of the broader corrosion chemistry limits. The findings from this study showed that the presence of ammonia gas (550 ppm) in the vapor space is sufficient to reduce corrosion over the short term (i.e., four months) for a Hanford waste chemistry (SY102 High Nitrate). These findings are in agreement with previous studies at both Hanford and SRS, which showed ammonia gas in the vapor space to be inhibitive. The presence of ammonia in the electrochemical test solution, however, was insufficient to inhibit pitting corrosion. The effect of the ammonia appears to be a function of the waste chemistry and may be more significant in waste with low nitrite concentrations. Since high levels of ammonia were found beneficial in previous studies, additional testing is recommended to assess the minimum concentration necessary for protection of carbon steel. The new minimum R value of 0.15 was found to be insufficient to prevent pitting corrosion in the vapor space. The pitting that occurred, however, did not progress over the four-month test; pits appeared to stop growing, which would indicate that pitting might not progress through the wall.

  1. Effects and mechanism on Kapton film under ozone exposure in a ground near space simulator (United States)

    Wei, Qiang; Yang, Guimin; Liu, Gang; Jiang, Haifu; Zhang, Tingting


    The effect of the near space environment on aircraft materials is a key part of air-and-space integration research. Ozone and aerodynamic fluids are important factors in the near space environment, and both significantly influence the performance of aircraft materials. In the present paper a simulated ozone environment was used to test polyimide material that was rotated at approximately 150-250 m/s to form an aerodynamic fluid field. The goal was to evaluate the performance evolution of materials under a combined environment of ozone molecular corrosion and aerodynamic fluids. The results show that corrosion and sputtering by ozone molecules leave Kapton films with a rugged "carpet-like" morphology and increased surface roughness. The rougher surface after ozone exposure increases optical diffuse reflection, expressed as lower optical transmittance and a gradual transition from light orange to brown. The mass loss test, XPS, and FTIR analysis show that the molecular chains on the surface of the Kapton film are destroyed: C-C bonds break to form small volatile molecules such as CO2 or CO, which are responsible for a linear increase in mass loss per unit area. The C-N and C-O structures also exhibit a weakening tendency under ozone exposure. The present paper explores an evaluation method for Kapton's adaptability under ozone exposure in the near space environment, and elucidates the corrosion mechanism and damage mode of the polyimide material under the combined action of ozone corrosion and the aerodynamic fluid. This work provides a methodology for studying materials in the near-space environment.

  2. MCNP6 simulation of reactions of interest to FRIB, medical, and space applications

    International Nuclear Information System (INIS)

    Mashnik, Stepan G.


    The latest production version of the Los Alamos Monte Carlo N-Particle transport code MCNP6 has been used to simulate a variety of particle-nucleus and nucleus-nucleus reactions of academic and applied interest to research at the Facility for Rare Isotope Beams (FRIB), medical isotope production, space-radiation shielding, cosmic-ray propagation, and accelerator applications, including several reactions induced by radioactive isotopes, analyzing production of both stable and radioactive residual nuclei. Here, we discuss examples of validation and verification of MCNP6 by comparison with recent neutron spectra measured at the Heavy Ion Medical Accelerator in Chiba, Japan; spectra of light fragments from several reactions measured recently at GANIL, France; INFN Laboratori Nazionali del Sud, Catania, Italy; and COSY of the Jülich Research Center, Germany; and cross sections of products from several reactions measured lately at GSI, Darmstadt, Germany; ITEP, Moscow, Russia; and LANSCE, LANL, Los Alamos, U.S.A. As a rule, MCNP6 provides quite good predictions for most of the reactions analyzed so far, allowing us to conclude that it can be used as a reliable and useful simulation tool for FRIB, medical, and space applications involving stable and radioactive isotopes. (author)

  3. Extended phase-space methods for enhanced sampling in molecular simulations: a review

    Directory of Open Access Journals (Sweden)

    Hiroshi Fujisaki


    Molecular Dynamics simulations are a powerful approach to study biomolecular conformational changes and protein-ligand, protein-protein and protein-DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system spends most of its time trapped by high energy barriers in restricted regions of configuration space. Over the years, several techniques have been designed to overcome this problem and enhance sampling. Here, we review a class of methods that rely on extending the set of dynamical variables of the system by adding extra ones associated with functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages of this approach and how it allows important regions of the free energy landscape to be sampled quickly via automatic exploration.
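
    The shared idea of these extended phase-space methods can be caricatured in one dimension: couple the physical coordinate to an auxiliary variable evolved at a higher temperature (as in TAMD), so that barrier crossings which are rare at the physical temperature become frequent. Everything below (potential, temperatures, friction constants) is an illustrative toy, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdamped Langevin dynamics of a coordinate x in a double well, coupled
# by a spring to an auxiliary variable z that is kept at a higher temperature.
kT_x, kT_z = 0.2, 2.0          # physical vs auxiliary temperatures
k_spring = 5.0                 # coupling spring constant
dt, gx, gz = 1e-3, 1.0, 10.0   # time step and friction coefficients

def dU_dx(x, z):               # d/dx [ (x^2 - 1)^2 + k/2 (x - z)^2 ]
    return 4 * x * (x * x - 1) + k_spring * (x - z)

def dU_dz(x, z):               # d/dz of the same coupled potential
    return -k_spring * (x - z)

x, z = -1.0, -1.0
xs = []
for _ in range(200_000):
    x += -dU_dx(x, z) / gx * dt + np.sqrt(2 * kT_x * dt / gx) * rng.standard_normal()
    z += -dU_dz(x, z) / gz * dt + np.sqrt(2 * kT_z * dt / gz) * rng.standard_normal()
    xs.append(x)

xs = np.array(xs)
# With the hot auxiliary variable, the trajectory should visit both wells.
print((xs > 0.5).any() and (xs < -0.5).any())
```

    At kT_x = 0.2 the bare barrier (height 1) would almost never be crossed; the hot, slowly relaxing z drags x across it repeatedly, which is the essence of the extended-variable trick.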

  4. Cosmological observations with a wide field telescope in space: Pixel simulations of EUCLID spectrometer

    International Nuclear Information System (INIS)

    Zoubian, Julien


    Observations of supernovae, the cosmic microwave background, and more recently measurements of baryon acoustic oscillations and weak lensing effects converge to a Lambda CDM model with an accelerating expansion of the present-day Universe. This model needs two dark components to fit the observations: dark matter and dark energy. Two approaches seem particularly promising to measure both the geometry of the Universe and the growth of dark matter structures: the analysis of weak distortions of distant galaxies by gravitational lensing and the study of baryon acoustic oscillations. Both methods require very large sky surveys of several thousand square degrees. In the context of the spectroscopic survey of the space mission EUCLID, dedicated to the study of the dark side of the universe, I developed a pixel simulation tool for analyzing instrumental performance. The proposed method can be summarized in three steps. The first step is to simulate the observables, i.e. mainly the sources on the sky. I worked out a new method, adapted for spectroscopic simulations, which makes it possible to mock an existing galaxy survey while ensuring that the distributions of the spectral properties of galaxies are representative of current observations, in particular the distribution of the emission lines. The second step is to simulate the instrument and produce images equivalent to the expected real images. Based on the pixel simulator of the HST, I developed a new tool to compute images of the spectroscopic channel of EUCLID. The new simulator can simulate PSFs with various energy distributions and detectors with differing pixels. The last step is the estimation of the performance of the instrument. Based on existing tools, I set up a pipeline for image processing and performance measurement. My main results were: 1) to validate the method by simulating an existing survey of galaxies, the WISP survey, 2) to determine the…

  5. Understanding the microscopic moisture migration in pore space using DEM simulation

    Directory of Open Access Journals (Sweden)

    Yuan Guo


    The deformation of the soil skeleton and the migration of pore fluid are the major factors relevant to the triggering of, and damage caused by, liquefaction. The influence of pore fluid migration during earthquakes has been demonstrated by recent model experiments and field case studies. Most current liquefaction assessment models are based on testing of isotropic liquefiable materials. However, the recent New Zealand earthquake shows much more severe damage than predicted by existing models. A fundamental cause has been attributed to embedded layers of low-permeability silts. The existence of these silt layers inhibits water migration under seismic loads, which accelerated liquefaction and caused a much larger settlement than predicted by existing theories. This study intends to understand the process of moisture migration in the pore space of sand using discrete element method (DEM) simulation. Simulations were conducted on consolidated undrained triaxial testing of sand, where a cylindrical sand sample was built and subjected to a constant confining pressure and axial loading. The porosity distribution was monitored during the axial loading process. The spatial distribution of porosity change was determined, which had a direct relationship with the distribution of excess pore water pressure. The non-uniform distribution of excess pore water pressure causes moisture migration; from this, the migration of pore water during the loading process can be estimated. The DEM simulation shows a few important observations: (1) external forces are mainly carried and transmitted by the particle chains of the soil sample; (2) the porosity distribution during loading is not uniform due to the non-homogeneous soil fabric (i.e. the initial particle arrangement and existence of particle chains); (3) excess pore water pressure develops differently at different loading stages. At the early stage of loading, zones with a high initial porosity feature higher…

  6. Toward multi-scale simulation of reconnection phenomena in space plasma (United States)

    Den, M.; Horiuchi, R.; Usami, S.; Tanaka, T.; Ogawa, T.; Ohtani, H.


    Magnetic reconnection is considered to play an important role in space phenomena such as substorms in the Earth's magnetosphere. It is well known that magnetic reconnection is controlled by a microscopic kinetic mechanism. The frozen-in condition is broken by particle kinetic effects, and collisionless reconnection is triggered when the current sheet is compressed to ion kinetic scales under the influence of external driving flow. On the other hand, the configuration of the magnetic field leading to the formation of the diffusion region is determined on macroscopic scales, and the topological change after reconnection is also expressed on macroscopic scales. Thus magnetic reconnection is a typical multi-scale phenomenon in which microscopic and macroscopic physics are strongly coupled. Recently Horiuchi et al. developed an effective resistivity model based on particle-in-cell (PIC) simulation results obtained in a study of collisionless driven reconnection and applied it to a global magnetohydrodynamics (MHD) simulation of substorms in the Earth's magnetosphere. They reproduced global substorm behavior, such as dipolarization and flux rope formation, by global three-dimensional MHD simulation. Usami et al. developed a multi-hierarchy simulation model in which macroscopic and microscopic physics are solved self-consistently and simultaneously. Based on the domain decomposition method, this model consists of three parts: an MHD algorithm for macroscopic global dynamics, a PIC algorithm for microscopic kinetic physics, and an interface algorithm to interlock the macro and micro hierarchies. They verified the interface algorithm by simulation of plasma injection flow. In their latest work, this model was applied to collisionless reconnection in an open system and magnetic reconnection was successfully observed. In this paper, we describe our approach to clarifying multi-scale phenomena and report the current status. Our recent study on extending the MHD domain to the global system is presented. We…

  7. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.


    With the ambitious goals to send manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation and reduced gravity. These new operating conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operating conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing physiology, biomechanics and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operating conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing and experiment design, and for reducing animal/human testing. A practical application of the Digital Astronaut to assess pre- and post-flight responses to exercise is illustrated, and the difficulty in matching true physiological responses is discussed.

  8. GNSS reflectometry aboard the International Space Station: phase-altimetry simulation to detect ocean topography anomalies (United States)

    Semmling, Maximilian; Leister, Vera; Saynisch, Jan; Zus, Florian; Wickert, Jens


    An ocean altimetry experiment using Earth-reflected GNSS signals has been proposed to the European Space Agency (ESA). It is part of the GNSS Reflectometry Radio Occultation Scatterometry (GEROS) mission that is planned aboard the International Space Station (ISS). Altimetric simulations are presented that examine the detection of ocean topography anomalies assuming GNSS phase delay observations. Such delay measurements are well established for positioning and are possible due to sufficient synchronization of GNSS receiver and transmitter. For altimetric purposes, delays of Earth-reflected GNSS signals can be observed, similar to radar altimeter signals. The advantage of GNSS is the synchronized separation of transmitter and receiver, which allows a significantly increased number of observations per receiver, with more than 70 GNSS transmitters currently in orbit. The altimetric concept has already been applied successfully to flight data recorded over the Mediterranean Sea. The presented altimetric simulation considers anomalies in the Agulhas current region which are obtained from the Regional Ocean Modeling System (ROMS). Suitable reflection events in an elevation range between 3° and 30° last about 10 min with ground track lengths >3000 km. Typical along-track footprints (1 s signal integration time) have a length of about 5 km. The reflection's Fresnel zone limits the footprint of coherent observations to a major axis extension between 1 and 6 km, depending on the elevation. The altimetric performance depends on the signal-to-noise ratio (SNR) of the reflection. Simulation results show that precision is better than 10 cm for an SNR of 30 dB, whereas it is worse than 0.5 m if the SNR goes down to 10 dB. Precision, in general, improves towards higher elevation angles. Critical biases are introduced by atmospheric and ionospheric refraction. Corresponding correction strategies are still under investigation.

  9. Experiments and simulation of a net closing mechanism for tether-net capture of space debris (United States)

    Sharf, Inna; Thomsen, Benjamin; Botta, Eleonora M.; Misra, Arun K.


    This research addresses the design and testing of a debris containment system for use in a tether-net approach to space debris removal. The tether-net active debris removal involves the ejection of a net from a spacecraft by applying impulses to masses on the net, subsequent expansion of the net, the envelopment and capture of the debris target, and the de-orbiting of the debris via a tether to the chaser spacecraft. To ensure a debris removal mission's success, it is important that the debris be successfully captured and then, secured within the net. To this end, we present a concept for a net closing mechanism, which we believe will permit consistently successful debris capture via a simple and unobtrusive design. This net closing system functions by extending the main tether connecting the chaser spacecraft and the net vertex to the perimeter and around the perimeter of the net, allowing the tether to actuate closure of the net in a manner similar to a cinch cord. A particular embodiment of the design in a laboratory test-bed is described: the test-bed itself is comprised of a scaled-down tether-net, a supporting frame and a mock-up debris. Experiments conducted with the facility demonstrate the practicality of the net closing system. A model of the net closure concept has been integrated into the previously developed dynamics simulator of the chaser/tether-net/debris system. Simulations under tether tensioning conditions demonstrate the effectiveness of the closure concept for debris containment, in the gravity-free environment of space, for a realistic debris target. The on-ground experimental test-bed is also used to showcase its utility for validating the dynamics simulation of the net deployment, and a full-scale automated setup would make possible a range of validation studies of other aspects of a tether-net debris capture mission.

  10. Dynamic simulation of space heating systems with radiators controlled by TRVs in buildings

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Baoping; Fu, Lin; Di, Hongfa [Department of Building Science, School of Architecture, Tsinghua University, Beijing 100084 (China)


    The objective of this paper is to develop a model for simulating the thermal and hydraulic behavior of space heating systems with radiators controlled by thermostatic radiator valves (TRVs) in multi-family buildings. This is done by treating the building and the heating system as a complete entity. Sub-models for rooms, radiators, TRVs, and the hydraulic network are derived. The suggested sub-models are then combined to form an integrated model by considering the interactions between them. The proposed model takes into account the heat transfer between neighboring rooms, the transport delay in the radiator, the self-adjusting function of the TRV, and the consumer's regulation behavior, as well as the hydraulic interactions between consumers. To test the model, two space heating systems in Beijing and Tianjin were investigated, and the model was validated under three operation modes. There was good agreement between the measured and simulated values for room temperature, return water temperature, and flow rate. A modeling analysis case was given based on an existing building and heating system. It was found that when the set value of the TRVs was kept at 2-3, a reduction in heat consumption of about 12.4% could be achieved, compared with the situation in which the TRVs were kept fully open. The water flow rate was an important index that truly reflected the heat load change. It was also noted that if the flow rate or supply water temperature changed considerably during the transport delay time in the radiator, ignoring the transport delay would introduce an obvious deviation in the simulation results. Additionally, when an apartment stopped using the heating system during a heating season, the heat consumption of its neighboring apartments would increase by about 6-14%. (author)
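The room/radiator/TRV coupling described above can be illustrated with a single-zone toy model. The lumped-capacitance equation, the proportional valve law, and every numerical parameter below are illustrative assumptions, not the paper's sub-models:

```python
import numpy as np

def simulate_room(hours=48, dt=60.0, T_out=-5.0, T_set=20.0):
    """Lumped-capacitance room heated by a radiator whose flow is throttled
    by a proportional thermostatic valve (TRV). All parameters are
    illustrative, not taken from the paper."""
    C_room = 5e6        # room thermal capacitance, J/K
    UA_loss = 120.0     # envelope loss coefficient, W/K
    UA_rad = 300.0      # radiator heat-transfer coefficient, W/K
    T_supply = 70.0     # supply water temperature, degC
    Kp = 0.5            # TRV proportional gain, 1/K
    T_room = 15.0
    history = []
    for _ in range(int(hours * 3600 / dt)):
        # TRV opening: proportional to the set-point error, clipped to [0, 1]
        opening = min(max(Kp * (T_set - T_room), 0.0), 1.0)
        q_rad = opening * UA_rad * (T_supply - T_room)   # radiator output, W
        q_loss = UA_loss * (T_room - T_out)              # envelope losses, W
        T_room += dt * (q_rad - q_loss) / C_room         # explicit Euler step
        history.append(T_room)
    return np.array(history)

temps = simulate_room()
print(f"final room temperature: {temps[-1]:.2f} degC")
```

At steady state the valve settles at a small opening that balances radiator output against envelope losses, a toy analogue of the self-adjusting TRV behavior the paper models per room.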

  11. Internal Flow Simulation of Enhanced Performance Solid Rocket Booster for the Space Transportation System (United States)

    Ahmad, Rashid A.; McCool, Alex (Technical Monitor)


    An enhanced performance solid rocket booster concept for the space shuttle system has been proposed. The concept booster will have strong commonality with the existing, proven, reliable four-segment Space Shuttle Reusable Solid Rocket Motors (RSRM) with individual component design (nozzle, insulator, etc.) optimized for a five-segment configuration. Increased performance is desirable to further enhance safety/reliability and/or increase payload capability. Performance increase will be achieved by adding a fifth propellant segment to the current four-segment booster and opening the throat to accommodate the increased mass flow while maintaining current pressure levels. One development concept under consideration is the static test of a "standard" RSRM with a fifth propellant segment inserted and appropriate minimum motor modifications. Feasibility studies are being conducted to assess the potential for any significant departure in component performance/loading from the well-characterized RSRM. An area of concern is the aft motor (submerged nozzle inlet, aft dome, etc.), where the altered internal flow resulting from the performance-enhancing features (25% increase in mass flow rate, higher Mach numbers, modified subsonic nozzle contour) may result in increased component erosion and char. To assess this issue and to define the minimum design changes required to successfully static test a fifth-segment RSRM engineering test motor, internal flow studies have been initiated. Internal aero-thermal environments were quantified in terms of conventional convective heating and discrete-phase alumina particle impact/concentration and accretion calculations via Computational Fluid Dynamics (CFD) simulation. Two sets of comparative CFD simulations of the RSRM and the five-segment concept motor were conducted with the commercial CFD code FLUENT. The first simulation involved a two-dimensional axi-symmetric model of the full motor, initial grain RSRM.
The second set of analyses

  12. Simulating the Effect of Space Vehicle Environments on Directional Solidification of a Binary Alloy (United States)

    Westra, D. G.; Heinrich, J. C.; Poirier, D. R.


    Space microgravity missions are designed to provide a microgravity environment for scientific experiments, but these missions cannot provide a perfect environment, due to vibrations caused by crew activity, on-board experiments, support systems (pumps, fans, etc.), periodic orbital maneuvers, and water dumps. Therefore, it is necessary to predict the impact of these vibrations on space experiments, prior to performing them. Simulations were conducted to study the effect of the vibrations on the directional solidification of a dendritic alloy. Finite element calculations were done with a simulator based on a continuum model of dendritic solidification, using the Fractional Step Method (FSM). The FSM splits the solution of the momentum equation into two steps: the viscous intermediate step, which does not enforce continuity; and the inviscid projection step, which calculates the pressure and enforces continuity. The FSM provides significant computational benefits for predicting flows in a directionally solidified alloy, compared to other methods presently employed, because of the efficiency gains in the uncoupled solution of velocity and pressure. A difficulty, not present with finite differences, arises when the interdendritic liquid reaches the eutectic temperature and concentration. When a node reaches the eutectic temperature, it is assumed that the solidification of the eutectic liquid continues at constant temperature until all the eutectic is solidified. With this approach, solidification is not achieved continuously across an element; rather, the element is not considered solidified until the eutectic isotherm overtakes the top nodes. For microgravity simulations, where the convection is driven by shrinkage, this introduces large variations in the fluid velocity. When the eutectic isotherm reaches a node, all the eutectic must be solidified in a short period, causing an abrupt increase in velocity.
To overcome this difficulty, we employed a scheme to numerically predict a more accurate value

  13. WRF simulation of a severe hailstorm over Baramati: a study into the space-time evolution (United States)

    Murthy, B. S.; Latha, R.; Madhuparna, H.


    The space-time evolution of a severe hailstorm that occurred over western India, as revealed by WRF-ARW simulations, is presented. We simulated a specific event centered over Baramati (18.15°N, 74.58°E, 537 m AMSL) on March 9, 2014. A physical mechanism, proposed as a conceptual model, highlights the role of multiple convective cells organizing through outflows into a cold-frontal-type flow that, in the presence of a low over the northern Arabian Sea, propagates from NW to SE, triggering deep convection and precipitation. A `U'-shaped cold pool encircled by a converging boundary forms to the north of Baramati due to precipitation behind the moisture convergence line, with strong updrafts (~15 m s-1) leading to convective clouds extending up to 8 km in a narrow region of 30 km. The outflows from the convective clouds merge with the opposing southerly or southwesterly winds from the Arabian Sea and southerly or southeasterly winds from the Bay of Bengal, resulting in moisture convergence (maximum 80 × 10-3 g kg-1 s-1). The vertical profile of the area-averaged moisture convergence over the cold pool shows strong convergence above 850 hPa and divergence near the surface, indicating elevated convection. Radar reflectivity (50-60 dBZ) and a vertical-vorticity maximum (~0.01-0.14 s-1) are observed along the convergence zone. Stratiform clouds ahead of the squall line, parallel wind flow at 850 hPa, and nearly perpendicular flow at higher levels relative to the squall line, as evidenced by relatively low and wide-spread reflectivity, suggest that the organizational mode of the squall line may be categorized as `Mixed Mode', where the northern part can be parallel stratiform while the southern part resembles leading stratiform. Simulated rainfall (grid scale 27 km) leads the observed rainfall by 1 h, while its magnitude is twice that of the observed rainfall (grid scale 100 km) derived from Kalpana-1. Thus, this study indicates that under synoptically favorable conditions

  14. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space. (United States)

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria


    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation (i.e. brain conductivity distributions) and stimulation (e.g. coil positioning and orientation) setup have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte-Carlo simulations to quantify how uncertainties in conductivity values propagate all the way to the induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed to be "ground truth". This framework can be leveraged to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance in all applications of TMS, both investigative and therapeutic.
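The Monte-Carlo uncertainty-propagation idea can be shown in miniature: sample an uncertain tissue conductivity and watch the spread propagate to an induced-current figure. The lognormal spread, the field value, and the simple relation J = sigma * E are illustrative assumptions, not the paper's head model:

```python
import numpy as np

# Toy Monte-Carlo propagation of conductivity uncertainty. The 20% relative
# uncertainty, the 100 V/m field, and the grey-matter conductivity value are
# illustrative ballpark numbers, not results from the study.
rng = np.random.default_rng(42)
n = 100_000
sigma_gm_mean = 0.33      # grey-matter conductivity, S/m (literature ballpark)
sigma_rel_sd = 0.20       # assumed 20% relative (lognormal) uncertainty
E_field = 100.0           # induced electric field magnitude, V/m (illustrative)

sigma = sigma_gm_mean * rng.lognormal(mean=0.0, sigma=sigma_rel_sd, size=n)
J = sigma * E_field       # induced current density, A/m^2

print(f"J mean = {J.mean():.2f} A/m^2, 95% interval = "
      f"[{np.percentile(J, 2.5):.2f}, {np.percentile(J, 97.5):.2f}]")
```

Even this one-parameter toy shows a wide dispersion around the "ground truth" value; the paper does this voxel-wise through a full anisotropic field simulation.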

  15. The Aouda.X space suit simulator and its applications to astrobiology. (United States)

    Groemer, Gernot E; Hauth, Stefan; Luger, Ulrich; Bickert, Klaus; Sattler, Birgit; Hauth, Eva; Föger, Daniel; Schildhammer, Daniel; Agerer, Christian; Ragonig, Christoph; Sams, Sebastian; Kaineder, Felix; Knoflach, Martin


    We have developed the space suit simulator Aouda.X, which is capable of reproducing the physical and sensory limitations a flight-worthy suit would impose on Mars. Based upon a Hard-Upper-Torso design, it has an advanced human-machine interface and a sensory network connected to an On-Board Data Handling system to increase situational awareness in the field. Although the suit simulator is not pressurized, the physical forces that lead to a reduced working envelope and physical performance are reproduced with a calibrated exoskeleton. This allows us to simulate various pressure regimes from 0.3 to 1 bar. Aouda.X has been tested in several laboratory and field settings, including sterile sampling at 2800 m altitude inside a glacial ice cave and a cryochamber at -110°C, and subsurface tests in connection with geophysical instrumentation relevant to astrobiology, including ground-penetrating radar, geoacoustics, and drilling. The communication subsystem allows for direct interaction with remote science teams via telemetry from a mission control center. Aouda.X as such is a versatile experimental platform for studying Mars exploration activities in a high-fidelity Mars analog environment, with a focus on astrobiology and operations research, that has been optimized to reduce the amount of biological cross contamination. We report on the performance envelope of the Aouda.X system and its operational limitations.

  16. In-Vessel Composting of Simulated Long-Term Missions Space-Related Solid Wastes (United States)

    Rodriguez-Carias, Abner A.; Sager, John; Krumins, Valdis; Strayer, Richard; Hummerick, Mary; Roberts, Michael S.


    Reduction and stabilization of solid wastes generated during space missions is a major concern for the Advanced Life Support - Resource Recovery program at the NASA Kennedy Space Center. Solid wastes provide substrates for pathogen proliferation, produce strong odor, and increase storage requirements during space missions. A five-period experiment was conducted to evaluate the Space Operation Bioconverter (SOB), an in-vessel composting system, as a biological processing technology to reduce and stabilize simulated long-term-mission space-related solid wastes (SRSW). For all periods, SRSW were sorted into components with fast (FBD) and slow (SBD) biodegradability. Uneaten food and plastic were used as the major FBD and SBD components, respectively. Compost temperature (C), CO2 production (%), mass reduction (%), and final pH were utilized as criteria to determine compost quality. In period 1, the SOB was loaded with a 55% FBD : 45% SBD mixture and was allowed to compost for 7 days. An eleven-day second composting period was conducted, loading the SOB with 45% pre-composted SRSW and 55% FBD. Periods 3 and 4 evaluated the use of styrofoam as a bulking agent and the substitution of regular plastic with degradable plastic on the composting characteristics of SRSW, respectively. The use of ceramic as a bulking agent and the relationship between initial FBD mass and heat production were investigated in period 5. Composting SRSW resulted in an acidic fermentation with a minor increase in compost temperature, low CO2 production, and slight mass reduction. Addition of styrofoam as a bulking agent and substitution of regular plastic with biodegradable plastic improved the composting characteristics of SRSW, as evidenced by higher pH, CO2 production, compost temperature and mass reduction. Using ceramic as a bulking agent and increasing the initial FBD mass (4.4 kg) did not improve the composting process.
In summary, the SOB is a potential biological technology for reduction and stabilization of mission space

  17. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions (United States)

    Hagerty, S.; Ellis, H., Jr.


    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization

  18. Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models (United States)

    Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas


    A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge on to the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally
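A toy version of the ABC-SubSim level structure, shrinking data-approximating regions with Metropolis moves confined to the current region, can be sketched for a scalar parameter. The Gaussian-mean inference problem and all tuning constants below are illustrative, and the paper's self-regulating proposal-variance strategy is not reproduced:

```python
import numpy as np

# Toy ABC by Subset Simulation: at each level, keep the p0 fraction of samples
# closest to the observed data summary, set the tolerance eps adaptively, and
# grow new samples with Metropolis moves accepted only inside the eps-region.
rng = np.random.default_rng(1)
y_obs = rng.normal(2.0, 0.5, size=20)          # "observed" data, true mean 2.0

def discrepancy(theta):
    y_sim = rng.normal(theta, 0.5, size=20)    # simulate data under theta
    return abs(y_sim.mean() - y_obs.mean())    # distance in summary space

N, p0, n_levels = 1000, 0.2, 5
theta = rng.uniform(-5, 5, size=N)             # samples from a uniform prior
rho = np.array([discrepancy(t) for t in theta])

for _ in range(n_levels):
    idx = np.argsort(rho)[: int(p0 * N)]       # best p0 fraction become seeds
    eps = rho[idx].max()                       # adaptive tolerance, this level
    new_theta, new_rho = [], []
    for t, r in zip(theta[idx], rho[idx]):
        for _ in range(int(1 / p0)):           # short Markov chain per seed
            t_prop = t + rng.normal(0.0, 0.3)  # fixed-variance proposal (toy)
            r_prop = discrepancy(t_prop)
            if -5.0 <= t_prop <= 5.0 and r_prop <= eps:
                t, r = t_prop, r_prop
            new_theta.append(t)
            new_rho.append(r)
    theta, rho = np.array(new_theta), np.array(new_rho)

print(f"approximate posterior mean = {theta.mean():.2f}")
```

Each level conditions on a tighter data-approximating region, so the sample cloud contracts toward parameter values whose simulated data match the observations.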

  19. Simulated Partners and Collaborative Exercise (SPACE) to boost motivation for astronauts: study protocol

    Directory of Open Access Journals (Sweden)

    Deborah L. Feltz


    Full Text Available Abstract Background Astronauts may have difficulty adhering to exercise regimens at vigorous intensity levels during long space missions. Vigorous exercise is important for aerobic and musculoskeletal health during space missions and afterwards. A key impediment to maintaining vigorous exercise is motivation. Ways of motivating astronauts to exercise at the levels necessary to mitigate reductions in musculoskeletal health and aerobic capacity have not been explored. The focus of Simulated Partners and Collaborative Exercise (SPACE) is to use recently documented motivation gains in task groups to heighten the exercise experience for participants, similar in age and fitness to astronauts, for vigorous exercise over a 6-month exercise regimen. A secondary focus is to determine the most effective features in simulated exercise partners for enhancing enjoyment, self-efficacy, and social connectedness. The aims of the project are to (1) create software-generated (SG) exercise partners and interface the software with a cycle ergometer; (2) pilot test design features of SG partners within a video exercise game (exergame); and (3) test whether exercising with an SG partner over a 24-week period, compared to exercising alone, leads to greater work effort, aerobic capacity, muscle strength, exercise adherence, and enhanced psychological parameters. Methods/Design This study was approved by the Institutional Review Board (IRB). Chronic exercisers, between the ages of 30 and 62, were asked to exercise on a cycle ergometer 6 days per week for 24 weeks using a routine consisting of alternating between moderate-intensity continuous and high-intensity interval sessions. Participants were assigned to one of three conditions: no partner (control), an always-faster SG partner, or an SG partner who was not always faster. Participants were told they could vary cycle ergometer output to increase or decrease intensity during the sessions. Mean change in cycle ergometer power (watts

  20. Numerical Simulation of Ionospheric Disturbances Generated by the Chelyabinsk and Tunguska Space Body Impacts (United States)

    Shuvalov, V. V.; Khazins, V. M.


    Numerical simulation of atmospheric disturbances during the first hours after the Chelyabinsk and Tunguska space body impacts has been carried out. The results of detailed calculations, including the stages of destruction, evaporation and deceleration of the cosmic body, the generation of atmospheric disturbances and their propagation over distances of thousands of kilometers, have been compared with the results of spherical explosions with energy equal to the kinetic energy of meteoroids. It has been shown that in the case of the Chelyabinsk meteorite, an explosive analogy provides acceptable dimensions of the perturbed region and the perturbation amplitude. With a more powerful Tunguska fall, the resulting atmospheric flow is very different from the explosive one; an atmospheric plume emerges that releases matter from the meteoric trace to an altitude of the order of a thousand kilometers.

  1. Simulation of transients with space-dependent feedback by coarse mesh flux expansion method

    International Nuclear Information System (INIS)

    Langenbuch, S.; Maurer, W.; Werner, W.


    For the simulation of the time-dependent behaviour of large LWR cores, even the most efficient Finite-Difference (FD) methods require a prohibitive amount of computing time to achieve results of acceptable accuracy. Static coarse-mesh (CM) solutions computed with a mesh size corresponding to the fuel element structure (about 20 cm) are at least as accurate as FD solutions computed with about 5 cm mesh size. For 3D calculations this results in a reduction of storage requirements by a factor of 60 and of computing costs by a factor of 40, relative to FD methods. These results have been obtained for pure neutronic calculations, where feedback is not taken into account. In this paper it is demonstrated that the method retains its accuracy in kinetic calculations as well, even in the presence of strong space-dependent feedback. (orig./RW)

  2. Simulating and assessing boson sampling experiments with phase-space representations (United States)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.


    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  3. Particle-in-cell/accelerator code for space-charge dominated beam simulation

    Energy Technology Data Exchange (ETDEWEB)


    Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including for example laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z) and transverse slice (x,y) descriptions, with both electrostatic and electro-magnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.
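To illustrate the kind of space-charge field solve such particle-in-cell codes perform each step, here is a minimal 1D electrostatic sketch in normalized units. The nearest-grid-point deposition and all parameters are illustrative simplifications; Warp itself uses far more sophisticated geometry and deposition schemes:

```python
import numpy as np

# Minimal 1D electrostatic PIC step: deposit charge, solve Poisson on the
# grid, gather the field at the particles, and push with leapfrog.
ng, L, n_part, dt = 64, 1.0, 10_000, 1e-3
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, n_part)             # particle positions
v = np.zeros(n_part)                      # cold beam, normalized units
q_over_m = -1.0
weight = L / n_part                       # charge carried by each macroparticle

# 1) charge deposition (nearest grid point, for brevity)
rho = np.zeros(ng)
np.add.at(rho, (x / dx).astype(int) % ng, weight / dx)
rho -= rho.mean()                         # uniform neutralizing background

# 2) periodic Poisson solve via FFT:  phi'' = -rho
k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
rho_hat = np.fft.fft(rho)
phi_hat = np.where(k != 0, rho_hat / np.maximum(k**2, 1e-30), 0.0)
E = np.real(np.fft.ifft(-1j * k * phi_hat))   # E = -d(phi)/dx

# 3) gather field at particles and push (leapfrog)
E_part = E[(x / dx).astype(int) % ng]
v += q_over_m * E_part * dt
x = (x + v * dt) % L
```

Space-charge-dominated beams simply repeat this deposit/solve/push loop with many more particles, higher-order shapes, and 2D/3D field solvers.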

  4. Infrared spectroscopic analysis of the effects of simulated space radiation on a polyimide (United States)

    Ferl, J. E.; Long, E. R., Jr.


    Infrared spectroscopic techniques have been used to study the effects of electron radiation on the polyimide PMDA-p,p′-ODA. The radiation exposures were made at various dose rates, for a total dose approximately equal to that for 30 years of exposure to electron radiation in geosynchronous earth orbit. At high dose rates the major effect was probably the formation of a polyisoimide or a charged quaternary amine, and at the low dose rates the effect was a reduction in the amount of aromatic ether linkage. In addition, the effects of dose rate for a small total dose were studied. Elevated temperatures occurred at high dose rates and were, in part, probably the cause of the radiation product. The data suggest that dose rates for accelerated simulations of the space environment should not exceed 100,000 rads/sec.

  5. PIC Simulations of Velocity-space Instabilities in a Decreasing Magnetic Field: Viscosity and Thermal Conduction (United States)

    Riquelme, Mario; Quataert, Eliot; Verscharen, Daniel


    We use particle-in-cell (PIC) simulations of a collisionless, electron-ion plasma with a decreasing background magnetic field B to study the effect of velocity-space instabilities on the viscous heating and thermal conduction of the plasma. If |B| decreases, the adiabatic invariance of the magnetic moment gives rise to pressure anisotropies with p_{||,j} > p_{⊥,j} (p_{||,j} and p_{⊥,j} represent the pressure of species j (electron or ion) parallel and perpendicular to B). Linear theory indicates that, for sufficiently large anisotropies, different velocity-space instabilities can be triggered. These instabilities in principle have the ability to pitch-angle scatter the particles, limiting the growth of the anisotropies. Our simulations focus on the nonlinear, saturated regime of the instabilities. This is done through the permanent decrease of |B| by an imposed plasma shear. We show that, in the regime 2 ≲ β_j ≲ 20 (β_j ≡ 8π p_j/|B|²), the saturated ion and electron pressure anisotropies are controlled by the combined effect of the oblique ion firehose and the fast magnetosonic/whistler instabilities. These instabilities grow preferentially on the scale of the ion Larmor radius, and make Δp_e/p_{||,e} ≈ Δp_i/p_{||,i} (where Δp_j = p_{⊥,j} - p_{||,j}). We also quantify the thermal conduction of the plasma by directly calculating the mean free path of electrons, λ_e, along the mean magnetic field, finding that λ_e depends strongly on whether |B| decreases or increases. Our results can be applied in studies of low-collisionality plasmas such as the solar wind, the intracluster medium, and some accretion disks around black holes.
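As a quick illustration of the anisotropy bounds discussed, the commonly quoted long-wavelength fluid thresholds for the firehose and mirror instabilities can be checked numerically. These threshold forms are textbook estimates, not the oblique-firehose/whistler thresholds measured in the paper:

```python
# Approximate fluid instability thresholds for pressure anisotropy.
# Sign convention: anisotropy A = p_perp/p_par - 1.

def firehose_unstable(p_perp, p_par, beta_par):
    # long-wavelength firehose criterion: A < -2/beta_par (p_par excess)
    return (p_perp / p_par - 1.0) < -2.0 / beta_par

def mirror_unstable(p_perp, p_par, beta_perp):
    # mirror criterion: A > 1/beta_perp (p_perp excess)
    return (p_perp / p_par - 1.0) > 1.0 / beta_perp

# Example: beta ~ 10 plasma with a 25% parallel pressure excess,
# comparable to the beta range 2 <= beta_j <= 20 studied in the paper
print(firehose_unstable(p_perp=0.75, p_par=1.0, beta_par=10.0))
print(mirror_unstable(p_perp=1.0, p_par=1.0, beta_perp=10.0))
```

At higher beta the allowed anisotropy window narrows as 1/beta, which is why modest anisotropies suffice to trigger the instabilities in the simulated regime.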

  6. Dynamic load synthesis for shock numerical simulation in space structure design (United States)

    Monti, Riccardo; Gasbarri, Paolo


    Pyroshock loads are, from a mechanical point of view, the most stressing environments that space equipment experiences during its operating life. In general, the mechanical designer considers the pyroshock analysis as a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. By taking all the previous considerations into account, some preliminary information about the design correctness can be obtained by performing ad-hoc numerical simulations, for example via commercial finite element software (i.e. MSC Nastran). Usually these numerical tools face the shock solution in two ways: 1) a direct mode, by using a time-dependent enforcement and by evaluating the time-response and space-response as well as the internal forces; 2) a modal basis approach, by considering a frequency-dependent load and by evaluating internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement based on deterministic and/or genetic algorithm optimisers. In particular, starting from a specified spectrum in terms of SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, will be obtained to force the equipment by simulating the shock event. The synthesis time and the interface with standard numerical codes will be two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology will be presented to ensure that the identified time-dependent loads fully match the specified spectrum.
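The forward problem underlying such a synthesis loop is evaluating the SRS of a candidate acceleration profile with a sweep of base-excited single-degree-of-freedom oscillators; a synthesis tool then iterates (e.g. a sum of decaying sinusoids) until this SRS matches the specification. A minimal sketch of the SRS evaluation only, with an illustrative half-sine input and 5% damping:

```python
import numpy as np

def srs(accel, dt, freqs, zeta=0.05):
    """Shock Response Spectrum: peak absolute acceleration of a base-excited
    SDOF oscillator, swept over natural frequencies (semi-implicit Euler)."""
    peaks = []
    for fn in freqs:
        wn = 2 * np.pi * fn
        x, xd, peak = 0.0, 0.0, 0.0       # relative displacement / velocity
        for a in accel:                   # x'' + 2*zeta*wn*x' + wn^2*x = -a
            xdd = -a - 2 * zeta * wn * xd - wn**2 * x
            xd += xdd * dt
            x += xd * dt
            abs_acc = -(2 * zeta * wn * xd + wn**2 * x)  # absolute response
            peak = max(peak, abs(abs_acc))
        peaks.append(peak)
    return np.array(peaks)

dt = 1e-5
t = np.arange(0, 0.01, dt)
pulse = 100.0 * np.sin(np.pi * t / 0.01)          # 100-unit half-sine, 10 ms
pulse = np.concatenate([pulse, np.zeros(5000)])   # window for residual ringing
freqs = [50.0, 100.0, 2000.0]
spec = srs(pulse, dt, freqs)
print(dict(zip(freqs, np.round(spec, 1))))
```

As expected, the SRS approaches the input peak at high natural frequencies and shows dynamic amplification near the pulse's characteristic frequency; a synthesis optimiser exploits exactly this mapping in reverse.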

  7. Visualizing Space Weather: The Planeterrella Auroral Simulator as a Heliophysics Public Outreach Tool (United States)

    Masongsong, E. V.; Lilensten, J.; Booth, M. J.; Suri, G.; Heflinger, T. G.; Angelopoulos, V.


    The NASA THEMIS and ARTEMIS satellite missions study "space weather," which describes the solar wind influence on Earth's protective magnetic shield, the magnetosphere. Space weather is important to study and predict because it can damage critical GPS and communications satellites, harm space travelers, and even disable our global electrical grid. The Planeterrella is an innovative heliophysics outreach demonstration, expanding public awareness of space weather by visualizing the sun-Earth connection up close and in-person. Using a glass vacuum chamber, two magnetized spheres and a 1 kV power supply, the device can simulate plasma configurations of the solar corona, solar wind, Van Allen radiation belts, and auroral ovals, all of which are observable only by satellites. This "aurora in a bottle" is a modernized version of the original Terrella built by Kristian Birkeland in the 1890s to show that the aurora are electrical in nature. Adapted from plans by Lilensten et al. at CNRS-IPAG, the UCLA Planeterrella was completed in Nov. 2013, the second device of its kind in the U.S., and the centerpiece of the THEMIS/ARTEMIS mobile public outreach exhibit. In combination with captivating posters, 3D magnetic field models, dazzling aurora videos and magnetosphere animations, the Planeterrella has already introduced over 1200 people to the electrical link between our sun and the planets. Most visitors had seen solar flare images in the news, however the Planeterrella experience enhanced their appreciation of the dynamic solar wind and its effects on Earth's invisible magnetic field. Most importantly, visitors young and old realized that magnets are not just cool toys or only for powering hybrid car motors and MRIs, they are a fundamental aspect of ongoing life on Earth and are key to the formation and evolution of planets, moons, and stars, extending far beyond our galaxy to other planetary systems throughout the universe.
Novel visualizations such as the Planeterrella can

  8. CFD-simulation of radiator for air cooling of microprocessors in a limited space

    Directory of Open Access Journals (Sweden)

    Trofimov V. E.


    Full Text Available One of the final stages of microprocessor development is heat testing. This procedure is performed on a special stand whose main element is the switching PCB, with one or more mounted microprocessor sockets, chipsets, interfaces, jumpers and other components which provide various modes of microprocessor operation. The temperature of the microprocessor housing is typically changed using a thermoelectric module. The cold surface of the module, with controlled temperature, is in direct thermal contact with the microprocessor housing designed for cooler installation. On the hot surface of the module a radiator is mounted. The radiator dissipates the cumulative heat flow from both the microprocessor and the module. High-density PCB layout, the requirement of free access to the jumpers and interfaces, and the presence of numerous sensors limit the space for radiator mounting and require the use of an extremely compact radiator, especially under air cooling conditions. One possible solution to this problem is to reduce the area of the radiator heat-transfer surfaces through a sharp increase in the heat transfer coefficient without increasing the air flow rate. To ensure such an increase on the heat-transfer surface, one or more dead-end cavities can be made in the surface, into which impinging air jets flow. CFD simulation of this type of radiator has been conducted. The thermal-aerodynamic characteristics and design recommendations for removing heat from microprocessors in a limited space have been determined.
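The design rationale above reduces to Newton's law of cooling: for a fixed heat load and allowable temperature rise, the required heat-transfer area scales as 1/h, so raising h via jet impingement shrinks the radiator. A back-of-envelope sketch with illustrative numbers, not values from the simulation:

```python
def required_area(q_watts, h, dT):
    # Newton's law of cooling: q = h * A * dT  ->  A = q / (h * dT)
    return q_watts / (h * dT)

q, dT = 150.0, 30.0                           # 150 W load, 30 K air-to-fin rise
a_plain = required_area(q, h=25.0, dT=dT)     # typical forced-air h, W/(m^2 K)
a_jet = required_area(q, h=150.0, dT=dT)      # assumed impingement-enhanced h
print(f"plain fins: {a_plain:.3f} m^2, impingement jets: {a_jet:.3f} m^2")
```

A several-fold increase in h buys the same heat rejection with a several-fold smaller surface, which is exactly what a cramped test-stand radiator needs.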

  9. Simulation and Control Lab Development for Power and Energy Management for NASA Manned Deep Space Missions (United States)

    McNelis, Anne M.; Beach, Raymond F.; Soeder, James F.; McNelis, Nancy B.; May, Ryan; Dever, Timothy P.; Trase, Larry


    The development of distributed hierarchical and agent-based control systems will allow for reliable autonomous energy management and power distribution for on-orbit missions. Power is one of the most critical systems on board a space vehicle, requiring quick response when a fault or emergency is identified. As NASA's missions with human presence extend beyond low Earth orbit, autonomous control of vehicle power systems will be necessary and will need to function reliably for long periods of time. In the design of autonomous electrical power control systems there is a need to dynamically simulate and verify EPS controller functionality prior to use on orbit. This paper presents work at NASA Glenn Research Center in Cleveland, Ohio, where development of a controls laboratory is being completed that will be utilized to demonstrate advanced prototype EPS controllers for space, aeronautical, and terrestrial applications. The control laboratory hardware and software, and the application of an autonomous controller for demonstration with the ISS electrical power system, are the subject of this paper.


    Energy Technology Data Exchange (ETDEWEB)

    Wu, Benjamin [Department of Physics, University of Florida, Gainesville, FL 32611 (United States); Loo, Sven Van [School of Physics and Astronomy, University of Leeds, Leeds LS2 9JT (United Kingdom); Tan, Jonathan C. [Departments of Astronomy and Physics, University of Florida, Gainesville, FL 32611 (United States); Bruderer, Simon, E-mail: [Max Planck Institute for Extraterrestrial Physics, Giessenbachstrasse 1, D-85748 Garching (Germany)


    We utilize magnetohydrodynamic (MHD) simulations to develop a numerical model for giant molecular cloud (GMC)–GMC collisions between nearly magnetically critical clouds. The goal is to determine if, and under what circumstances, cloud collisions can cause pre-existing magnetically subcritical clumps to become supercritical and undergo gravitational collapse. We first develop and implement new photodissociation-region-based heating and cooling functions that span the atomic-to-molecular transition, creating a multiphase ISM and allowing modeling of non-equilibrium temperature structures. Then, in 2D and with ideal MHD, we explore a wide parameter space of magnetic field strength, magnetic field geometry, collision velocity, and impact parameter, and compare isolated versus colliding clouds. We find factors of ∼2–3 increase in mean clump density from typical collisions, with strong dependence on collision velocity and magnetic field strength, but ultimately limited by flux-freezing in 2D geometries. For geometries enabling flow along magnetic field lines, greater degrees of collapse are seen. We discuss observational diagnostics of cloud collisions, focusing on ¹³CO(J = 2–1), ¹³CO(J = 3–2), and ¹²CO(J = 8–7) integrated intensity maps and spectra, which we synthesize from our simulation outputs. We find that the ratio of J = 8–7 to lower-J emission is a powerful diagnostic probe of GMC collisions.
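
    The sub/supercritical distinction above is commonly quantified by the normalized mass-to-flux ratio λ = 2π√G·(M/Φ), where λ < 1 means the magnetic field can support the clump against self-gravity. A minimal sketch with assumed, illustrative clump parameters (not values from the paper):

```python
import math

G_CGS = 6.674e-8            # gravitational constant, cm^3 g^-1 s^-2
MSUN = 1.989e33             # solar mass, g
PC = 3.086e18               # parsec, cm

def mass_to_flux_ratio(mass_g, b_gauss, radius_cm):
    """Normalized mass-to-flux ratio lambda = 2*pi*sqrt(G) * M / Phi.
    lambda < 1: magnetically subcritical; lambda > 1: supercritical."""
    flux = b_gauss * math.pi * radius_cm**2   # flux threading the clump cross-section
    return 2.0 * math.pi * math.sqrt(G_CGS) * mass_g / flux

# Illustrative clump (assumed): 1000 Msun, 1 pc radius, 100 microgauss field
lam = mass_to_flux_ratio(1.0e3 * MSUN, 1.0e-4, 1.0 * PC)
print(f"lambda = {lam:.2f}")
```

    With these assumed numbers the clump comes out marginally supercritical (λ ≈ 1.1); a modestly stronger field or lower column would make it subcritical.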

  11. 2D full-wave simulation of waves in space and tokamak plasmas

    Directory of Open Access Journals (Sweden)

    Kim Eun-Hwa


    Full Text Available Simulation results using a 2D full-wave code (FW2D) for space and NSTX fusion plasmas are presented. The FW2D code solves the cold plasma wave equations using the finite element method. The wave code has been successfully applied to describe low frequency waves in planetary magnetospheres (i.e., dipole geometry) and the results include generation and propagation of externally driven ultra-low frequency waves via mode conversion at Mercury and mode coupling, refraction and reflection of internally driven field-aligned propagating left-handed electromagnetic ion cyclotron (EMIC) waves at Earth. In this paper, the global structure of linearly polarized EMIC waves is examined and the result shows that such resonant wave modes can be localized near the equatorial plane. We also adapt the FW2D code to tokamak geometry and examine radio frequency (RF) waves in the scrape-off layer (SOL) of tokamaks. By adopting rectangular and limiter boundaries, we compare the results with existing AORSA simulations. The FW2D code results for the high harmonic fast wave heating case on NSTX with a rectangular vessel boundary show excellent agreement with the AORSA code.

  12. 2D full-wave simulation of waves in space and tokamak plasmas (United States)

    Kim, Eun-Hwa; Bertelli, Nicola; Johnson, Jay; Valeo, Ernest; Hosea, Joel


    Simulation results using a 2D full-wave code (FW2D) for space and NSTX fusion plasmas are presented. The FW2D code solves the cold plasma wave equations using the finite element method. The wave code has been successfully applied to describe low frequency waves in planetary magnetospheres (i.e., dipole geometry) and the results include generation and propagation of externally driven ultra-low frequency waves via mode conversion at Mercury and mode coupling, refraction and reflection of internally driven field-aligned propagating left-handed electromagnetic ion cyclotron (EMIC) waves at Earth. In this paper, the global structure of linearly polarized EMIC waves is examined and the result shows that such resonant wave modes can be localized near the equatorial plane. We also adapt the FW2D code to tokamak geometry and examine radio frequency (RF) waves in the scrape-off layer (SOL) of tokamaks. By adopting rectangular and limiter boundaries, we compare the results with existing AORSA simulations. The FW2D code results for the high harmonic fast wave heating case on NSTX with a rectangular vessel boundary show excellent agreement with the AORSA code.
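
    As a minimal sketch of the cold-plasma physics such codes solve (this is not the FW2D implementation), the dielectric response entering the cold plasma wave equations is summarized by the Stix parameters S, D, and P. The field strength, wave frequency, and density below are assumed, NSTX-like illustrative values:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m

def stix_sdp(freq_hz, b_tesla, species):
    """Cold-plasma Stix parameters S, D, P.
    species: iterable of (signed charge [C], mass [kg], density [m^-3])."""
    w = 2.0 * math.pi * freq_hz
    S, D, P = 1.0, 0.0, 1.0
    for q, m, n in species:
        wp2 = n * q * q / (EPS0 * m)      # plasma frequency squared
        wc = q * b_tesla / m              # signed cyclotron frequency
        S -= wp2 / (w * w - wc * wc)
        D += (wc / w) * wp2 / (w * w - wc * wc)
        P -= wp2 / (w * w)
    return S, D, P

# Assumed electron-proton plasma: 30 MHz wave, 0.5 T field, 1e19 m^-3 density
species = [(-1.602e-19, 9.109e-31, 1.0e19),   # electrons
           (+1.602e-19, 1.673e-27, 1.0e19)]   # protons
S, D, P = stix_sdp(30.0e6, 0.5, species)
```

    A full-wave code assembles these parameters into the wave operator and solves it (here with finite elements) over the geometry; above they are only evaluated pointwise.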

  13. A simulation based optimization approach to model and design life support systems for manned space missions (United States)

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short- and long-term spaceflights by providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (the products of the system) and must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainty in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been developed to help make design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines stochastic discrete-event simulation and deterministic mathematical programming to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time-series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the basic life-support element amounts necessary to support crew life and activities for the mission duration.

  14. Numeric simulations of en-masse space closure with sliding mechanics. (United States)

    Kojima, Yukio; Fukui, Hisao


    En-masse sliding mechanics have typically been used for space closure. Because of friction created at the bracket-wire interface, the force system during tooth movement has not been clarified. Long-term tooth movements in en-masse sliding mechanics were simulated with the finite element method. Tipping of the anterior teeth occurred immediately after application of retraction forces. The force system then changed so that the teeth moved almost bodily, and friction occurred at the bracket-wire interface. The net force transferred to the anterior teeth was approximately one-fourth of the applied force. The amount of the mesial force acting on the posterior teeth was the same as that acting on the anterior teeth. Irrespective of the amount of friction, the ratio of movement distances between the posterior and anterior teeth was almost the same. By increasing the applied force or decreasing the frictional coefficient, the teeth moved rapidly, but the tipping angle of the anterior teeth increased because of the elastic deflection of the archwire. Finite element simulation clarified the tooth movement and the force system in en-masse sliding mechanics. Long-term tooth movement could not be predicted from the initial force system. The friction was not detrimental to the anchorage. Increasing the applied force or decreasing the friction for rapid tooth movement might result in tipping of the teeth. Copyright © 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  15. A Steam Jet Plume Simulation in a Large Bulk Space with a System Code MARS

    International Nuclear Information System (INIS)

    Bae, Sung Won; Chung, Bub Dong


    In May 2002, the OECD-SETH group launched the PANDA Project in order to provide an experimental database for multi-dimensional code assessment. The OECD-SETH group expects the PANDA Project to meet the increasing need for adequate experimental data on the 3D distribution of relevant variables, such as temperature, velocity, and steam-air concentrations, measured with sufficient resolution and accuracy. The scope of the PANDA Project is the mixture stratification and mixing phenomena in a large bulk space. A total of 24 test series are being performed at PSI, Switzerland. The PANDA facility consists of 2 main large vessels and 1 connection pipe. Within the large vessels, a steam injection nozzle and outlet vent are arranged for each test case. These tests are categorized into 3 modes, i.e. the high momentum, near wall plume, and free plume tests. KAERI has also participated in the SETH group since 1997 so that the multi-dimensional capability of the MARS code could be assessed and developed. Test 17, the high steam jet injection test, has already been simulated by MARS and shows promising results. Now the test 9 and 9bis cases, which use a low speed horizontal steam jet flow, have been simulated and investigated.

  16. Femtosecond laser irradiation of olivine single crystals: Experimental simulation of space weathering (United States)

    Fazio, A.; Harries, D.; Matthäus, G.; Mutschke, H.; Nolte, S.; Langenhorst, F.


    Space weathering is one of the most common surface processes occurring on atmosphere-free bodies such as asteroids and the Moon. It is caused mainly by solar wind irradiation and the impact of micrometeoroids. In order to simulate space weathering effects, in particular those produced by hypervelocity impacts, we produced microcraters via ultra-short (∼100 fs) laser irradiation of crystallographically oriented slices of forsterite-rich (Fo94.7) olivine. The main advantages of applying femtosecond laser radiation to reproduce space weathering effects are (1) the high peak irradiance (10¹⁵ W cm⁻²), which generates propagation of the shock wave on the nanosecond timescale (i.e., the timescale of micrometeoroid impacts); (2) the rapid transfer of energy to the target material, which avoids the interaction of laser light with the developing vapor plume; (3) a small laser beam, which allows the effects of a single impact to be simulated. The results of our spectroscopic and electron microscopic investigations validate this approach: the samples show strong darkening and reddening of the reflectance spectra and structural damage similar to the natural microcraters found on regolith grains of the Moon and asteroid 25143 Itokawa. Detailed investigations of several microcrater cross-sections by transmission electron microscopy allowed the detection of shock-induced defect microstructures. From the top to the bottom of the grain, the shock wave causes evaporation, melting, solid-state recrystallization, misorientation, fracturing, and the propagation of dislocations with Burgers vectors parallel to [001]. The formation of a short-lived vapor plume causes kinetic fractionation of the gas and the preferential loss of lighter elements, mostly magnesium and oxygen. The high temperatures within the melt layer and the kinetic loss of oxygen promote the thermal reduction of iron and nickel, which leads to the formation of metallic nanoparticles (npFe⁰). The
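
    The quoted peak irradiance follows directly from pulse energy, pulse duration, and focal spot size, I ≈ E/(τ·πr²). The pulse energy and spot radius below are assumed for illustration; only the ~100 fs duration and the 10¹⁵ W cm⁻² target come from the abstract:

```python
import math

def peak_irradiance(pulse_energy_j, pulse_duration_s, spot_radius_cm):
    """Peak irradiance in W/cm^2, assuming a flat-top pulse and spot."""
    spot_area = math.pi * spot_radius_cm**2
    return pulse_energy_j / (pulse_duration_s * spot_area)

# Assumed values: 10 uJ pulse, 100 fs duration, ~1.8 um spot radius
i_peak = peak_irradiance(10e-6, 100e-15, 1.8e-4)
print(f"peak irradiance ~ {i_peak:.1e} W/cm^2")
```

    With these assumptions the irradiance lands near the quoted 10¹⁵ W cm⁻², showing why micron-scale focusing of microjoule femtosecond pulses reaches hypervelocity-impact conditions.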

  17. Extinction effects of atmospheric compositions on return signals of space-based lidar from numerical simulation (United States)

    Yao, Lilin; Wang, Fu; Min, Min; Zhang, Ying; Guo, Jianping; Yu, Xiao; Chen, Binglong; Zhao, Yiming; Wang, Lidong


    The extinction effect of atmospheric composition on return signals of space-based lidar remains incompletely understood, especially around the 355 nm and 2051 nm channels. Here we simulated the extinction effects of atmospheric gases (e.g., H2O, CO2, and O3) and six types of aerosols (clean continental, clean marine, dust, polluted continental, polluted dust, and smoke) on return signals of a space-based lidar system at the 355 nm, 532 nm, 1064 nm, and 2051 nm channels, based on a robust lidar return signal simulator combined with a radiative transfer model (LBLRTM). Results show significant Rayleigh (molecular) scattering effects in the return signals at the 355 nm and 532 nm channels, which markedly decay with increasing wavelength. The spectral transmittance of CO2 is nearly 0, yet the transmittance of H2O is approximately 100% at 2051 nm, which verifies that the 2051 nm channel is suitable for CO2 retrieval. The spectral transmittance also reveals another possible window for CO2 and H2O detection at 2051.6 nm, since their transmittances are both near 0.5. Moreover, the corresponding Doppler return signals at the 2051.6 nm channel can be used to retrieve the wind field. Thus we suggest that the 2051 nm channel be centered at 2051.6 nm. Using a threshold on the signal-to-noise ratio (SNR) of the return signals, the detection ranges for three representative distribution scenarios of the six aerosol types at the four typical lidar channels are determined. The results clearly show that high SNR values are seen ubiquitously in the atmosphere from the aerosol layer top up to 25 km at 355 nm, and can be found at 2051.6 nm in the lower troposphere, depending strongly on the vertical aerosol distribution scenario. This indicates that a Doppler space-based lidar system with a double-channel joint detection mode is able to retrieve the atmospheric wind field or profile from 0 to 25 km.
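
    The extinction effects discussed above enter the elastic lidar equation through the two-way Beer-Lambert transmittance, P(R) ∝ β·T²(R)/R². A minimal sketch with assumed, illustrative extinction coefficients (not values from the paper):

```python
import math

def two_way_transmittance(extinction_per_km, path_km):
    """Beer-Lambert two-way transmittance for a uniform extinction layer."""
    return math.exp(-2.0 * extinction_per_km * path_km)

def relative_return(backscatter, extinction_per_km, range_km):
    """Shape of the elastic lidar equation, P(R) ~ beta * T^2 / R^2
    (system constants and overlap function dropped)."""
    return backscatter * two_way_transmittance(extinction_per_km, range_km) / range_km**2

# Illustrative (assumed) extinction values: clean vs dust-laden air at 5 km range
clean = relative_return(1.0, 0.05, 5.0)
dusty = relative_return(1.0, 0.5, 5.0)
```

    The exponential two-way attenuation is why the detection range depends so strongly on the aerosol distribution scenario: a ten-fold higher extinction cuts the relative return by nearly two orders of magnitude at this range.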

  18. Real-space grids and the Octopus code as tools for the development of new simulation approaches for electronic systems (United States)

    Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel

    Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage at the time of implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
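
    As a minimal illustration of the real-space approach (this is not the Octopus code), the 1D Schrödinger equation for a particle in a box can be discretized with a 3-point finite-difference Laplacian and diagonalized; the grid eigenvalues approach the exact spectrum E_n = n²π²/2 (ħ = m = 1) as the grid is refined:

```python
import numpy as np

# Particle in a 1D box of width L = 1 (hbar = m = 1) on a real-space grid
n, L = 200, 1.0
h = L / (n + 1)                       # grid spacing; wavefunction = 0 at the walls

# Hamiltonian H = -0.5 * Laplacian with the 3-point finite-difference stencil
diag = np.full(n, 1.0 / h**2)         # -0.5 * (-2 / h^2) on the diagonal
off = np.full(n - 1, -0.5 / h**2)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

evals = np.linalg.eigvalsh(H)
exact = 0.5 * np.pi**2 * np.arange(1, 4) ** 2   # E_n = n^2 pi^2 / 2
```

    All fields live on the same grid, so adding a potential is just another diagonal term; this locality is what makes the approach simple to extend and to parallelize by domain decomposition.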

  19. Can crawl space temperature and moisture conditions be calculated with a whole-building hygrothermal simulation tool?

    DEFF Research Database (Denmark)

    Vanhoutteghem, Lies; Morelli, Martin; Sørensen, Lars Schiøtt


    The hygrothermal behaviour of an outdoor ventilated crawl space with two different designs of the floor structure was investigated. The first design had 250 mm insulation and visible wooden beams towards the crawl space. The second design had 300 mm insulation and no visible wooden beams. One year of measurements was compared with simulations of temperature and moisture conditions in the floor structure and crawl space. The measurements showed that the extra 50 mm insulation placed below the beams reduced moisture content in the beams below 20 weight% all year. A reasonable agreement between the measurements and simulations was found; however, the evaporation from the soil was a dominant parameter affecting the hygrothermal response in the crawl space and floor structure.

  20. Evaluation of SPACE code for simulation of inadvertent opening of spray valve in Shin Kori unit 1

    International Nuclear Information System (INIS)

    Kim, Seyun; Youn, Bumsoo


    SPACE code is expected to be applied to the safety analysis for LOCA (Loss of Coolant Accident) and non-LOCA scenarios. SPACE code solves two-fluid, three-field governing equations and is programmed in the C++ computer language using object-oriented concepts. To evaluate its analysis capability for transient phenomena in an actual nuclear power plant, an inadvertent opening of a spray valve during the startup test phase of Shin Kori unit 1 was simulated with the SPACE code.

  1. Ground-based simulation of telepresence for materials science experiments. [remote viewing and control of processes aboard Space Station (United States)

    Johnston, James C.; Rosenthal, Bruce N.; Bonner, Mary JO; Hahn, Richard C.; Herbach, Bruce


    A series of ground-based telepresence experiments has been performed to determine the minimum video frame rate and resolution required for the successful performance of materials science experiments in space. The approach used is to simulate transmission between Earth and the space station with transmission between laboratories on Earth. The experiments include isothermal dendrite growth, physical vapor transport, and glass melting. Modifications of existing apparatus, the software developed, and the establishment of an in-house network are reviewed.

  2. Generation of initial kinetic distributions for simulation of long-pulse charged particle beams with high space-charge intensity

    Directory of Open Access Journals (Sweden)

    Steven M. Lund


    Full Text Available Self-consistent Vlasov-Poisson simulations of beams with high space-charge intensity often require specification of initial phase-space distributions that reflect properties of a beam that is well adapted to the transport channel—both in terms of low-order rms (envelope properties as well as the higher-order phase-space structure. Here, we first review broad classes of kinetic distributions commonly in use as initial Vlasov distributions in simulations of unbunched or weakly bunched beams with intense space-charge fields including the following: the Kapchinskij-Vladimirskij (KV equilibrium, continuous-focusing equilibria with specific detailed examples, and various nonequilibrium distributions, such as the semi-Gaussian distribution and distributions formed from specified functions of linear-field Courant-Snyder invariants. Important practical details necessary to specify these distributions in terms of standard accelerator inputs are presented in a unified format. Building on this presentation, a new class of approximate initial kinetic distributions are constructed using transformations that preserve linear focusing, single-particle Courant-Snyder invariants to map initial continuous-focusing equilibrium distributions to a form more appropriate for noncontinuous focusing channels. Self-consistent particle-in-cell simulations are employed to show that the approximate initial distributions generated in this manner are better adapted to the focusing channels for beams with high space-charge intensity. This improved capability enables simulations that more precisely probe intrinsic stability properties and machine performance.
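
    The KV distribution reviewed above is uniform on the surface of a 4D phase-space ellipsoid defined by Courant-Snyder invariants, so it can be sampled by projecting 4D Gaussian draws onto the unit 3-sphere and scaling by the Twiss parameters. A sketch assuming upright phase-space ellipses (α = 0):

```python
import numpy as np

rng = np.random.default_rng(1)

def kv_sample(n, eps_x, beta_x, eps_y, beta_y):
    """Sample n particles from the KV distribution: points uniform on the
    surface of the 4D phase-space ellipsoid. Upright ellipses (alpha = 0)
    are assumed for simplicity."""
    v = rng.normal(size=(n, 4))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform on the unit 3-sphere
    x = v[:, 0] * np.sqrt(eps_x * beta_x)
    xp = v[:, 1] * np.sqrt(eps_x / beta_x)
    y = v[:, 2] * np.sqrt(eps_y * beta_y)
    yp = v[:, 3] * np.sqrt(eps_y / beta_y)
    return x, xp, y, yp

x, xp, y, yp = kv_sample(200_000, 1.0, 2.0, 1.0, 2.0)

# Every particle lies exactly on the invariant surface (here eps = 1, beta = 2):
invariant = x**2 / 2.0 + 2.0 * xp**2 + y**2 / 2.0 + 2.0 * yp**2
```

    Each sampled particle satisfies the Courant-Snyder invariant exactly, and the projected rms emittance comes out as ε/4, as expected for the KV surface.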

  3. The Space-Time Conservative Schemes for Large-Scale, Time-Accurate Flow Simulations with Tetrahedral Meshes (United States)

    Venkatachari, Balaji Shankar; Streett, Craig L.; Chang, Chau-Lyan; Friedlander, David J.; Wang, Xiao-Yen; Chang, Sin-Chung


    Despite decades of development of unstructured mesh methods, high-fidelity time-accurate simulations are still predominantly carried out on structured or unstructured hexahedral meshes by using high-order finite-difference, weighted essentially non-oscillatory (WENO), or hybrid schemes formed by their combinations. In this work, the space-time conservation element solution element (CESE) method is used to simulate several flow problems including supersonic jet/shock interaction and its impact on launch vehicle acoustics, and direct numerical simulations of turbulent flows using tetrahedral meshes. This paper provides a status report for the continuing development of the space-time conservation element solution element (CESE) numerical and software framework under the Revolutionary Computational Aerosciences (RCA) project. Solution accuracy and large-scale parallel performance of the numerical framework are assessed with the goal of providing a viable paradigm for future high-fidelity flow physics simulations.

  4. Space Suit Simulator (S3) for Partial Gravity EVA Experimentation and Training, Phase II (United States)

    National Aeronautics and Space Administration — Pressurized space suits impose high joint torques on the wearer, reducing mobility for upper and lower body motions. Using actual space suits in training or...

  5. Molecular Dynamic Simulation of High Thermal Conductivity Synthetic Spider Silk for Thermal Management in Space (United States)

    National Aeronautics and Space Administration — Thermal management is crucial to space technology. Because electronic and other thermally sensitive materials will be located in an essentially airless environment,...

  6. Observation and simulation of space-charge effects in a radio-frequency photoinjector using a transverse multibeamlet distribution

    Directory of Open Access Journals (Sweden)

    M. Rihaoui


    Full Text Available We report on an experimental study of space-charge effects in a radio-frequency (rf) photoinjector. A 5 MeV electron bunch, consisting of a number of beamlets separated transversely, was generated in an rf photocathode gun and propagated in the succeeding drift space. The collective interaction of these beamlets was studied for different experimental conditions. The experiment allowed the exploration of space-charge effects and their comparison with 3D particle-in-cell simulations. Our observations also suggest the possible use of a multibeam configuration to tailor the transverse distribution of an electron beam.

  7. Networked simulation for team training of Space Station astronauts, ground controllers, and scientists - A training and development environment (United States)

    Hajare, Ankur R.; Wick, Daniel T.; Bovenzi, James J.


    The purpose of this paper is to describe plans for the Space Station Training Facility (SSTF), which has been designed to meet the envisioned training needs for Space Station Freedom. To meet these needs, the SSTF will integrate networked simulators with real-world systems in five training modes: Stand-Alone, Combined, Joint-Combined, Integrated, and Joint-Integrated. This paper describes the five training modes within the context of three training scenarios. In addition, this paper describes an authoring system which will support the rapid integration of new real-world system changes in the Space Station Freedom Program.

  8. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation (United States)

    Alton, G. D.; Bilheux, H.


    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects.
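
    The elementary analytical theory referred to is the Child-Langmuir law for a planar diode, j = (4ε₀/9)·√(2q/m)·V^(3/2)/d², from which the optimum perveance P = I/V^(3/2) follows. A sketch with assumed, ECR-typical numbers:

```python
import math

EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CH = 1.602e-19      # elementary charge, C
AMU = 1.661e-27       # atomic mass unit, kg

def child_langmuir_j(v_volts, gap_m, charge_state, mass_amu):
    """Space-charge-limited current density (A/m^2) for a planar diode."""
    q_over_m = charge_state * E_CH / (mass_amu * AMU)
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * q_over_m) * v_volts**1.5 / gap_m**2

# Assumed example: Ar8+ extracted at 20 kV across a 30 mm gap
j_opt = child_langmuir_j(20.0e3, 0.030, 8, 40.0)
print(f"j = {j_opt:.0f} A/m^2 ({j_opt / 10.0:.1f} mA/cm^2)")
```

    The √(q/m) factor is why the optimum current density and gap depend on charge state: for fixed voltage and gap, higher charge states support a higher space-charge-limited current density.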

  9. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation

    International Nuclear Information System (INIS)

    Alton, G.D.; Bilheux, H.


    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects.

  10. Space Weathering of Super-Earths: Model Simulations of Exospheric Sodium Escape from 61 Virgo b

    Energy Technology Data Exchange (ETDEWEB)

    Yoneda, M.; Berdyugina, S.; Kuhn, J. [Kiepenheuer Institute for Solar Physics, Schöneckstraße 6, 79104 Freiburg im Breisgau (Germany)


    Rocky exoplanets are expected to be eroded by space weather in a similar way as in the solar system. In particular, Mercury is one of the dramatically eroded planets whose material continuously escapes into its exosphere and further into space. This escape is well traced by sodium atoms scattering sunlight. Due to solar wind impact, micrometeorite impacts, photo-stimulated desorption and thermal desorption, sodium atoms are released from the surface regolith. Some of these released sodium atoms escape from Mercury's gravitational sphere. They are dragged anti-Sun-ward and form a tail structure. We expect similar phenomena on exoplanets. The hot super-Earth 61 Vir b orbiting a G3V star at only 0.05 au may show a similar structure. Because of its small separation from the star, the sodium release mechanisms may be working more efficiently on hot super-Earths than on Mercury, although the strong gravitational force of Earth-sized or even more massive planets may be keeping sodium atoms from escaping from the planet. Here, we performed model simulations for Mercury (to verify our model) and 61 Vir b as a representative super-Earth. We have found that sodium atoms can escape from this exoplanet due to stellar wind sputtering and micrometeorite impacts, to form a sodium tail. However, in contrast to Mercury, the tail on this hot super-Earth is strongly aligned with the anti-starward direction because of higher light pressure. Our model suggests that 61 Vir b seems to have an exo-base atmosphere like that of Mercury.
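
    The competition between release and gravity described above can be bounded by the escape speed, v_esc = √(2GM/R). A back-of-the-envelope comparison; the super-Earth mass and radius below are assumed illustrative values, not parameters adopted in the paper:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # kg
R_EARTH = 6.371e6      # m

def v_escape(mass_kg, radius_m):
    """Surface escape speed in m/s."""
    return math.sqrt(2.0 * G * mass_kg / radius_m)

v_mercury = v_escape(3.301e23, 2.4397e6)                 # Mercury, ~4.25 km/s
v_superearth = v_escape(5.1 * M_EARTH, 1.6 * R_EARTH)    # assumed super-Earth values
```

    In this toy comparison the super-Earth's escape speed is several times Mercury's, which is why only the energetic release channels (stellar wind sputtering, micrometeorite impacts) allow sodium to escape and feed a tail.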

  11. Space Weathering of Super-Earths: Model Simulations of Exospheric Sodium Escape from 61 Virgo b

    International Nuclear Information System (INIS)

    Yoneda, M.; Berdyugina, S.; Kuhn, J.


    Rocky exoplanets are expected to be eroded by space weather in a similar way as in the solar system. In particular, Mercury is one of the dramatically eroded planets whose material continuously escapes into its exosphere and further into space. This escape is well traced by sodium atoms scattering sunlight. Due to solar wind impact, micrometeorite impacts, photo-stimulated desorption and thermal desorption, sodium atoms are released from the surface regolith. Some of these released sodium atoms escape from Mercury's gravitational sphere. They are dragged anti-Sun-ward and form a tail structure. We expect similar phenomena on exoplanets. The hot super-Earth 61 Vir b orbiting a G3V star at only 0.05 au may show a similar structure. Because of its small separation from the star, the sodium release mechanisms may be working more efficiently on hot super-Earths than on Mercury, although the strong gravitational force of Earth-sized or even more massive planets may be keeping sodium atoms from escaping from the planet. Here, we performed model simulations for Mercury (to verify our model) and 61 Vir b as a representative super-Earth. We have found that sodium atoms can escape from this exoplanet due to stellar wind sputtering and micrometeorite impacts, to form a sodium tail. However, in contrast to Mercury, the tail on this hot super-Earth is strongly aligned with the anti-starward direction because of higher light pressure. Our model suggests that 61 Vir b seems to have an exo-base atmosphere like that of Mercury.

  12. DynMo: Dynamic Simulation Model for Space Reactor Power Systems

    International Nuclear Information System (INIS)

    El-Genk, Mohamed; Tournier, Jean-Michel


    A Dynamic simulation Model (DynMo) for space reactor power systems is developed using the SIMULINK® platform. DynMo is modular and can be applied to power systems with different types of reactors, energy conversion, and heat pipe radiators. This paper presents a general description of DynMo-TE for a space power system powered by a Sectored Compact Reactor (SCoRe) that employs off-the-shelf SiGe thermoelectric converters. SCoRe is liquid-metal cooled and designed for avoidance of a single-point failure. The reactor core is divided into six equal sectors that are neutronically, but not thermal-hydraulically, coupled. To avoid a single-point failure in the power system, each reactor sector has its own primary and secondary loops, and each loop is equipped with an electromagnetic (EM) pump. A Power Conversion Assembly (PCA) and a Thermoelectric Conversion Assembly (TCA) of the primary and secondary EM pumps thermally couple each pair of primary and secondary loops. The secondary loop transports the heat rejected by the PCA and the pumps' TCAs to a rubidium heat-pipe radiator panel. The primary loops transport the thermal power from the reactor sectors to the PCAs, supplying a total of 145-152 kWe to the load at 441-452 VDC, depending on the selection of the primary and secondary liquid-metal coolants. The primary and secondary coolant combinations investigated are lithium (Li)/Li, Li/sodium (Na), Na/Na, Li/NaK-78 and Na/NaK-78, for which the reactor exit temperature is kept below 1250 K. The results of a startup transient of the system from an initial temperature of 500 K are compared and discussed.
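    The loop heat transport described above obeys the basic energy balance Q = ṁ·cp·ΔT. The sketch below applies it with purely hypothetical sector numbers (70 kWt per sector, a 100 K coolant temperature rise, a lithium-like specific heat); none of these figures come from the DynMo paper.

```python
def mass_flow(q_thermal_w, cp_j_per_kg_k, delta_t_k):
    # Coolant-loop energy balance: Q = m_dot * cp * dT, solved for m_dot
    return q_thermal_w / (cp_j_per_kg_k * delta_t_k)

# HYPOTHETICAL sector values: ~70 kWt per sector, lithium cp ~ 4200 J/(kg K),
# 100 K coolant temperature rise across the sector.
m_dot = mass_flow(70e3, 4200.0, 100.0)
print(f"required coolant flow per sector: {m_dot:.3f} kg/s")
```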

  13. Keeping it real: revisiting a real-space approach to running ensembles of cosmological N-body simulations

    International Nuclear Information System (INIS)

    Orban, Chris


    In setting up initial conditions for ensembles of cosmological N-body simulations there are, fundamentally, two choices: either maximizing the correspondence of the initial density field to the assumed Fourier-space clustering or, instead, matching to real-space statistics and allowing the DC mode (i.e. overdensity) to vary from box to box as it would in the real universe. As a stringent test of both approaches, I perform ensembles of simulations using a power law and a ''power law times a bump'' model inspired by baryon acoustic oscillations (BAO), exploiting the self-similarity of these initial conditions to quantify the accuracy of the matter-matter two-point correlation results. The real-space method, which was originally proposed by Pen 1997 [1] and implemented by Sirko 2005 [2], performed well in producing the expected self-similar behavior and corroborated the non-linear evolution of the BAO feature observed in conventional simulations, even in the strongly-clustered regime (σ₈ ≳ 1). In revisiting the real-space method championed by [2], it was also noticed that this earlier study overlooked an important integral constraint correction to the correlation function in results from the conventional approach that can be important in ΛCDM simulations with L_box ∼ h⁻¹ Gpc and on scales r ≳ L_box/10. Rectifying this issue shows that the Fourier-space and real-space methods are about equally accurate and efficient for modeling the evolution and growth of the correlation function, contrary to previous claims. An appendix provides a useful independent-of-epoch analytic formula for estimating the importance of the integral constraint bias on correlation function measurements in ΛCDM simulations.
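    The integral-constraint bias mentioned above can be illustrated with a toy model: a correlation function measured in a finite volume is offset by roughly the volume average of ξ. The sketch below uses an assumed power-law ξ(r) = (r₀/r)^1.8 and an arbitrary averaging radius; it is not the paper's analytic formula, just a numerical demonstration that the offset is non-negligible on large scales.

```python
def integral_constraint(xi, r_max, n=20000):
    # Volume average of xi over a sphere of radius r_max:
    #   IC = (3 / r_max^3) * Integral_0^r_max xi(r) r^2 dr   (rectangle rule)
    dr = r_max / n
    total = 0.0
    for i in range(1, n + 1):
        r = i * dr
        total += xi(r) * r * r * dr
    return 3.0 * total / r_max ** 3

# Toy power-law correlation function with r0 = 5 and an averaging radius of 50
# (think h^-1 Mpc; both values chosen for illustration only)
xi = lambda r: (5.0 / r) ** 1.8
ic = integral_constraint(xi, 50.0)
print(f"volume-averaged offset: {ic:.4f}")
print(f"xi(10) for comparison:  {xi(10.0):.4f}")
```

    The offset is a sizeable fraction of ξ itself at radii approaching a tenth of the averaging scale, which is why the correction matters on scales r ≳ L_box/10.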

  14. Dynamical electron diffraction simulation for non-orthogonal crystal system by a revised real space method. (United States)

    Lv, C L; Liu, Q B; Cai, C Y; Huang, J; Zhou, G W; Wang, Y G


    In transmission electron microscopy, the revised real space (RRS) method has been confirmed to be a more accurate dynamical electron diffraction simulation method for low-energy electron diffraction than the conventional multislice method (CMS). However, the RRS method could previously be used only to calculate the dynamical electron diffraction of orthogonal crystal systems. In this work, the expression of the RRS method for non-orthogonal crystal systems is derived. Taking Na₂Ti₃O₇ and Si as examples, the correctness of the derived RRS formula for non-orthogonal crystal systems is confirmed by testing the coincidence of the numerical results of both sides of the Schrödinger equation; moreover, the difference between the RRS method and the CMS for non-orthogonal crystal systems is compared over the accelerating-voltage range from 40 down to 10 kV. Our results show that the CMS method is almost the same as the RRS method for accelerating voltages above 40 kV. However, when the accelerating voltage is lowered to 20 kV or below, the CMS method introduces significant errors, not only for the higher-order Laue zone diffractions but also for the zero-order Laue zone. This indicates that the RRS method for non-orthogonal crystal systems should be used for more accurate dynamical simulation when the accelerating voltage is low. Furthermore, the reason for the increase in the differences between the diffraction patterns calculated by the RRS method and the CMS method as the accelerating voltage decreases is discussed. © 2015 The Authors. Journal of Microscopy © 2015 Royal Microscopical Society.
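    For context, the conventional multislice (CMS) step that the RRS method is compared against alternates a phase-object transmission with Fresnel propagation in Fourier space. The 1-D sketch below (a naive DFT standing in for an FFT, with illustrative beam parameters) shows that structure; it is not the RRS algorithm itself. A useful sanity check is that both sub-steps are unitary, so total intensity is conserved.

```python
import cmath, math

def dft(x):
    """Naive O(N^2) discrete Fourier transform (stand-in for an FFT)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def multislice_step(psi, phase_shift, wavelength, dz, dx):
    """One CMS slice: transmit through a phase grating, then Fresnel-propagate."""
    N = len(psi)
    # 1) transmission function: pure phase object exp(i * sigma * V_projected)
    t = [psi[n] * cmath.exp(1j * phase_shift[n]) for n in range(N)]
    # 2) propagate by dz in Fourier space with the paraxial Fresnel propagator
    T = dft(t)
    for k in range(N):
        kk = k if k < N // 2 else k - N        # wrapped frequency index
        fx = kk / (N * dx)                     # spatial frequency (1/m)
        T[k] *= cmath.exp(-1j * math.pi * wavelength * dz * fx * fx)
    return idft(T)

# Plane wave through a weak sinusoidal phase grating (illustrative numbers)
N = 32
psi0 = [1.0 + 0.0j] * N
phase = [0.3 * math.sin(2 * math.pi * n / N) for n in range(N)]
psi1 = multislice_step(psi0, phase, wavelength=4e-12, dz=2e-10, dx=2e-11)

norm0 = sum(abs(p) ** 2 for p in psi0)
norm1 = sum(abs(p) ** 2 for p in psi1)
print(f"intensity before/after slice: {norm0:.6f} / {norm1:.6f}")
```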

  15. LIFE experiment: isolation of cryptoendolithic organisms from Antarctic colonized sandstone exposed to space and simulated Mars conditions on the international space station. (United States)

    Scalzi, Giuliano; Selbmann, Laura; Zucconi, Laura; Rabbow, Elke; Horneck, Gerda; Albertano, Patrizia; Onofri, Silvano


    Desiccated Antarctic rocks colonized by cryptoendolithic communities were exposed on the International Space Station (ISS) to space and simulated Mars conditions (LiFE-Lichens and Fungi Experiment). After 1.5 years in space samples were retrieved, rehydrated and spread on different culture media. Colonies of a green alga and a pink-coloured fungus developed on Malt-Agar medium; they were isolated from a sample exposed to simulated Mars conditions beneath a 0.1 % T Suprasil neutral density filter and from a sample exposed to space vacuum without solar radiation exposure, respectively. None of the other flight samples showed any growth after incubation. The two organisms able to grow were identified at genus level by Small SubUnit (SSU) and Internal Transcribed Spacer (ITS) rDNA sequencing as Stichococcus sp. (green alga) and Acarospora sp. (lichenized fungal genus) respectively. The data in the present study provide experimental information on the possibility of eukaryotic life transfer from one planet to another by means of rocks and of survival in Mars environment.

  16. Operation and evaluation of the Terminal Configured Vehicle Mission Simulator in an automated terminal area metering and spacing ATC environment (United States)

    Houck, J. A.


    This paper describes the work being done at the National Aeronautics and Space Administration's Langley Research Center on the development of a mission simulator for use in the Terminal Configured Vehicle Program. A brief description of the goals and objectives of the Terminal Configured Vehicle Program is presented. A more detailed description of the Mission Simulator, in its present configuration, and its components is provided. Finally, a description of the first research study conducted in the Mission Simulator is presented along with a discussion of some preliminary results from this study.

  17. Bidirectional reflectance and VIS-NIR spectroscopy of cometary analogues under simulated space conditions (United States)

    Jost, Bernhard; Pommerol, Antoine; Poch, Olivier; Yoldi, Zuriñe; Fornasier, Sonia; Hasselmann, Pedro Henrique; Feller, Clément; Carrasco, Nathalie; Szopa, Cyril; Thomas, Nicolas


    This work is intended to be the second publication in a series of papers reporting on the spectro-photometric properties of cometary analogues measured in the laboratory. Herein, we provide in-situ hyperspectral imaging data in the 0.40-2.35 μm range from three sublimation experiments under simulated space conditions in thermal vacuum, from samples made of water ice, carbonaceous compounds and complex organic molecules. The dataset is complemented by measurements of the bidirectional reflectance in the visible (750 nm) spectral range before and after sublimation. A qualitative characterization of surface evolution processes is provided, as well as a description of morphological changes during the simulation experiment. The aim of these experiments is to mimic the spectrum of comet 67P/Churyumov-Gerasimenko (67P) as acquired by the Rosetta mission by applying sublimation experiments to the mixtures of water ice with a complex organic material (tholins) and carbonaceous compounds (carbon black; activated charcoal) studied in our companion publication (Jost et al., submitted). Sublimation experiments are needed to develop the particular texture (high porosity) expected on the nucleus' surface, which might have a strong influence on spectro-photometric properties. The spectrally best-matching mixtures of non-volatile organic molecules from Jost et al. (submitted) are mixed with fine-grained water ice particles and evolved in a thermal vacuum chamber, in order to monitor the influence of the sublimation process on their spectro-photometric properties. We demonstrate that the way the water ice and the non-volatile constituents are mixed plays a major role in the formation and evolution of a surface residue mantle and also influences the consolidation of the underlying ice. Additionally, it results in different activity patterns under simulated insolation cycles. Furthermore, we show that the phase curves of samples having a porous surface mantle layer

  18. Simulation of the space-time evolution of color-flux tubes (guidelines to the TERMITE program)

    International Nuclear Information System (INIS)

    Dyrek, A.


    We describe a computer program which simulates the boost-invariant evolution of color-flux tubes in high-energy processes. The program provides a graphic demonstration of the space-time trajectories of created particles and can also be used as a Monte Carlo event generator. (author)

  19. An exploration of the option space in student design projects for uncertainty and sensitivity analysis with performance simulation

    NARCIS (Netherlands)

    Struck, C.; Wilde, de P.J.C.J.; Hopfe, C.J.; Hensen, J.L.M.


    This paper describes research conducted to gather empirical evidence on the extent, character and content of the option space in building design projects, from the perspective of a climate engineer using building performance simulation for concept evaluation. The goal is to support uncertainty analysis

  20. Effect of Prolonged Simulated Microgravity on Metabolic Proteins in Rat Hippocampus: Steps toward Safe Space Travel. (United States)

    Wang, Yun; Javed, Iqbal; Liu, Yahui; Lu, Song; Peng, Guang; Zhang, Yongqian; Qing, Hong; Deng, Yulin


    Mitochondria are not only the main source of energy in cells but also produce reactive oxygen species (ROS), which result in oxidative stress during spaceflight. This oxidative stress is responsible for energy imbalances and cellular damage. In this study, a rat tail suspension model was used in individual experiments for 7 and 21 days to explore the effect of simulated microgravity (SM) on metabolic proteins in the hippocampus, a vital brain region involved in learning, memory, and navigation. A comparative ¹⁸O-labeled quantitative proteomic strategy was used to observe the differential expression of metabolic proteins. Forty-two and sixty-seven mitochondrial metabolic proteins were differentially expressed after 21 and 7 days of SM, respectively. Mitochondrial Complexes I, III, and IV, isocitrate dehydrogenase and malate dehydrogenase were down-regulated. Moreover, DJ-1 and peroxiredoxin 6, which defend against oxidative damage, were up-regulated in the hippocampus. Western blot analysis of the proteins DJ-1 and COX 5A confirmed the mass spectrometry results. Despite these changes in mitochondrial protein expression, no obvious cell apoptosis was observed after 21 days of SM. The results of this study indicate that the oxidative stress induced by SM has profound effects on metabolic proteins.

  1. Numerical simulations and analyses of temperature control loop heat pipe for space CCD camera (United States)

    Meng, Qingliang; Yang, Tao; Li, Chunlin


    As one of the key units of a space CCD camera, the CCD components' temperature range and stability affect the camera's image quality metrics. Reasonable thermal design and robust thermal control devices are needed. A temperature control loop heat pipe (TCLHP) is designed which fully meets the thermal control requirements of the CCD components. In order to study the dynamic heat and mass transfer behavior of the TCLHP, particularly in the orbital flight case, a transient numerical model is developed by using well-established empirical correlations for flow models within three-dimensional thermal modeling. The temperature control principle and the details of the mathematical model are presented. The model is used to study the operating state and the flow and heat characteristics, based upon analyses of the variations of temperature, pressure and quality under different operating modes and external heat flux variations. The results indicate that the TCLHP can satisfy the thermal control requirements of the CCD components well, and always ensures good temperature stability and uniformity. By comparison between flight data and simulated results, it is found that the model is accurate to within 1°C. The model can be used to predict and understand the transient performance of the TCLHP.
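    The temperature-control behavior described above can be caricatured with a single lumped thermal node under a simple on/off heater, a far simpler scheme than a TCLHP; all numbers below (node capacitance, sink temperature, heater power, setpoint) are invented for illustration.

```python
def simulate_node(t0, t_sink, heat_cap, conductance, q_heater, t_set, dt, steps):
    """Single lumped thermal node with a bang-bang heater (forward Euler)."""
    temps = []
    temp = t0
    for _ in range(steps):
        q = q_heater if temp < t_set else 0.0          # on/off heater control
        temp += dt * (q - conductance * (temp - t_sink)) / heat_cap
        temps.append(temp)
    return temps

# HYPOTHETICAL CCD-like numbers: 500 J/K node, 0.5 W/K link to a 270 K sink,
# 20 W heater, 293 K setpoint, 1 s time step.
history = simulate_node(t0=270.0, t_sink=270.0, heat_cap=500.0,
                        conductance=0.5, q_heater=20.0, t_set=293.0,
                        dt=1.0, steps=5000)
print(f"final temperature: {history[-1]:.2f} K")
```

    The node warms toward the setpoint and then holds it with a small ripple, the qualitative stability-and-uniformity behavior the abstract attributes to the TCLHP.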

  2. Analysis of simulated hypervelocity impacts on a titanium fuel tank from the Salyut 7 space station (United States)

    Jantou, V.; McPhail, D. S.; Chater, R. J.; Kearsley, A.


    The aim of this project was to gain a better understanding of the microstructural effects of hypervelocity impacts (HVI) in titanium alloys. We investigated a titanium fuel tank recovered from the Russian Salyut 7 space station, which was launched on April 19, 1982 before being destroyed during an uncontrolled re-entry in 1991, reportedly scattering debris over parts of South America. Several sections were cut from the tank in order to undergo HVI simulations using a two-stage light gas gun. In addition, a Ti-6Al-4V alloy was studied for further comparison. The crater morphologies produced were successfully characterised using microscope-based white light interferometry (Zygo® Corp., USA), while projectile remnants were identified via secondary ion mass spectrometry (SIMS). Microstructural alterations were investigated using focused ion beam (FIB) milling and depth profiling, as well as transmission electron microscopy (TEM). There was evidence of a very high density of dislocations in the vicinity of the crater. The extent of the deformation was localised in a region of about one to two radii of the impact craters. No notable differences were observed between the titanium alloys used during the hypervelocity impact tests.

  3. A multiprocessor computer simulation model employing a feedback scheduler/allocator for memory space and bandwidth matching and TMR processing (United States)

    Bradley, D. B.; Irwin, J. D.


    A computer simulation model for a multiprocessor computer is developed that is useful for studying the problem of matching a multiprocessor's memory space, memory bandwidth, and numbers and speeds of processors with aggregate job-set characteristics. The model assumes an input workload of a set of recurrent jobs. The model includes a feedback scheduler/allocator which attempts to improve system performance through higher memory bandwidth utilization by matching individual job requirements for space and bandwidth with space availability and estimates of bandwidth availability at the times of memory allocation. The simulation model includes provisions for specifying precedence relations among the jobs in a job set, and provisions for precedence execution of TMR (Triple Modular Redundant) and SIMPLEX (non-redundant) jobs.
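    A minimal sketch of the space-and-bandwidth matching idea, assuming a greedy admit-or-defer policy (the paper's feedback scheduler/allocator is more elaborate); the job names and demands below are made up.

```python
def schedule(jobs, total_space, total_bandwidth):
    """Greedy matcher: admit jobs while their combined memory-space and
    memory-bandwidth demands fit; defer the rest.
    Jobs are (name, space_units, bandwidth_units), bandwidth-hungry first."""
    running, deferred = [], []
    space_free, bw_free = total_space, total_bandwidth
    for name, space, bw in sorted(jobs, key=lambda j: -j[2]):
        if space <= space_free and bw <= bw_free:
            running.append(name)
            space_free -= space
            bw_free -= bw
        else:
            deferred.append(name)        # revisited at the next allocation time
    return running, deferred

# HYPOTHETICAL job set against a machine with 100 space units, 10 bw units
jobs = [("A", 40, 6), ("B", 40, 5), ("C", 30, 3), ("D", 20, 2)]
running, deferred = schedule(jobs, total_space=100, total_bandwidth=10)
print("running:", running, "deferred:", deferred)
```

    Note that job B fits in memory space but not in residual bandwidth, the kind of mismatch the feedback scheduler is designed to detect.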

  4. Virtual Environment User Interfaces to Support RLV and Space Station Simulations in the ANVIL Virtual Reality Lab (United States)

    Dumas, Joseph D., II


    Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.

  5. A Low Fidelity Simulation To Examine The Design Space For An Expendable Active Decoy (United States)


    A simulation may be able to predict the performance parameters of the system of interest (SOI) accurately. The systems engineering process utilizes the simulation developed in this thesis during the early phases of the systems acquisition process: namely, the concept exploration, concept of

  6. Comparative proteomic analysis of rice after seed ground simulated radiation and spaceflight explains the radiation effects of space environment (United States)

    Wang, Wei; Shi, Jinming; Liang, Shujian; Lei, Huang; Shenyi, Zhang; Sun, Yeqing

    In previous work, we compared the proteomic profiles of rice plants grown after seed spaceflight with ground controls by two-dimensional difference gel electrophoresis (2-D DIGE) and found that the protein expression profiles were changed after seed exposure to the space environment. Spaceflight represents a complex environmental condition in which several interacting factors such as cosmic radiation, microgravity and space magnetic fields are involved. Rice seed is in the dormant stage of plant development, showing high resistance against stresses, so the highly ionizing radiation (HZE) in space is considered the main factor causing biological effects in seeds. To further investigate the radiation effects of the space environment, we performed on-ground simulated HZE particle irradiation and compared the proteomes of seed-irradiated plants and seed-spaceflight (20th recoverable satellite) plants from the same rice variety. Space ionization shows low-dose but high-energy particle effects. In search of the particle effects, ground irradiations with the same low dose (2 mGy) but different linear energy transfer (LET) values (13.3 keV/µm C, 30 keV/µm C, 31 keV/µm Ne, 62.2 keV/µm C, 500 keV/µm Fe) were performed; using 2-D DIGE coupled with clustering and principal component analysis (PCA) for data processing and comparison, we found that the holistic protein expression patterns of plants irradiated by LET 62.2 keV/µm carbon particles were most similar to spaceflight. In addition, although the space environment presents a low-dose radiation (0.177 mGy/day on the satellite), the equivalent simulated radiation dose effects should still be evaluated: irradiations with LET 62.2 keV/µm carbon particles at different cumulative doses (2 mGy, 20 mGy, 200 mGy, 2000 mGy) were further carried out and showed that the 2 mGy radiation still shared the most similar proteomic profiles with spaceflight, confirming the low-dose effects of space radiation. Therefore, at the protein expression level

  7. Impact of Minimum Driveway Spacing Policies on Safety Performance: An Integrated Traffic Micro-Simulation and Automated Conflict Analysis

    Directory of Open Access Journals (Sweden)

    Chu C. Minh


    A key strategy for successful access management is the adoption of driveway spacing guidelines that consider both safety and operations. The goal is to provide sufficient distance from one driveway to the next so that drivers can perceive and react to the conditions at each potential conflict point in succession. State DOTs across the country have adopted different driveway spacing standards that vary according to the access class and characteristics of the adjacent roadway, such as type of roadway, posted speed limit, and traffic volume. Utilizing the VISSIM microscopic traffic simulation tool and FHWA's Surrogate Safety Assessment Model (SSAM), this research examined safety implications of four different driveway spacing policies representing 13 states. The analysis involved calibrating the VISSIM model for an arterial roadway corridor in West Columbia, SC, and then using the calibrated model to simulate various operational changes to the corridor, including speed limits, traffic volumes, and the associated minimum driveway spacing criteria for the four different policies. SSAM was used to analyze vehicle trajectories derived from VISSIM to determine the number of conflict points. Experimental results indicate that posted speed limit and traffic volume are the primary impact factors for driveway safety, and thus these parameters should be considered in establishing minimum driveway spacing. Findings from this study indicate that there are significant differences in safety impacts between the driveway spacing policies adopted by various state DOTs.

  8. Simulating Emerging Space Industries with Agent-Based Modeling, Phase I (United States)

    National Aeronautics and Space Administration — The Vision for Space Exploration (VSE) calls for encouraging commercial participation as a top-level objective. Given current and future commercial activities, how...



    Mariano Bizzarri; Enrico Saggese


    Manned space flight has been the great human and technological adventure of the past half-century. By putting people into places and situations unprecedented in history, it has stirred the imagination while expanding and redefining the human experience. However, space exploration obliges humans to confront a hostile environment of cosmic radiation, microgravity, isolation and changes in the magnetic field. Any space traveler is therefore subjected to significant health threats. In the twenty-first ...

  10. Simulation of tree shade impacts on residential energy use for space conditioning in Sacramento (United States)

    Simpson, J. R.; McPherson, E. G.

    Tree shade reduces summer air conditioning demand and increases winter heating load by intercepting solar energy that would otherwise heat the shaded structure. We evaluate the magnitude of these effects here for 254 residential properties participating in a utility-sponsored tree planting program in Sacramento, California. Tree and building characteristics and typical weather data are used to model hourly shading and energy used for space conditioning for each building for a period of one year. There were an average of 3.1 program trees per property, which reduced annual and peak (8 h average from 1 to 9 p.m. Pacific Daylight Time) cooling energy use 153 kWh (7.1%) and 0.08 kW (2.3%) per tree, respectively. Annual heating load increased 0.85 GJ (0.80 MBtu, 1.9%) per tree. Changes in cooling load were smaller, but percentage changes larger, for newer buildings. Averaged over all homes, annual cooling savings of $15.25 per tree were reduced by a heating penalty of $5.25 per tree, for net savings of $10.00 per tree from shade. We estimate an annual cooling penalty of $2.80 per tree and heating savings of $6.80 per tree from reduced wind speed, for a net savings of $4.00 per tree, and total annual savings of $14.00 per tree ($43.00 per property). Results are found to be consistent with previous simulations and the limited measurements available.
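    The per-tree figures above can be checked arithmetically (assuming the currency symbols stripped during extraction were US dollars):

```python
trees_per_property = 3.1

# Annual per-tree effects, $/tree/yr (signs: savings positive, penalties negative)
shade_cooling = 15.25    # cooling savings from shade
shade_heating = -5.25    # winter heating penalty from shade
wind_cooling  = -2.80    # cooling penalty from reduced wind speed
wind_heating  =  6.80    # heating savings from reduced wind speed

net_shade = shade_cooling + shade_heating
net_wind  = wind_cooling + wind_heating
net_tree  = net_shade + net_wind
net_property = net_tree * trees_per_property

print(f"net shade: ${net_shade:.2f}/tree, net wind: ${net_wind:.2f}/tree")
print(f"total: ${net_tree:.2f}/tree, ${net_property:.2f}/property")
```

    The per-property total comes out to $43.40, consistent with the abstract's rounded $43.00 at 3.1 trees per property.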

  11. Behavioral and biological effects of autonomous versus scheduled mission management in simulated space-dwelling groups (United States)

    Roma, Peter G.; Hursh, Steven R.; Hienz, Robert D.; Emurian, Henry H.; Gasior, Eric D.; Brinson, Zabecca S.; Brady, Joseph V.


    Logistical constraints during long-duration space expeditions will limit the ability of Earth-based mission control personnel to manage their astronaut crews and will thus increase the prevalence of autonomous operations. Despite this inevitability, little research exists regarding crew performance and psychosocial adaptation under such autonomous conditions. To this end, a newly-initiated study on crew management systems was conducted to assess crew performance effectiveness under rigid schedule-based management of crew activities by Mission Control versus more flexible, autonomous management of activities by the crews themselves. Nine volunteers formed three long-term crews and were extensively trained in a simulated planetary geological exploration task over the course of several months. Each crew then embarked on two separate 3-4 h missions in a counterbalanced sequence: Scheduled, in which the crews were directed by Mission Control according to a strict topographic and temporal region-searching sequence, and Autonomous, in which the well-trained crews received equivalent baseline support from Mission Control but were free to explore the planetary surface as they saw fit. Under the autonomous missions, performance in all three crews improved (more high-valued geologic samples were retrieved), subjective self-reports of negative emotional states decreased, unstructured debriefing logs contained fewer references to negative emotions and greater use of socially-referent language, and salivary cortisol output across the missions was attenuated. The present study provides evidence that crew autonomy may improve performance and help sustain if not enhance psychosocial adaptation and biobehavioral health. These controlled experimental data contribute to an emerging empirical database on crew autonomy which the international astronautics community may build upon for future research and ultimately draw upon when designing and managing missions.

  12. Validation of Varian TrueBeam electron phase–spaces for Monte Carlo simulation of MLC-shaped fields

    International Nuclear Information System (INIS)

    Lloyd, Samantha A. M.; Gagne, Isabelle M.; Zavgorodni, Sergei; Bazalova-Carter, Magdalena


    Purpose: This work evaluates Varian’s electron phase–space sources for Monte Carlo simulation of the TrueBeam for modulated electron radiation therapy (MERT) and combined, modulated photon and electron radiation therapy (MPERT) where fields are shaped by the photon multileaf collimator (MLC) and delivered at 70 cm SSD. Methods: Monte Carlo simulations performed with EGSnrc-based BEAMnrc/DOSXYZnrc and PENELOPE-based PRIMO are compared against diode measurements for 5 × 5, 10 × 10, and 20 × 20 cm² MLC-shaped fields delivered with 6, 12, and 20 MeV electrons at 70 cm SSD (jaws set to 40 × 40 cm²). Depth dose curves and profiles are examined. In addition, EGSnrc-based simulations of relative output as a function of MLC-field size and jaw position are compared against ion chamber measurements for MLC-shaped fields between 3 × 3 and 25 × 25 cm² and jaw positions that range from the MLC-field size to 40 × 40 cm². Results: Percent depth dose curves generated by BEAMnrc/DOSXYZnrc and PRIMO agree with measurement within 2%, 2 mm except for PRIMO’s 12 MeV, 20 × 20 cm² field where 90% of dose points agree within 2%, 2 mm. Without the distance to agreement, differences between measurement and simulation are as large as 7.3%. Characterization of simulated dose parameters such as FWHM, penumbra width and depths of 90%, 80%, 50%, and 20% dose agree within 2 mm of measurement for all fields except for the FWHM of the 6 MeV, 20 × 20 cm² field which falls within 2 mm distance to agreement. Differences between simulation and measurement exist in the profile shoulders and penumbra tails, in particular for 10 × 10 and 20 × 20 cm² fields of 20 MeV electrons, where both sets of simulated data fall short of measurement by as much as 3.5%. BEAMnrc/DOSXYZnrc simulated outputs agree with measurement within 2.3% except for 6 MeV MLC-shaped fields. Discrepancies here are as great as 5.5%. Conclusions: TrueBeam electron phase–spaces available from Varian have been
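    The "2%, 2 mm" criterion used throughout this comparison combines a dose-difference tolerance with a distance-to-agreement tolerance. A simplified 1-D gamma-style check is sketched below with a toy depth-dose curve; clinical gamma implementations interpolate between samples and work in 3-D, so this is only an illustration of the concept.

```python
def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dd_frac=0.02, dta_mm=2.0):
    """Simplified 1-D global gamma analysis: a reference point passes when
    some evaluated point lies within the combined 2%/2 mm ellipse."""
    d_norm = dd_frac * max(dose_ref)       # global dose-difference normalization
    passed = 0
    for i, dr in enumerate(dose_ref):
        best = float("inf")
        for j, de in enumerate(dose_eval):
            dose_term = (de - dr) / d_norm
            dist_term = (j - i) * spacing_mm / dta_mm
            best = min(best, (dose_term ** 2 + dist_term ** 2) ** 0.5)
        passed += best <= 1.0
    return passed / len(dose_ref)

# Toy depth-dose curve sampled every 1 mm, and a copy shifted by one sample
ref = [10, 30, 60, 90, 100, 95, 80, 60, 40, 20]
shifted = ref[1:] + [10]
exact_rate = gamma_pass_rate(ref, ref, 1.0)
shift_rate = gamma_pass_rate(ref, shifted, 1.0)
print("identical curves:", exact_rate)
print("1 mm shifted curve:", shift_rate)
```

    A 1 mm spatial shift stays almost entirely inside the 2 mm distance-to-agreement tolerance, which is why DTA-based criteria are forgiving of small positional offsets that would look like large point-by-point dose errors.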

  13. Validation of Varian TrueBeam electron phase–spaces for Monte Carlo simulation of MLC-shaped fields

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, Samantha A. M. [Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8P 3P6 5C2 (Canada); Gagne, Isabelle M., E-mail:; Zavgorodni, Sergei [Department of Medical Physics, BC Cancer Agency–Vancouver Island Centre, Victoria, British Columbia V8R 6V5, Canada and Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6 5C2 (Canada); Bazalova-Carter, Magdalena [Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6 5C2 (Canada)


    Purpose: This work evaluates Varian’s electron phase–space sources for Monte Carlo simulation of the TrueBeam for modulated electron radiation therapy (MERT) and combined, modulated photon and electron radiation therapy (MPERT) where fields are shaped by the photon multileaf collimator (MLC) and delivered at 70 cm SSD. Methods: Monte Carlo simulations performed with EGSnrc-based BEAMnrc/DOSXYZnrc and PENELOPE-based PRIMO are compared against diode measurements for 5 × 5, 10 × 10, and 20 × 20 cm² MLC-shaped fields delivered with 6, 12, and 20 MeV electrons at 70 cm SSD (jaws set to 40 × 40 cm²). Depth dose curves and profiles are examined. In addition, EGSnrc-based simulations of relative output as a function of MLC-field size and jaw position are compared against ion chamber measurements for MLC-shaped fields between 3 × 3 and 25 × 25 cm² and jaw positions that range from the MLC-field size to 40 × 40 cm². Results: Percent depth dose curves generated by BEAMnrc/DOSXYZnrc and PRIMO agree with measurement within 2%, 2 mm except for PRIMO’s 12 MeV, 20 × 20 cm² field where 90% of dose points agree within 2%, 2 mm. Without the distance to agreement, differences between measurement and simulation are as large as 7.3%. Characterization of simulated dose parameters such as FWHM, penumbra width and depths of 90%, 80%, 50%, and 20% dose agree within 2 mm of measurement for all fields except for the FWHM of the 6 MeV, 20 × 20 cm² field which falls within 2 mm distance to agreement. Differences between simulation and measurement exist in the profile shoulders and penumbra tails, in particular for 10 × 10 and 20 × 20 cm² fields of 20 MeV electrons, where both sets of simulated data fall short of measurement by as much as 3.5%. BEAMnrc/DOSXYZnrc simulated outputs agree with measurement within 2.3% except for 6 MeV MLC-shaped fields. Discrepancies here are as great as 5.5%. Conclusions: TrueBeam electron phase–spaces

  14. A New Approach to Reducing Search Space and Increasing Efficiency in Simulation Optimization Problems via the Fuzzy-DEA-BCC

    Directory of Open Access Journals (Sweden)

    Rafael de Carvalho Miranda


    The development of discrete-event simulation software was one of the most successful interfaces between operational research and computation. As a result, research has focused on the development of new methods and algorithms with the purpose of increasing simulation optimization efficiency and reliability. This study aims to define optimum variation intervals for each decision variable through a proposed approach which combines data envelopment analysis with Fuzzy logic (Fuzzy-DEA-BCC), seeking to improve the distinction of decision-making units in the face of uncertainty. In this study, Taguchi's orthogonal arrays were used to generate the necessary quantity of DMUs, and the output variables were generated by the simulation. Two study objects were utilized as examples of mono- and multi-objective problems. Results confirmed the reliability and applicability of the proposed method, as it enabled a significant reduction in search space and computational demand when compared to conventional simulation optimization techniques.

  15. Co-Simulation of Hybrid Systems with SpaceEx and Uppaal

    DEFF Research Database (Denmark)

    Bogomolov, Sergiy; Greitschus, Marius; Jensen, Peter Gjøl


    The Functional Mock-up Interface (FMI) is an industry standard which enables co-simulation of complex heterogeneous systems using multiple simulation engines. In this paper, we show how to use FMI in order to co-simulate hybrid systems modeled in the model checkers SPACEEX and UPPAAL. We show how...
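    The FMI co-simulation pattern described above, a master alternately advancing independent components over a macro step and exchanging coupling variables at the step boundaries, can be sketched with two mock components. This is a conceptual master loop, not the real FMI C API or the SPACEEX/UPPAAL tool chain; the plant/controller split and all names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Plant:
    """Mock 'FMU' integrating x' = -x + u internally with explicit Euler."""
    x: float = 1.0
    def do_step(self, u: float, h: float, substeps: int = 100) -> float:
        dt = h / substeps
        for _ in range(substeps):
            self.x += dt * (-self.x + u)
        return self.x                       # output y

@dataclass
class Controller:
    """Mock 'FMU' computing a proportional feedback u = -gain * y."""
    gain: float = 0.5
    def do_step(self, y: float, h: float) -> float:
        return -self.gain * y

def cosimulate(t_end: float = 5.0, h: float = 0.01) -> float:
    """Master algorithm: advance each component over macro step h with its
    coupling inputs frozen, then exchange outputs for the next step."""
    plant, ctrl = Plant(), Controller()
    u, t = 0.0, 0.0
    while t < t_end - 1e-12:
        y = plant.do_step(u, h)     # plant sees last step's control input
        u = ctrl.do_step(y, h)      # controller sees this step's output
        t += h
    return plant.x
```

The one-macro-step delay in the exchanged variables is the characteristic co-simulation coupling error; here the closed loop behaves approximately like x' = -1.5x, so the state decays toward zero.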

  16. Large-Scale Testing and High-Fidelity Simulation Capabilities at Sandia National Laboratories to Support Space Power and Propulsion

    International Nuclear Information System (INIS)

    Dobranich, Dean; Blanchat, Thomas K.


    Sandia National Laboratories, as a Department of Energy / National Nuclear Security Administration laboratory, has major responsibility for ensuring the safety and security of nuclear weapons. As such, with an experienced research staff, Sandia maintains a spectrum of modeling and simulation capabilities integrated with experimental and large-scale test capabilities. This expertise and these capabilities offer considerable resources for addressing issues of interest to the space power and propulsion communities. This paper presents Sandia's capability to perform thermal qualification (analysis, test, modeling, and simulation) using a representative weapon system as an example, demonstrating the potential to support NASA's Lunar Reactor System.

  17. Development and computational simulation of thermoelectric electromagnetic pumps for controlling the fluid flow in liquid metal cooled space nuclear reactors

    International Nuclear Information System (INIS)

    Borges, E.M.


    Thermoelectric electromagnetic (TEEM) pumps can be used to control the fluid flow in the primary and secondary circuits of liquid-metal-cooled space nuclear reactors. In order to simulate and evaluate pump performance in steady state, the computer program BEMTE has been developed to study the main operational parameters and to determine the system actuation point for a given reactor operating power. The results of each stage of the program were satisfactory compared to experimental data. The program proves adequate for the design and simulation of direct-current electromagnetic pumps. (author)
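    The operating point of such a pump comes from balancing the developed electromagnetic pressure against the circuit's hydraulic losses. For an idealized DC conduction pump the developed pressure follows from the J × B body force integrated along the duct, giving Δp = I·B/h, where h is the duct dimension parallel to the field. The sketch below uses this textbook idealization (no fringe, wall-shunt, or hydraulic losses), which is far simpler than the staged model in BEMTE; the function name and numbers are illustrative.

```python
def dc_pump_pressure(current_a: float, b_field_t: float, duct_height_m: float) -> float:
    """Ideal developed pressure of a DC conduction electromagnetic pump.

    The body force density is J*B; with total current I driven through a
    duct face of height h (parallel to B) and length L along the flow,
    J = I/(h*L), so the pressure rise Δp = J*B*L = I*B/h, independent of L.
    Losses are neglected entirely in this sketch.
    """
    return current_a * b_field_t / duct_height_m
```

For example, 1000 A across a 1 cm duct in a 0.1 T field yields an ideal 10 kPa of developed pressure, before any of the losses a design code must account for.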

  18. Space-charge-dominated beam dynamics simulations using the massively parallel processors (MPPs) of the Cray T3D

    International Nuclear Information System (INIS)

    Liu, H.


    Computer simulations using the multi-particle code PARMELA with a three-dimensional point-by-point space charge algorithm have turned out to be very helpful in supporting injector commissioning and operations at Thomas Jefferson National Accelerator Facility (Jefferson Lab, formerly called CEBAF). However, this algorithm, which defines a typical N² problem in CPU time scaling, is very time-consuming when N, the number of macro-particles, is large. Therefore, it is attractive to use massively parallel processors (MPPs) to speed up the simulations. Motivated by this, the authors modified the space charge subroutine for using the MPPs of the Cray T3D. The techniques used to parallelize and optimize the code on the T3D are discussed in this paper. The performance of the code on the T3D is examined in comparison with a Parallel Vector Processing supercomputer of the Cray C90 and an HP 735/15 high-end workstation
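    The point-by-point algorithm parallelizes naturally because the force on each particle is an independent sum over all N others: partition the particles into chunks and let each processing element evaluate the O(N²) sum for its own chunk. The sketch below shows that decomposition serially in NumPy (the chunk loop marks where the T3D-style distribution would go); it is an illustration of the N² structure, not PARMELA's actual subroutine.

```python
import numpy as np

def space_charge_forces(pos, q=1.0, soft=1e-3, n_chunks=4):
    """Point-by-point space-charge (Coulomb-like) forces: an O(N^2) pairwise
    sum. The outer loop over particle chunks is the axis along which work
    would be distributed, one chunk per processing element."""
    pos = np.asarray(pos, float)
    forces = np.zeros_like(pos)
    for chunk in np.array_split(np.arange(len(pos)), n_chunks):
        d = pos[chunk, None, :] - pos[None, :, :]     # (m, N, 3) separations
        r2 = (d ** 2).sum(axis=-1) + soft ** 2        # softened |r|^2
        # self-term contributes zero force since its separation vector is 0
        forces[chunk] = q * q * (d / r2[..., None] ** 1.5).sum(axis=1)
    return forces
```

Each chunk needs the full position array (an all-gather per step on a real MPP), but the force work divides cleanly, which is what made the T3D port attractive.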

  19. Application of real space Kerker method in simulating gate-all-around nanowire transistors with realistic discrete dopants*

    International Nuclear Information System (INIS)

    Li Chang-Sheng; Ma Lei; Guo Jie-Rong


    We adopt a self-consistent real-space Kerker method to prevent the divergence caused by charge sloshing when simulating transistors with realistic discrete dopants in the source and drain regions. The method achieves efficient convergence by avoiding unrealistic long-range charge sloshing while keeping effects from short-range charge sloshing. Numerical results show that discrete dopants in the source and drain regions can have a bigger influence on the electrical variability than the usual continuous doping without considering charge sloshing. The few discrete dopants and the narrow geometry create a situation with short-range Coulomb screening and oscillations of charge density in real space. The dopant-induced quasi-localized defect modes in the source region experience short-range oscillations in order to reach the drain end of the device. The charging of the defect modes and the oscillations of the charge density are identified by the simulation of the electron density. (paper)
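    Kerker preconditioning damps the long-wavelength part of the density residual, which is exactly the component responsible for charge sloshing in self-consistent field iterations. The sketch below shows the standard reciprocal-space form on a periodic 1D grid, where the residual is weighted by G²/(G² + q0²); the paper's contribution is a real-space formulation of this same idea, which is not reproduced here. Parameter values are illustrative.

```python
import numpy as np

def kerker_mix(rho_in, rho_out, alpha=0.5, q0=1.5, box=10.0):
    """Kerker-preconditioned density mixing on a periodic 1D grid.

    The residual (rho_out - rho_in) is filtered by G^2/(G^2 + q0^2),
    which -> 0 as G -> 0: long-wavelength charge transfer (sloshing) is
    suppressed while short-wavelength corrections pass through."""
    n = len(rho_in)
    g = 2 * np.pi * np.fft.fftfreq(n, d=box / n)   # wavevectors of the grid
    res = np.fft.fft(np.asarray(rho_out) - np.asarray(rho_in))
    weight = g**2 / (g**2 + q0**2)                 # 0 at G=0, ~1 at large G
    return np.asarray(rho_in) + alpha * np.real(np.fft.ifft(weight * res))
```

Feeding in a pure long-wavelength residual versus a short-wavelength one shows the filter at work: the low-G component is strongly attenuated, the high-G component mixed almost at full strength.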

  20. Thermography During Thermal Test of the Gaia Deployable Sunshield Assembly Qualification Model in the ESTEC Large Space Simulator (United States)

    Simpson, R.; Broussely, M.; Edwards, G.; Robinson, D.; Cozzani, A.; Casarosa, G.


    The National Physical Laboratory (NPL) and The European Space Research and Technology Centre (ESTEC) have performed for the first time successful surface temperature measurements using infrared thermal imaging in the ESTEC Large Space Simulator (LSS) under vacuum and with the Sun Simulator (SUSI) switched on during thermal qualification tests of the GAIA Deployable Sunshield Assembly (DSA). The thermal imager temperature measurements, with radiosity model corrections, show good agreement with thermocouple readings on well characterised regions of the spacecraft. In addition, the thermal imaging measurements identified potentially misleading thermocouple temperature readings and provided qualitative real-time observations of the thermal and spatial evolution of surface structure changes and heat dissipation during hot test loadings, which may yield additional thermal and physical measurement information through further research.
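    The radiosity-model correction mentioned above must separate the surface's own emission from background radiation reflected into the imager. A broadband graybody version of that inversion is sketched below: subtract the reflected term (1 − ε)σT_bg⁴ from the measured radiance, then invert εσT⁴. Real thermal imagers operate in a finite waveband and the LSS radiosity model is far more detailed, so this is only an assumed, simplified form; names are illustrative.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiance_to_temperature(measured_wm2: float, emissivity: float,
                            t_background_k: float) -> float:
    """Graybody surface-temperature retrieval in vacuum.

    Measured radiance is modeled as eps*sigma*T^4 (self-emission) plus
    (1-eps)*sigma*T_bg^4 (reflected background); solve for T."""
    reflected = (1.0 - emissivity) * SIGMA * t_background_k**4
    emitted = measured_wm2 - reflected
    return (emitted / (emissivity * SIGMA)) ** 0.25
```

Low-emissivity spacecraft surfaces (MLI, bare metal) make the reflected term large, which is one reason imager readings can disagree with thermocouples until the correction is applied.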

  1. Modification of the RTMTRACE program for numerical simulation of particle dynamics at racetrack microtrons with account of space charge forces

    International Nuclear Information System (INIS)

    Surma, I.V.; Shvedunov, V.I.


    The paper presents the results of modifying the RTMTRACE program for simulation of particle dynamics in cyclic accelerators. The program was modified to account for the effect of space charge forces on particle dynamics. Calculation results of particle dynamics in a 1 MeV continuous-duty accelerator with a 10 kW beam were used to develop a powerful continuous-operation commercial accelerator. 3 refs., 2 figs

  2. Distribution function approach to redshift space distortions. Part II: N-body simulations

    International Nuclear Information System (INIS)

    Okumura, Teppei; Seljak, Uroš; McDonald, Patrick; Desjacques, Vincent


    Measurement of redshift-space distortions (RSD) offers an attractive method to directly probe the cosmic growth history of density perturbations. A distribution function approach where RSD can be written as a sum over density weighted velocity moment correlators has recently been developed. In this paper we use results of N-body simulations to investigate the individual contributions and convergence of this expansion for dark matter. If the series is expanded as a function of powers of μ, cosine of the angle between the Fourier mode and line of sight, then there are a finite number of terms contributing at each order. We present these terms and investigate their contribution to the total as a function of wavevector k. For μ² the correlation between density and momentum dominates on large scales. Higher order corrections, which act as a Finger-of-God (FoG) term, contribute 1% at k ∼ 0.015 h Mpc⁻¹ and 10% at k ∼ 0.05 h Mpc⁻¹ at z = 0, while for k > 0.15 h Mpc⁻¹ they dominate and make the total negative. These higher order terms are dominated by density-energy density correlations, which contribute negatively to the power, while the contribution from the vorticity part of the momentum density auto-correlation adds to the total power but is an order of magnitude lower. For the μ⁴ term the dominant term on large scales is the scalar part of the momentum density auto-correlation, while higher order terms dominate for k > 0.15 h Mpc⁻¹. For μ⁶ and μ⁸ we find very little power for k < … h Mpc⁻¹, shooting up by 2–3 orders of magnitude between k ∼ … h Mpc⁻¹ and k ∼ … h Mpc⁻¹. We also compare the expansion to the full 2D P_ss(k, μ), as well as to the monopole, quadrupole, and hexadecapole integrals of P_ss(k, μ). For these statistics an infinite number of terms contribute, and we find that the expansion achieves percent-level accuracy for kμ < … h Mpc⁻¹ at 6th order, but breaks down on smaller scales because the series is no longer perturbative. We explore resummation of the terms into FoG
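    The monopole, quadrupole, and hexadecapole integrals referred to above are Legendre projections of P(k, μ): P_ℓ(k) = (2ℓ+1)/2 ∫₋₁¹ P(k, μ) L_ℓ(μ) dμ. A minimal quadrature sketch (not the paper's estimator) is below; for a Kaiser-form P(μ) = (1 + βμ²)² the multipoles have the well-known closed forms used in the test.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

def multipole(p_of_mu, ell, n_quad=32):
    """Legendre multipole of an anisotropic power spectrum at fixed k:
    P_ell = (2*ell + 1)/2 * Int_{-1}^{1} P(mu) * L_ell(mu) dmu,
    evaluated by Gauss-Legendre quadrature (exact for polynomial P)."""
    mu, w = leggauss(n_quad)
    basis = legval(mu, [0.0] * ell + [1.0])   # Legendre polynomial L_ell(mu)
    return 0.5 * (2 * ell + 1) * np.sum(w * basis * p_of_mu(mu))
```

For linear theory with (1 + βμ²)² the monopole factor is 1 + 2β/3 + β²/5 and the quadrupole factor 4β/3 + 4β²/7, which the quadrature reproduces to machine precision.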

  3. Suited versus unsuited analog astronaut performance using the Aouda.X space suit simulator: the DELTA experiment of MARS2013. (United States)

    Soucek, Alexander; Ostkamp, Lutz; Paternesi, Roberta


    Space suit simulators are used for extravehicular activities (EVAs) during Mars analog missions. Flight planning and EVA productivity require accurate time estimates of activities to be performed with such simulators, such as experiment execution or traverse walking. We present a benchmarking methodology for the Aouda.X space suit simulator of the Austrian Space Forum. By measuring and comparing the times needed to perform a set of 10 test activities with and without Aouda.X, an average time delay was derived in the form of a multiplicative factor. This statistical value (a second-over-second time ratio) is 1.30 and shows that operations in Aouda.X take on average a third longer than the same operations without the suit. We also show that activities predominantly requiring fine motor skills are associated with larger time delays (between 1.17 and 1.59) than those requiring short-distance locomotion or short-term muscle strain (between 1.10 and 1.16). The results of the DELTA experiment performed during the MARS2013 field mission increase analog mission planning reliability and thus EVA efficiency and productivity when using Aouda.X.

  4. Demonstration of Self-Training Autonomous Neural Networks in Space Vehicle Docking Simulations (United States)

    Patrick, M. Clinton; Thaler, Stephen L.; Stevenson-Chavis, Katherine


    Neural Networks have been under examination for decades in many areas of research, with varying degrees of success and acceptance. Key goals of computer learning, rapid problem solution, and automatic adaptation have been elusive at best. This paper summarizes efforts at NASA's Marshall Space Flight Center harnessing such technology to autonomous space vehicle docking for the purpose of evaluating applicability to future missions.

  5. Role of collective effects in dominance of scattering off thermal ions over Langmuir wave decay: Analysis, simulations, and space applications

    International Nuclear Information System (INIS)

    Cairns, Iver H.


    Langmuir waves driven to high levels by beam instabilities are subject to nonlinear processes, including the closely related processes of scattering off thermal ions (STI) and a decay process in which the ion response is organized into a product ion acoustic wave. Calculations of the nonlinear growth rates predict that the decay process should always dominate STI, creating two paradoxes. The first is that three independent computer simulation studies show STI proceeding, with no evidence for the decay at all. The second is that observations in space of type III solar radio bursts and Earth's foreshock, which the simulations were intended to model, show evidence for the decay proceeding but no evidence for STI. Resolutions to these paradoxes follow from the realization that a nonlinear process cannot proceed when its growth rate exceeds the minimum frequency of the participating waves, since the required collective response cannot be maintained and the waves cannot respond appropriately, and that a significant number of e-foldings and wave periods must be contained in the time available. It is shown that application of these "collective" and "time scale" constraints to the simulations explains why the decay does not proceed in them, as well as why STI proceeds in specific simulations. This appears to be the first demonstration that collective constraints are important in understanding nonlinear phenomena. Furthermore, applying these constraints to space observations, it is predicted that the decay should proceed (and dominate STI) in type III sources and the high beam speed regions of Earth's foreshock for a specific range of wave levels, with a possible role for STI alone at slightly higher wave levels. Deeper in the foreshock, for slower beams and weaker wave levels, the decay and STI are predicted to become ineffective. Suggestions are given for future testing of the collective constraint and an explanation for why waves in space are usually much weaker than
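    The two constraints invoked above reduce to simple inequalities: the growth rate must stay below the minimum frequency of the participating waves (collective constraint), and enough e-foldings must fit in the available time (time-scale constraint). A sketch of that decision logic is below; the e-folding threshold is an illustrative choice, not a value from the paper.

```python
def process_allowed(growth_rate: float, min_wave_freq: float,
                    available_time: float, n_efolds: float = 10.0) -> bool:
    """Check whether a nonlinear wave process can proceed.

    collective: the required collective response cannot be maintained if the
        growth rate exceeds the lowest participating wave frequency.
    timescale: the process needs of order n_efolds growth e-foldings
        (n_efolds is an illustrative threshold) within the time available.
    """
    collective = growth_rate < min_wave_freq
    timescale = growth_rate * available_time > n_efolds
    return collective and timescale
```

This captures why a nominally faster process (larger growth rate) can be forbidden while a slower one proceeds, which is the resolution of both paradoxes described in the abstract.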

  6. Sensitivity analysis on the interfacial drag in SPACE code to simulate UPTF separate effect test about loop seal clearance phenomenon

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sukho; Lim, Sanggyu; You, Gukjong; Park, Youngsheop [Korea Hydro and Nuclear Power Company, Ltd., Daejeon (Korea, Republic of)


    The nuclear thermal-hydraulic system code SPACE (Safety and Performance Analysis CodE) was developed, and its V&V (verification and validation) has been conducted using well-known SETs (separate effect tests) and IETs (integral effect tests). At the same time, the SBLOCA (small-break loss-of-coolant accident) methodology in accordance with Appendix K of 10CFR50 for the APR1400 (Advanced Power Reactor 1400) was developed and submitted to the regulatory body for licensing in 2013. Notably, the SBLOCA methodology developed using the SPACE v2.14 code adopts its own test matrix, independent of the V&V tests, to show its conservatism for important phenomena. In this paper, the predictability of the SPACE code for the UPTF (Upper Plenum Test Facility) test simulating loop seal clearance, an important SBLOCA phenomenon, and the related sensitivity analysis are introduced.

  7. Parallel Finite Element Particle-In-Cell Code for Simulations of Space-charge Dominated Beam-Cavity Interactions

    International Nuclear Information System (INIS)

    Candel, A.; Kabel, A.; Ko, K.; Lee, L.; Li, Z.; Limborg, C.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.


    Over the past years, SLAC's Advanced Computations Department (ACD) has developed the parallel finite element (FE) particle-in-cell code Pic3P (Pic2P) for simulations of beam-cavity interactions dominated by space-charge effects. As opposed to standard space-charge dominated beam transport codes, which are based on the electrostatic approximation, Pic3P (Pic2P) includes space-charge, retardation and boundary effects as it self-consistently solves the complete set of Maxwell-Lorentz equations using higher-order FE methods on conformal meshes. Use of efficient, large-scale parallel processing allows for the modeling of photoinjectors with unprecedented accuracy, aiding the design and operation of the next-generation of accelerator facilities. Applications to the Linac Coherent Light Source (LCLS) RF gun are presented

  8. Space-resolved characterization of high frequency atmospheric-pressure plasma in nitrogen, applying optical emission spectroscopy and numerical simulation

    International Nuclear Information System (INIS)

    Rajasekaran, Priyadarshini; Ruhrmann, Cornelia; Bibinov, Nikita; Awakowicz, Peter


    Averaged plasma parameters such as electron distribution function and electron density are determined by characterization of high frequency (2.4 GHz) nitrogen plasma using both experimental methods, namely optical emission spectroscopy (OES) and microphotography, and numerical simulation. Both direct and step-wise electron-impact excitation of nitrogen emissions are considered. The determination of space-resolved electron distribution function, electron density, rate constant for electron-impact dissociation of nitrogen molecule and the production of nitrogen atoms, applying the same methods, is discussed. Spatial distribution of intensities of neutral nitrogen molecule and nitrogen molecular ion from the microplasma is imaged by a CCD camera. The CCD images are calibrated using the corresponding emissions measured by absolutely calibrated OES, and are then subjected to inverse Abel transformation to determine space-resolved intensities and other parameters. The space-resolved parameters are compared, respectively, with the averaged parameters, and an agreement between them is established. (paper)
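    The inverse Abel transformation used above converts line-of-sight-integrated camera intensities into radially resolved emissivities for an axisymmetric plasma. One simple discretization of that inversion is onion peeling: split the column into concentric shells of constant emissivity, so each projection is a chord-length-weighted sum and the shell values follow from a triangular solve. This is a generic sketch of the technique, not the procedure used in the paper.

```python
import numpy as np

def onion_peel_matrix(n: int, dr: float = 1.0) -> np.ndarray:
    """Chord-length matrix A for an axisymmetric emitter divided into n
    annular shells of width dr: projection P_i = sum_j A[i, j] * e_j.
    A is upper triangular, so emissivities e follow from one solve."""
    r = np.arange(n + 1) * dr      # shell edges
    y = np.arange(n) * dr          # chord heights (inner edge of each shell)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):      # a chord at height y_i crosses shells j >= i
            A[i, j] = 2.0 * (np.sqrt(r[j + 1]**2 - y[i]**2)
                             - np.sqrt(max(r[j]**2 - y[i]**2, 0.0)))
    return A
```

For a uniform emitter the chord sums telescope to the exact disk projection 2·sqrt(R² − y²), and the inversion recovers the flat profile, a convenient sanity check before applying the matrix to calibrated CCD line-outs.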

  9. De-individualized psychophysiological strain assessment during a flight simulation test—Validation of a space methodology (United States)

    Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen

    For the evaluation of an operator's skill reliability indicators of work quality as well as of psychophysiological states during the work have to be considered. The herein presented methodology and measurement equipment were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. However, in this study the method was applied to a comparable terrestrial task—the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for a statistical verification of the space methodology. For the evaluation of the strain level of applicants during the FST psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV) which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load. Its changes and increases will be interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT) and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method will possibly promote a wide range of other future applications in aviation and space psychology.

  10. Visualization of simulated urban spaces: inferring parameterized generation of streets, parcels, and aerial imagery. (United States)

    Vanegas, Carlos A; Aliaga, Daniel G; Benes, Bedrich; Waddell, Paul


    Urban simulation models and their visualization are used to help regional planning agencies evaluate alternative transportation investments, land use regulations, and environmental protection policies. Typical urban simulations provide spatially distributed data about number of inhabitants, land prices, traffic, and other variables. In this article, we build on a synergy of urban simulation, urban visualization, and computer graphics to automatically infer an urban layout for any time step of the simulation sequence. In addition to standard visualization tools, our method gathers data of the original street network, parcels, and aerial imagery and uses the available simulation results to infer changes to the original urban layout and produce a new and plausible layout for the simulation results. In contrast with previous work, our approach automatically updates the layout based on changes in the simulation data and thus can scale to a large simulation over many years. The method in this article offers a substantial step forward in building integrated visualization and behavioral simulation systems for use in community visioning, planning, and policy analysis. We demonstrate our method on several real cases using a 200 GB database for a 16,300 km² area surrounding Seattle.

  11. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    Directory of Open Access Journals (Sweden)

    Xueli Chen


    Full Text Available During the past decade, the Monte Carlo method has found wide application in optical imaging to simulate the photon transport process inside tissues. However, this method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, which consists of the simulation of photon transport both in tissues and in free space. Specifically, the simplification theory of lens systems is utilized to model the camera lens equipped in the optical imaging system, and the Monte Carlo method is employed to describe the energy transformation from the tissue surface to the CCD camera. Also, the focusing effect of the camera lens is considered to establish the relationship of corresponding points between the tissue surface and the CCD camera. Furthermore, a parallel version of the framework is realized, making the simulation much more convenient and effective. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results.
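    The first step of any free-space photon transport model is deciding which photons leaving the tissue surface reach the lens aperture at all. A minimal Monte Carlo sketch of that geometric acceptance is below, sampling isotropic emission directions and counting hits on a circular aperture on the optical axis; the full framework's lens-focusing and surface-to-CCD mapping are not modeled here, and all names are illustrative.

```python
import numpy as np

def lens_acceptance_mc(aperture_radius: float, distance: float,
                       n: int = 200_000, seed: int = 1) -> float:
    """Fraction of photons emitted isotropically (over 4*pi) from a surface
    point on the optical axis that reach a circular lens aperture.

    For a centered circular aperture the azimuth is irrelevant: a direction
    hits the lens iff cos(theta) > d / sqrt(d^2 + a^2)."""
    rng = np.random.default_rng(seed)
    cos_t = rng.uniform(-1.0, 1.0, n)     # isotropic sampling of cos(theta)
    cos_max = distance / np.hypot(distance, aperture_radius)
    return float(np.mean(cos_t > cos_max))
```

The estimate converges to the analytic solid-angle fraction (1 − cosθ_max)/2, which makes this a useful unit test before adding lens refraction and CCD binning on top.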

  12. Simulated non-contact atomic force microscopy for GaAs surfaces based on real-space pseudopotentials

    International Nuclear Information System (INIS)

    Kim, Minjung; Chelikowsky, James R.


    We simulate non-contact atomic force microscopy (AFM) with a GaAs(1 1 0) surface using a real-space ab initio pseudopotential method. While most ab initio simulations include an explicit model for the AFM tip, our method does not introduce the tip modeling step. This approach results in a considerable reduction of computational work, and also provides complete AFM images, which can be directly compared to experiment. By analyzing tip-surface interaction forces in both our results and previous ab initio simulations, we find that our method provides very similar force profile to the pure Si tip results. We conclude that our method works well for systems in which the tip is not chemically active.

  13. High-Fidelity Space-Time Adaptive Multiphysics Simulations in Nuclear Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Solin, Pavel [Univ. of Reno, NV (United States); Ragusa, Jean [Texas A & M Univ., College Station, TX (United States)


    We delivered a series of fundamentally new computational technologies that have the potential to significantly advance the state-of-the-art of computer simulations of transient multiphysics nuclear reactor processes. These methods were implemented in the form of a C++ library, and applied to a number of multiphysics coupled problems relevant to nuclear reactor simulations.

  14. High-Fidelity Space-Time Adaptive Multiphysics Simulations in Nuclear Engineering

    International Nuclear Information System (INIS)

    Solin, Pavel; Ragusa, Jean


    We delivered a series of fundamentally new computational technologies that have the potential to significantly advance the state-of-the-art of computer simulations of transient multiphysics nuclear reactor processes. These methods were implemented in the form of a C++ library, and applied to a number of multiphysics coupled problems relevant to nuclear reactor simulations.

  15. Using Blackboard Wiki Pages as a Shared Space for Simulating the Professional Translation Work Environment (United States)

    Vine, Juliet


    The Work-Integrated Simulation for Translators module is part of a three year undergraduate degree in translation. The semester long module aims to simulate several aspects of the translation process using the Blackboard virtual learning environment's Wikis as the interface for completing translation tasks. For each translation task, one of the…

  16. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems, Phase I (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a computational framework for high performance, high fidelity computational fluid dynamics (CFD) to enable accurate, fast and robust...

  17. Advanced Simulation Framework for Design and Analysis of Space Propulsion Systems, Phase II (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a high-performance, high-fidelity framework in the computational fluid dynamics (CFD) code called Loci-STREAM to enable accurate,...

  18. Ion Irradiation Experiments on the Murchison CM2 Carbonaceous Chondrite: Simulating Space Weathering of Primitive Asteroids (United States)

    Keller, L. P.; Christoffersen, R.; Dukes, C. A.; Baragiola, R. A.; Rahman, Z.


    Remote sensing observations show that space weathering processes affect all airless bodies in the Solar System to some degree. Sample analyses and lab experiments provide insights into the chemical, spectroscopic and mineralogic effects of space weathering and aid in the interpretation of remote-sensing data. For example, analyses of particles returned from the S-type asteroid Itokawa by the Hayabusa mission revealed that space-weathering on that body was dominated by interactions with the solar wind acting on LL ordinary chondrite-like materials [1, 2]. Understanding and predicting how the surface regoliths of primitive carbonaceous asteroids respond to space weathering processes is important for future sample return missions (Hayabusa 2 and OSIRIS-REx) that are targeting objects of this type. Here, we report the results of our preliminary ion irradiation experiments on a hydrated carbonaceous chondrite with emphasis on microstructural and infrared spectral changes.

  19. An Improved Treatment of AC Space Charge Fields in Large Signal Simulation Codes

    National Research Council Canada - National Science Library

    Dialetis, D; Chernin, D; Antonsen, Jr., T. M; Levush, B


    An accurate representation of the AC space charge electric field is required in order to be able to predict the performance of linear beam tubes, including TWT's and klystrons, using a steady state...

  20. Goal driven kinematic simulation of flexible arm robot for space station missions (United States)

    Janssen, P.; Choudry, A.


    Flexible arms offer a great degree of flexibility in maneuvering in the space environment. The problem of transporting an astronaut for extra-vehicular activity using a space station based flexible arm robot was studied. Inverse kinematic solutions of the multilink structure were developed. The technique is goal driven and can support decision making for configuration selection as required for stability and obstacle avoidance. Details of this technique and results are given.
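    Inverse kinematics for a multilink flexible arm is considerably more involved than the rigid case, but the core idea, solving joint angles backward from a goal position, is easiest to see in the closed-form solution for a rigid planar two-link arm. The sketch below is that standard textbook solution (elbow-down branch), included only to illustrate goal-driven kinematics; link lengths and names are illustrative.

```python
import math

def two_link_ik(x: float, y: float, l1: float = 1.0, l2: float = 1.0):
    """Closed-form inverse kinematics of a planar two-link arm.

    Law of cosines gives the elbow angle; the shoulder angle is the target
    bearing minus the wrist offset. Returns the elbow-down solution."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def forward(q1: float, q2: float, l1: float = 1.0, l2: float = 1.0):
    """Forward kinematics, used to verify an IK solution."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))
```

For redundant multilink arms the map is underdetermined, which is precisely what lets a goal-driven solver prefer configurations that satisfy stability and obstacle-avoidance criteria, as described in the abstract.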

  1. End-to-end simulations and planning of a small space telescope: Galaxy Evolution Spectroscopic Explorer: a case study (United States)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd


    Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than that of large telescopes driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target, while minimizing operation costs including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  2. SAFSIM: A computer program for engineering simulations of space reactor system performance

    International Nuclear Information System (INIS)

    Dobranich, D.


    SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program that provides engineering simulations of user-specified flow networks at the system level. It includes fluid mechanics, heat transfer, and reactor dynamics capabilities. SAFSIM provides sufficient versatility to allow the simulation of almost any flow system, from a backyard sprinkler system to a clustered nuclear reactor propulsion system. In addition to versatility, speed and robustness are primary goals of SAFSIM. The current capabilities of SAFSIM are summarized, and some illustrative example results are presented

  3. Multiple Hypothesis Tracking (MHT) for Space Surveillance: Results and Simulation Studies (United States)

    Singh, N.; Poore, A.; Sheaff, C.; Aristoff, J.; Jah, M.


    With the anticipated installation of more accurate sensors and the increased probability of future collisions between space objects, the potential number of observable space objects is likely to increase by an order of magnitude within the next decade, thereby placing an ever-increasing burden on current operational systems. Moreover, the need to track closely-spaced objects due, for example, to breakups as illustrated by the recent Chinese ASAT test or the Iridium-Kosmos collision, requires new, robust, and autonomous methods for space surveillance to enable the development and maintenance of the present and future space catalog and to support the overall space surveillance mission. The problem of correctly associating a stream of uncorrelated tracks (UCTs) and uncorrelated optical observations (UCOs) into common objects is critical to mitigating the number of UCTs and is a prerequisite to subsequent space catalog maintenance. Presently, such association operations are mainly performed using non-statistical simple fixed-gate association logic. In this paper, we report on the salient features and the performance of a newly-developed statistically-robust system-level multiple hypothesis tracking (MHT) system for advanced space surveillance. The multiple-frame assignment (MFA) formulation of MHT, together with supporting astrodynamics algorithms, provides a new joint capability for space catalog maintenance, UCT/UCO resolution, and initial orbit determination. The MFA-MHT framework incorporates multiple hypotheses for report to system track data association and uses a multi-arc construction to accommodate recently developed algorithms for multiple hypothesis filtering (e.g., AEGIS, CAR-MHF, UMAP, and MMAE). This MHT framework allows us to evaluate the benefits of many different algorithms ranging from single- and multiple-frame data association to filtering and uncertainty quantification. In this paper, it will be shown that the MHT system can provide superior
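    At the heart of any tracking system like the one described, MHT included, sits a gated data-association step: observations beyond a statistical gate are forbidden, and the remaining track-observation pairings are chosen to minimize total cost. The sketch below shows that single-frame core with a brute-force search over pairings, a drastic simplification of multiple-frame MHT/MFA (no hypothesis trees, no deferred decisions); the gate value and names are illustrative.

```python
import itertools
import numpy as np

def gated_assignment(tracks, obs, gate=3.0):
    """Single-frame global-nearest-neighbor association.

    Builds a track-to-observation distance matrix, forbids pairs beyond
    the gate, and returns the feasible pairing of minimum total distance
    (obs index per track), or None if no feasible pairing exists."""
    d = np.linalg.norm(tracks[:, None, :] - obs[None, :, :], axis=-1)
    d = np.where(d <= gate, d, np.inf)            # gating
    best, best_cost = None, np.inf
    for perm in itertools.permutations(range(len(obs)), len(tracks)):
        cost = sum(d[i, j] for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best
```

Production systems replace the brute-force loop with an assignment solver and, in MHT, keep several competing pairings alive across frames, which is what resolves closely spaced objects that defeat simple fixed-gate logic.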

  4. The Survival and Resistance of Halobacterium salinarum NRC-1, Halococcus hamelinensis, and Halococcus morrhuae to Simulated Outer Space Solar Radiation. (United States)

    Leuko, S; Domingos, C; Parpart, A; Reitz, G; Rettberg, P


    Solar radiation is among the most prominent stress factors organisms face during space travel and possibly on other planets. Our analysis of three different halophilic archaea, namely Halobacterium salinarum NRC-1, Halococcus morrhuae, and Halococcus hamelinensis, which were exposed to simulated solar radiation in either dried or liquid state, showed tremendous differences in tolerance and survivability. We found that Hcc. hamelinensis is not able to withstand high fluences of simulated solar radiation compared to the other tested organisms. These results can be correlated to significant differences in genomic integrity following exposure, as visualized by random amplified polymorphic DNA (RAPD)-PCR. In contrast to the other two tested strains, Hcc. hamelinensis accumulates compatible solutes such as trehalose for osmoprotection. The addition of 100 mM trehalose to the growth medium of Hcc. hamelinensis improved its survivability following exposure. Exposure of cells in liquid at different temperatures suggests that Hbt. salinarum NRC-1 is actively repairing cellular and DNA damage during exposure, whereas Hcc. morrhuae exhibits no difference in survival. For Hcc. morrhuae, the high resistance against simulated solar radiation may be explained with the formation of cell clusters. Our experiments showed that these clusters shield cells on the inside against simulated solar radiation, which results in better survival rates at higher fluences when compared to Hbt. salinarum NRC-1 and Hcc. hamelinensis. Overall, this study shows that some halophilic archaea are highly resistant to simulated solar radiation and that they are of high astrobiological significance. Halophiles-Solar radiation-Stress resistance-Survival.

  5. Simulation of DNA Damage in Human Cells from Space Radiation Using a Physical Model of Stochastic Particle Tracks and Chromosomes (United States)

    Ponomarev, Artem; Plante, Ianik; Hada, Megumi; George, Kerry; Wu, Honglu


    The formation of double-strand breaks (DSBs) and chromosomal aberrations (CAs) is of great importance in radiation research and, specifically, in space applications. We present a recently developed model in which chromosomes simulated by NASARTI (NASA Radiation Tracks Image) are combined with nanoscopic dose calculations performed with the Monte Carlo simulation code RITRACKS (Relativistic Ion Tracks) in a voxelized space. The model produces the number of DSBs as a function of dose for high-energy iron, oxygen, carbon, and helium ions. The combined model calculates yields of radiation-induced CAs and unrejoined chromosome breaks in normal and repair-deficient cells. The merged computational model is calibrated using the relative frequencies and distributions of chromosomal aberrations reported in the literature. The model considers fractionated deposition of energy to approximate the dose rates of the spaceflight environment. The merged model also predicts the yields and sizes of translocations, dicentrics, rings, and more complex-type aberrations formed in the G0/G1 cell cycle phase during the first cell division after irradiation.
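The dose dependence of DSB yields described above is often summarized, at low LET, by a simple Poisson model with a mean that grows linearly with dose. The sketch below is a toy stand-in for intuition only; the yield value and cell count are assumptions, not NASARTI/RITRACKS outputs:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_dsb_counts(dose_gy, yield_per_gy=35.0, n_cells=1000):
    """Draw per-cell DSB counts from a Poisson distribution whose mean
    grows linearly with dose.  yield_per_gy ~ 35 DSB/Gy is a commonly
    cited low-LET figure for human cells, used here as an assumption."""
    return rng.poisson(yield_per_gy * dose_gy, size=n_cells)

counts = simulate_dsb_counts(1.0)    # 1 Gy across 1000 simulated cells
print(counts.mean())                 # sample mean, close to 35 DSBs per cell
```

Track-structure codes such as RITRACKS replace this single linear yield with energy deposition computed along stochastic particle tracks, which is what makes high-LET ions deviate from simple Poisson behavior.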

  6. Simulations

    CERN Document Server

    Ngada, Narcisse


    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding of how systems really operate. This paper helps the reader gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The conclusion then summarizes the most important items to keep in mind before opting for a simulation tool or before performing a simulation.

  7. Simulation of the 23 July 2012 Extreme Space Weather Event: What if This Extremely Rare CME Was Earth Directed? (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Mays, M. Leila; Kuznetsova, Maria M.; Galvin, A. B.; Simunac, Kristin; Baker, Daniel N.; Li, Xinlin; Zheng, Yihua; Glocer, Alex


    Extreme space weather events are known to cause adverse impacts on critical modern-day technological infrastructure such as high-voltage electric power transmission grids. On 23 July 2012, NASA's Solar Terrestrial Relations Observatory-Ahead (STEREO-A) spacecraft observed in situ an extremely fast coronal mass ejection (CME) that traveled 0.96 astronomical units (approx. 1 AU) in about 19 h. Here we use the Space Weather Modeling Framework (SWMF) to perform a simulation of this rare CME. We use the STEREO-A in situ observations to represent the upstream L1 solar wind boundary conditions. The goal of this study is to examine what would have happened if this rare CME had been Earth-bound. Global SWMF-generated ground geomagnetic field perturbations are used to compute the simulated induced geoelectric field at specific ground-based active INTERMAGNET magnetometer sites. Simulation results show that, while the modeled global SYM-H index, a high-resolution equivalent of the Dst index, was comparable to previously observed severe geomagnetic storms such as the Halloween 2003 storm, the 23 July CME would have produced some of the largest geomagnetically induced electric fields, making it very geoeffective. These results have important practical applications for risk management of electrical power grids.
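The step from simulated ground magnetic perturbations to an induced geoelectric field, mentioned above, is commonly computed with the plane-wave method. A minimal frequency-domain sketch under the simplifying assumption of a uniform half-space (the 1000 ohm-m resistivity is an assumed value; operational studies use layered ground conductivity models):

```python
import numpy as np

def geoelectric_field(db_dt, dt, rho=1000.0):
    """Plane-wave estimate of the horizontal geoelectric field (V/m) from
    a dB/dt time series (T/s) sampled every dt seconds, over a uniform
    half-space of resistivity rho (ohm-m): E(w) = Z(w) H(w), with
    surface impedance Z = sqrt(i*w*mu0*rho)."""
    mu0 = 4e-7 * np.pi
    n = len(db_dt)
    B = np.fft.rfft(np.cumsum(db_dt) * dt)       # integrate to B(t), transform
    omega = 2 * np.pi * np.fft.rfftfreq(n, dt)
    Z = np.sqrt(1j * omega * mu0 * rho)          # surface impedance
    return np.fft.irfft(Z * B / mu0, n)          # H = B/mu0, E = Z*H

# A 100 nT/s sinusoidal disturbance sampled at 1 s for ~17 minutes:
e = geoelectric_field(1e-7 * np.sin(2 * np.pi * np.arange(1024) / 120.0), 1.0)
print(abs(e).max() > 0)  # → True
```

Because Z grows like sqrt(omega), rapid dB/dt fluctuations such as those driven by an extreme CME dominate the induced field, which is why such events are so geoeffective for power grids.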

  8. Simulation of total loss of feed water in ATLAS test facility using SPACE code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Minhee; Kim, Seyun [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of). Central Research Inst.


    A total loss of feedwater (TLOFW) with additional failures in the ATLAS test facility was analyzed using the SPACE code, an advanced thermal-hydraulic system analysis code developed by the Korean nuclear industry. Partial failure of the safety injection pumps (SIPs) and the pilot-operated safety relief valves (POSRVs) of the pressurizer were selected as the additional failures. In order to assess the capability of the SPACE code, the partial failures were modeled and the results compared with the OECD-ATLAS A3.1 results. Reasonably good agreement in the major thermal-hydraulic parameters was obtained in the analysis of the transient behavior. The results indicate that the SPACE code is capable of analyzing design extension conditions and that feed-and-bleed operation using the POSRVs and SIPs is effective for maintaining RCS cooling capability during a TLOFW.


    International Nuclear Information System (INIS)



    The NASA Space Radiation Laboratory (NSRL) was constructed in collaboration with NASA for the purpose of performing radiation effect studies for the NASA space program. The NSRL makes use of heavy ions in the range of 0.05 to 3 GeV/n slow-extracted from BNL's AGS Booster. NASA is interested in reproducing the energy spectrum from a solar flare in the space environment for a single ion species. To do this we have built and tested a set of software tools which allow the state of the Booster and the NSRL beam line to be changed automatically. In this report we describe the system and present results of beam tests.

  10. Simulation of a cascaded longitudinal space charge amplifier for coherent radiation generation

    Energy Technology Data Exchange (ETDEWEB)

    Halavanau, A., E-mail: [Department of Physics and Northern Illinois, Center for Accelerator & Detector Development, Northern Illinois University, DeKalb, IL 60115 (United States); Accelerator Physics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Piot, P. [Department of Physics and Northern Illinois, Center for Accelerator & Detector Development, Northern Illinois University, DeKalb, IL 60115 (United States); Accelerator Physics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States)


    Longitudinal space charge (LSC) effects are generally considered harmful in free-electron lasers, as they can seed unfavorable energy modulations that result in density modulations with associated emittance dilution. This “micro-bunching instability” is naturally broadband and could possibly support the generation of coherent radiation over a broad region of the spectrum. There has therefore been increasing interest in devising accelerator beam lines capable of controlling LSC-induced density modulations. In the present paper we refine previous investigations by combining a grid-less space charge algorithm with the popular particle-tracking program ELEGANT. This high-fidelity model of the space charge is used to benchmark conventional LSC models. We finally employ the developed model to investigate the performance of a cascaded LSC amplifier using beam parameters comparable to those achievable at the Fermilab Accelerator Science & Technology (FAST) facility, currently under commissioning at Fermilab.

  11. Phase space simulation of collisionless stellar systems on the massively parallel processor

    International Nuclear Information System (INIS)

    White, R.L.


    A numerical technique for solving the collisionless Boltzmann equation describing the time evolution of a self-gravitating fluid in phase space was implemented on the Massively Parallel Processor (MPP). The code performs calculations for a two-dimensional phase space grid (with one space and one velocity dimension). Some results from these calculations are presented. The execution speed of the code is comparable to that of a single processor of a Cray X-MP. Advantages and disadvantages of the MPP architecture for this type of problem are discussed. The nearest-neighbor connectivity of the MPP array does not pose a significant obstacle. Future MPP-like machines should have much more local memory and easier access to staging memory and disks in order to be effective for this type of problem.
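Phase-space codes of the kind described above typically alternate free-streaming and acceleration steps on the (x, v) grid; in the streaming half-step, each velocity row simply advects in x, which is what maps well onto a massively parallel array. A minimal sketch of that half-step (toy grid sizes, and a nearest-cell shift in place of the interpolation a real code would use):

```python
import numpy as np

def advect_x(f, v, dt, dx):
    """One operator-split streaming step for a 1D1V collisionless
    Boltzmann (Vlasov) solver: shift each velocity row of the phase-space
    density f[x, v] by v*dt in x on a periodic grid.  Nearest-cell
    shifting keeps the sketch short; production codes interpolate."""
    out = np.empty_like(f)
    for j, vj in enumerate(v):
        shift = int(round(vj * dt / dx))
        out[:, j] = np.roll(f[:, j], shift)
    return out

# 32 spatial cells x 16 velocity cells of a toy phase-space density:
rng = np.random.default_rng(0)
f = rng.random((32, 16))
f2 = advect_x(f, np.linspace(-1.0, 1.0, 16), dt=0.5, dx=0.1)
print(np.isclose(f2.sum(), f.sum()))  # mass conserved exactly → True
```

Since every row shifts independently, the only inter-processor traffic in this step is the periodic wrap-around, consistent with the abstract's observation that nearest-neighbor connectivity is not a significant obstacle.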

  12. Conditional Stochastic Models in Reduced Space: Towards Efficient Simulation of Tropical Cyclone Precipitation Patterns (United States)

    Dodov, B.


    Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (in terms of downscaled reanalysis 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs are then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure, and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models, chosen according to the TC characteristics at a given moment in time, is concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with a non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with the observed regional climate and visually indistinguishable from high-resolution NWP output. The framework is used to simulate a catalog of 10000 typhoon
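The "conditional samples from the covariance models" above rest on the standard Gaussian conditioning identities: given a joint covariance over the reduced PCA coefficients, the unknown coefficients of the next frame are sampled conditioned on the known ones. A sketch with toy values (the mean and covariance stand in for one entry of the paper's group-wise covariance dictionary):

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_sample(mu, cov, idx_known, x_known):
    """Draw the unknown coordinates of a jointly Gaussian vector given
    the known ones, via the standard Gaussian conditioning formulas:
    mu_c = mu_u + B C^-1 (x_k - mu_k),  cov_c = A - B C^-1 B^T."""
    n = len(mu)
    idx_unknown = [i for i in range(n) if i not in idx_known]
    A = cov[np.ix_(idx_unknown, idx_unknown)]
    B = cov[np.ix_(idx_unknown, idx_known)]
    C = cov[np.ix_(idx_known, idx_known)]
    mu_c = mu[idx_unknown] + B @ np.linalg.solve(C, x_known - mu[idx_known])
    cov_c = A - B @ np.linalg.solve(C, B.T)
    return rng.multivariate_normal(mu_c, cov_c)

# Condition the 2nd and 3rd PCA coefficients on an observed 1st coefficient:
mu = np.zeros(3)
cov = np.array([[1.0, 0.5, 0.2], [0.5, 1.0, 0.3], [0.2, 0.3, 1.0]])
print(conditional_sample(mu, cov, [0], np.array([1.0])).shape)  # → (2,)
```

Concatenating such draws frame by frame, with the covariance swapped out at group transitions, yields the continuous non-stationary sequence the abstract describes.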

  13. Selection of a Data Acquisition and Controls System Communications and Software Architecture for Johnson Space Center's Space Environment Simulation Laboratory Thermal and Vacuum Test Facilities (United States)

    Jordan, Eric A.


    Upgrade of the data acquisition and controls systems software at Johnson Space Center's Space Environment Simulation Laboratory (SESL) involved the definition, evaluation and selection of a system communication architecture and software components. A brief discussion of the background of the SESL and its data acquisition and controls systems provides a context for discussion of the requirements for each selection. Further context is provided by comparing the upgrades to these systems accomplished in the 1990s and in 2003, demonstrating the role that technological advances have played in their improvement. Both selections were similar in their three phases: 1) definition of requirements, 2) identification of candidate products and their evaluation and testing, and 3) selection by comparison of requirement fulfillment. The candidates for the communication architecture selection embraced several different methodologies, which are explained and contrasted. Requirements for this selection are presented and the selection process is described. Several candidates for the software component of the data acquisition and controls system are identified, requirements for evaluation and selection are presented, and the evaluation process is described.

  14. The Virtual Glovebox (VGX): An Immersive Simulation System for Training Astronauts to Perform Glovebox Experiments in Space (United States)

    Smith, Jeffrey D.; Dalton, Bonnie (Technical Monitor)


    The era of the International Space Station (ISS) has finally arrived, providing researchers on Earth a unique opportunity to study long-term effects of weightlessness and the space environment on structures, materials and living systems. Many of the physical, biological and material science experiments planned for ISS will require significant input and expertise from astronauts, who must conduct the research, follow complicated assay procedures, and collect data and samples in space. Containment is essential for much of this work, both to protect astronauts from potentially harmful biological, chemical or material elements in the experiments and to protect the experiments from contamination by airborne particles in the Space Station environment. When astronauts must open the hardware containing such experiments, glovebox facilities provide the necessary barrier between astronaut and experiment. On Earth, astronauts are faced with the demanding task of preparing for the many glovebox experiments they will perform in space. Only a short time can be devoted to training for each experimental task, and glovebox research accounts for only a small portion of overall training and mission objectives on any particular ISS mission. The quality of the research also must remain very high, requiring detailed experience and knowledge of instrumentation, anatomy and specific scientific objectives for those who will conduct the research. This unique set of needs faced by NASA has driven the development of a new computer simulation tool, the Virtual Glovebox (VGX), which is designed to provide astronaut crews and support personnel with a means to quickly and accurately prepare and train for glovebox experiments in space.

  15. Simulation of space charge effects in particle accelerators. Annual report, August 1, 1983-September 30, 1984

    International Nuclear Information System (INIS)

    Haber, I.


    Progress during the FY83/84 period has involved both the use of existing numerical tools to investigate current issues and the development of new techniques for future simulations of increasing sophistication. A balance has been sought with a view towards maximizing the utility of simulations for both present and future decisions in accelerator design. Emphasis during this contract has centered on investigating the nonlinear dynamics of a very low emittance beam with a realistic distribution function, especially when complications such as the image forces from a nearby conducting electrode are considered. A significant part of the effort during this period was also expended in disseminating the simulation capabilities already developed. Versions of the SHIFT (Simulation of Heavy Ion Fusion Transport) series of computer codes have been installed on machines available to the HIF community. The enhanced availability of these codes has facilitated their use outside of NRL. For example, simulation results with a significant impact on MBE design were obtained at LBL using the MFECC version of SHIFT-XY.

  16. Effect of simulated microgravity on growth and production of exopolymeric substances of Micrococcus luteus space and earth isolates. (United States)

    Mauclaire, Laurie; Egli, Marcel


    Microorganisms tend to form biofilms on surfaces, thereby causing deterioration of the underlying material. In addition, biofilm is a potential health risk to humans. Therefore, microbial growth is an issue not only on Earth but also in manned space habitats like the International Space Station (ISS). The aim of the study was to identify physiological processes relevant for Micrococcus luteus attachment under microgravity conditions. The results demonstrate that simulated microgravity influences physiological processes which trigger bacterial attachment and biofilm formation. The ISS strains produced larger amounts of exopolymeric substances (EPS) compared with a reference strain from Earth. In addition, M. luteus strains grew faster, and both Earth and ISS isolates produced a higher yield of biomass under microgravity conditions than under normal gravity. Furthermore, microgravity caused a reduction of the colloidal EPS production of ISS isolates in comparison with normal gravity, which probably influences biofilm thickness and stability as well.

  17. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System (United States)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant


    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
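A discrete event simulation of this kind can be sketched in a few lines: each countdown attempt draws from shared commodity tanks, and a scrub imposes a 48-hour turnaround before the next attempt. The quantities, scrub probability, and structure below are illustrative assumptions, not values from the study's validated Arena model:

```python
import random

def launch_campaign(n_attempts=3, seed=42):
    """Toy discrete-event view of a launch campaign: each countdown draws
    LH2/LOX from shared tanks; a scrub imposes a 48 h turnaround before
    the next attempt.  All quantities and the scrub probability are
    illustrative assumptions."""
    rng = random.Random(seed)
    tanks = {"LH2": 300.0, "LOX": 600.0}       # arbitrary commodity units
    t, log = 0.0, []
    for attempt in range(1, n_attempts + 1):
        tanks["LH2"] -= 100.0                  # propellant loaded per countdown
        tanks["LOX"] -= 200.0
        if rng.random() < 0.3:                 # countdown scrubs
            log.append((t, attempt, "scrub", dict(tanks)))
            t += 48.0                          # scrub turnaround, hours
        else:
            log.append((t, attempt, "launch", dict(tanks)))
            break
    return log

for event in launch_campaign():
    print(event)
```

Replicating such a run many times with random scrub outcomes and consumption rates is what lets an integrated model expose interactions between commodities that deterministic per-commodity analyses miss.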

  18. Automorphosis of higher plants in space is simulated by using a 3-dimensional clinostat or by application of chemicals (United States)

    Miyamoto, K.; Hoshino, T.; Hitotsubashi, R.; Yamashita, M.; Ueda, J.

    In STS-95 space experiments, etiolated pea seedlings grown under microgravity conditions in space were shown to exhibit automorphosis. Epicotyls were almost straight, but most were oriented away from their cotyledons at ca. 45 degrees from the vertical line, as compared with those grown on Earth. In order to understand how microgravity conditions in space induce automorphosis, we introduced simulated microgravity conditions on a 3-dimensional clinostat, resulting in the successful induction of automorphosis-like growth and development. Kinetic studies revealed that epicotyls bent at their basal region, or near the cotyledonary node, away from the cotyledons at about 45 degrees, in seedlings grown both at 1 g and under simulated microgravity on the clinostat, within 48 hrs after watering. Thereafter, epicotyls kept this orientation under simulated microgravity on the clinostat, whereas those grown at 1 g returned to the vertical through a negative gravitropic response. Automorphosis-like growth and development was induced by the application of auxin polar transport inhibitors (2,3,5-triiodobenzoic acid, N-(1-naphthyl)phthalamic acid, 9-hydroxyfluorene-9-carboxylic acid), but not by an anti-auxin, p-chlorophenoxyisobutyric acid. Automorphosis-like epicotyl bending was also phenocopied by the application of inhibitors of stretch-activated channels, LaCl3 and GdCl3, and by the application of an inhibitor of protein kinase, cantharidin. These results suggest that automorphosis-like growth in epicotyls of etiolated pea seedlings is due to suppression of the negative gravitropic response at 1 g, and that the growth and development of etiolated pea seedlings under 1 g conditions require normal activities of auxin polar transport and of the calcium channel-related gravisensing system. Possible mechanisms of perception and transduction of gravity signals to induce automorphosis are discussed.

  19. Design of the Experimental Exposure Conditions to Simulate Ionizing Radiation Effects on Candidate Replacement Materials for the Hubble Space Telescope (United States)

    Smith, L. Montgomery


    In this effort, experimental exposure times for monoenergetic electrons and protons were determined to simulate the space radiation environment effects on Teflon components of the Hubble Space Telescope. Although the energy range of the available laboratory particle accelerators was limited, optimal exposure times for 50 keV, 220 keV, 350 keV, and 500 keV electrons were calculated that produced a dose-versus-depth profile approximating the full-spectrum profile and were realizable with existing equipment. For the case of proton exposure, the limited energy range of the laboratory accelerator restricted simulation of the dose to a depth of 0.5 mil. Also, while optimal exposure times were found for 200 keV, 500 keV and 700 keV protons that simulated the full-spectrum dose-versus-depth profile to this depth, they were of such short duration that the existing laboratory equipment could not be controlled to within the required accuracy. In addition to the obvious experimental issues, other areas exist in which the analytical work could be advanced. Improved computer codes for dose prediction, along with improved methodology for data input and output, would accelerate the calculations and make them more accurate. This is particularly true in the case of proton fluxes, where a paucity of available predictive software appears to exist. The dated nature of many of the existing Monte Carlo particle/radiation transport codes raises the issue of whether existing codes are sufficient for this type of analysis. Other areas that would result in greater fidelity of laboratory exposure effects to the space environment are the use of a larger number of monoenergetic particle fluxes and improved optimization algorithms to determine the weighting values.
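The weighting problem described above, choosing exposure times for a few monoenergetic beams so that their summed dose-versus-depth profiles match the full-spectrum profile, is a small constrained least-squares fit. A sketch with made-up exponential profiles (a real fit would use measured or computed dose-depth curves and a proper nonnegative least-squares solver):

```python
import numpy as np

def exposure_weights(mono_profiles, target_profile):
    """Find nonnegative weights w so that sum_i w_i * D_i(depth)
    approximates the full-spectrum dose-versus-depth profile; the
    weights then scale directly into exposure times.  Plain least
    squares with a clip is used for brevity; a production fit would
    use a true NNLS solver."""
    A = np.column_stack(mono_profiles)   # one column per beam energy
    w, *_ = np.linalg.lstsq(A, target_profile, rcond=None)
    return np.clip(w, 0.0, None)

# Two hypothetical monoenergetic dose-depth curves and a target that is
# an exact mixture of them, so the recovered weights are (2, 3):
z = np.linspace(0.0, 1.0, 50)
d1, d2 = np.exp(-z / 0.2), np.exp(-z / 0.5)
print(np.round(exposure_weights([d1, d2], 2 * d1 + 3 * d2), 6))  # → [2. 3.]
```

Using more monoenergetic fluxes adds columns to the design matrix, which is why the abstract notes that a larger number of beam energies and better optimization algorithms would improve the fidelity of the laboratory approximation.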

  20. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G


    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The distributed analysis environment (DIANE) is an R&D study, conducted at CERN, focusing on semi-interactive parallel and remote data analysis and simulation. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs, and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations, such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...