WorldWideScience

Sample records for twenty-first space simulation

  1. The twenty-first century in space

    CERN Document Server

    Evans, Ben

    2015-01-01

    This final entry in the History of Human Space Exploration mini-series by Ben Evans continues with an in-depth look at the latter part of the 20th century and the start of the new millennium. Picking up where Partnership in Space left off, the story commemorating the evolution of manned space exploration unfolds in further detail. More than fifty years after Yuri Gagarin’s pioneering journey into space, Evans extends his overview of how that momentous voyage continued through the decades which followed. The Twenty-first Century in Space, the sixth book in the series, explores how the fledgling partnership between the United States and Russia in the 1990s gradually bore fruit and laid the groundwork for today’s International Space Station. The narrative follows the convergence of the Shuttle and Mir programs, together with standalone missions, including servicing the Hubble Space Telescope, many of whose technical and human lessons enabled the first efforts to build the ISS in orbit. The book also looks to...

  2. The twenty-first century commercial space imperative

    CERN Document Server

    Young, Anthony

    2015-01-01

    Young addresses the impressive expansion across existing and developing commercial space business markets, with multiple private companies competing in the payload launch services sector. The author pinpoints the new markets, technologies, and players in the industry, as well as highlighting the overall reasons why it is important for us to develop space. NASA now relies on commercial partners to supply cargo and crew spacecraft and services to and from the International Space Station. The sizes of satellites are diminishing and their capabilities expanding, while costs to orbit are decreasing. Suborbital space tourism holds the potential of new industries and jobs. Commercial space exploration of the Moon and the planets also holds promise. All this activity is a catalyst for anyone interested in joining the developing space industry, from students and researchers to engineers and entrepreneurs. As more and more satellites and rockets are launched and the business of space is expanding at a signifi...

  3. Automation and robotics for Space Station in the twenty-first century

    Science.gov (United States)

    Willshire, K. F.; Pivirotto, D. L.

    1986-01-01

    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.

  4. Twenty-first century Arctic climate change in the CCSM3 IPCC scenario simulations

    Energy Technology Data Exchange (ETDEWEB)

    Teng, Haiyan; Washington, Warren M.; Meehl, Gerald A.; Buja, Lawrence E.; Strand, Gary W. [National Center for Atmospheric Research, Boulder, CO (United States)

    2006-05-15

    Arctic climate change in the twenty-first century is simulated by the Community Climate System Model version 3.0 (CCSM3). The simulations from three emission scenarios (A2, A1B and B1) are analyzed using eight (A1B and B1) or five (A2) ensemble members. The model simulates a reasonable present-day climate and historical climate trend. The model projects a decline of sea-ice extent in the range of 1.4-3.9% per decade in winter and 4.8-22.2% per decade in summer, corresponding to the range of forcings that span the scenarios. At the end of the twenty-first century, the winter and summer Arctic mean surface air temperatures increase by 4-14 °C and 0.7-5 °C, respectively (B1 to A2), relative to the end of the twentieth century. The Arctic becomes ice-free during summer at the end of the twenty-first century in the A2 scenario. Similar to the observations, the Arctic Oscillation (AO) is the dominant factor in explaining the variability of the atmosphere and sea ice in the 1870-1999 historical runs. The AO shifts to the positive phase in response to greenhouse gas forcings in the twenty-first century, but the simulated trends in both Arctic mean sea-level pressure and the AO index are smaller than what has been observed. The twenty-first century Arctic warming mainly results from the radiative forcing of greenhouse gases. The first empirical orthogonal function (explaining 51.7-72.2% of the total variance) of the wintertime surface air temperature during 1870-2099 is characterized by a strong warming trend and a "polar amplification"-type spatial pattern. The AO, which plays a secondary role, contributes less than 10% of the total variance in both surface temperature and sea-ice concentration. (orig.)
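    The leading EOF and its explained-variance fraction described above are, computationally, the first singular vector of the anomaly field and its normalized squared singular value. A minimal sketch of that calculation, using synthetic data as a stand-in for the CCSM3 wintertime surface air temperature field (all array shapes and values are illustrative assumptions, not the paper's):

```python
import numpy as np

# Synthetic stand-in for a (time, space) anomaly field:
# 230 "years" (1870-2099) at 500 grid points.
rng = np.random.default_rng(0)
data = rng.standard_normal((230, 500))
data += np.linspace(0.0, 3.0, 230)[:, None]  # superimpose a warming-like trend

# Remove the time mean at each grid point to form anomalies.
anom = data - data.mean(axis=0)

# SVD of the anomaly matrix: rows of Vt are the EOF spatial patterns,
# U * s gives the corresponding principal-component time series.
U, s, Vt = np.linalg.svd(anom, full_matrices=False)

# Fraction of total variance explained by each EOF (singular values
# are returned in descending order, so index 0 is the leading EOF).
explained = s**2 / np.sum(s**2)
print(f"EOF1 explains {explained[0]:.1%} of the total variance")
```

    The sign of an EOF is arbitrary; only the spatial pattern and its explained-variance fraction carry physical meaning.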

  5. Space Science in the Twenty-First Century: Imperatives for the Decades 1995 to 2015. Overview

    Science.gov (United States)

    1988-01-01

    The opportunities for space science in the period from 1995 to 2015 are discussed. Progress in the six disciplines of space science (the planet Earth; planetary and lunar exploration; solar system space physics; astronomy and astrophysics; fundamental physics and chemistry; and life sciences) is reviewed. The prospects for major achievements by 1995 from missions already underway or awaiting new starts are included. A set of long-range goals for these disciplines is presented for the first two decades of the twenty-first century. Broad themes for future scientific pursuits are presented, and some examples of high-priority missions for the turn of the century are highlighted. A few recommendations are cited for each discipline to suggest how these themes might be developed.

  6. A twenty-first century perspective. [NASA space communication infrastructure to support space missions

    Science.gov (United States)

    Aller, Robert O.; Miller, Albert

    1990-01-01

    The status of the NASA assets which are operated by the Office of Space Operations is briefly reviewed. These assets include the ground network, the space network, and communications and data handling facilities. The current plans for each element are examined, and a projection of each is made to meet the user needs in the 21st century. The following factors are noted: increasingly responsive support will be required by the users; operational support concepts must be cost-effective to serve future missions; and a high degree of system reliability and availability will be required to support manned exploration and increasingly complex missions.

  7. CLARREO shortwave observing system simulation experiments of the twenty-first century: Simulator design and implementation

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, D.R.; Algieri, C.A.; Ong, J.R.; Collins, W.D.

    2011-04-01

    Projected changes in the Earth system will likely be manifested in changes in reflected solar radiation. This paper introduces an operational Observing System Simulation Experiment (OSSE) to calculate the signals of future climate forcings and feedbacks in top-of-atmosphere reflectance spectra. The OSSE combines simulations from the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report for the NCAR Community Climate System Model (CCSM) with the MODTRAN radiative transfer code to calculate reflectance spectra for simulations of current and future climatic conditions over the 21st century. The OSSE produces narrowband reflectances and broadband fluxes, the latter of which have been extensively validated against archived CCSM results. The shortwave reflectance spectra contain atmospheric features including signals from water vapor, liquid and ice clouds, and aerosols. The spectra are also strongly influenced by the surface bidirectional reflectance properties of predicted snow and sea ice and the climatological seasonal cycles of vegetation. By comparing and contrasting simulated reflectance spectra based on emissions scenarios with increasing projected and fixed present-day greenhouse gas and aerosol concentrations, we find that prescribed forcings from increases in anthropogenic sulfate and carbonaceous aerosols are detectable and are spatially confined to lower latitudes. Also, changes in the intertropical convergence zone and poleward shifts in the subsidence zones and the storm tracks are all detectable, along with large changes in snow cover and sea ice fraction. These findings suggest that the proposed NASA Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission to measure shortwave reflectance spectra may help elucidate climate forcings, responses, and feedbacks.

  8. Simulating care: technology-mediated learning in twenty-first century nursing education.

    Science.gov (United States)

    Diener, Elizabeth; Hobbs, Nelda

    2012-01-01

    The increased reliance on simulation classrooms has proven successful for learning skills. Questions persist concerning the ability of technology-driven robotic devices to form and cultivate caring behaviors, or to sufficiently develop the interactive nurse-client communication necessary in the context of nursing. This article examines the disconnects created by the use of simulation technology in nursing education, raising the question: "Can learning of caring-as-being be facilitated in simulation classrooms?" We propose that unless time is spent with human beings in the earliest stages of nursing education, transpersonal caring relationships do not have space to develop. Learning, crafting, and maturation of caring behaviors threatens to become a serendipitous event, or caring is no longer perceived as an essential characteristic of nursing. Technology does not negate caring; the isolation it fosters makes transpersonal caring all the more important. We are called to create a new paradigm for nursing education that merges Nightingale's vision with technology's promise. © 2012 Wiley Periodicals, Inc.

  9. Space science to the twenty-first century and the technological implications for implementation

    Science.gov (United States)

    Herman, D. H.

    1979-01-01

    The paper presents the specific plan for NASA space science missions to the 21st century and highlights the major technological advances that must be effected to accomplish the planned missions. Separate consideration is given to plans for astrophysics, planetary exploration, the solar terrestrial area, and life sciences. The technological consequences of the plans in these separate areas are discussed.

  10. Twenty-first century quantum mechanics Hilbert space to quantum computers mathematical methods and conceptual foundations

    CERN Document Server

    Fano, Guido

    2017-01-01

    This book is designed to make accessible to nonspecialists the still evolving concepts of quantum mechanics and the terminology in which these are expressed. The opening chapters summarize elementary concepts of twentieth century quantum mechanics and describe the mathematical methods employed in the field, with clear explanation of, for example, Hilbert space, complex variables, complex vector spaces and Dirac notation, and the Heisenberg uncertainty principle. After detailed discussion of the Schrödinger equation, subsequent chapters focus on isotropic vectors, used to construct spinors, and on conceptual problems associated with measurement, superposition, and decoherence in quantum systems. Here, due attention is paid to Bell’s inequality and the possible existence of hidden variables. Finally, progression toward quantum computation is examined in detail: if quantum computers can be made practicable, enormous enhancements in computing power, artificial intelligence, and secure communication will result...

  11. The U.S. Air Force in Space 1945 to the Twenty-first Century

    Science.gov (United States)

    1998-01-01


  12. Space Station user traffic model analysis for mission payload servicing into the twenty-first century

    Science.gov (United States)

    Gould, G. J.

    1986-01-01

    The Space Station-based Customer Servicing Facility service bay requirements for accommodating the Initial Orbit Capability (IOC) and far-term Station Accommodation Test Sets (SATS) missions are analyzed using the developed mission traffic model. Analysis results are presented which indicate that one servicing bay will be sufficient to accommodate IOC customer servicing requirements. Growth servicing requirements indicate that an additional servicing bay will be needed to accommodate the far-term SATS mission payloads. Even though the level of total mission accommodation is below 100 percent for one bay at IOC and two bays during growth operations, the levels are such that operational work-arounds exist, so that additional servicing bays will not be required.

  14. Climate simulation of the twenty-first century with interactive land-use changes

    Energy Technology Data Exchange (ETDEWEB)

    Voldoire, Aurore; Royer, Jean-Francois; Chauvin, Fabrice [GAME/CNRM (Meteo-France, CNRS), Toulouse (France); Eickhout, Bas [Netherlands Environmental Assessment Agency, Bilthoven (Netherlands); Schaeffer, Michiel [Wageningen University, Wageningen (Netherlands)

    2007-08-15

    To include land-use dynamics in a general circulation model (GCM), the physical system has to be linked to a system that represents the socio-economy. This issue is addressed by coupling an integrated assessment model, IMAGE2.2, to an ocean-atmosphere GCM, CNRM-CM3. In the new system, IMAGE2.2 provides CNRM-CM3 with all the external forcings that are scenario dependent: greenhouse gas (GHG) concentrations, sulfate aerosol loading and land cover. Conversely, the GCM gives IMAGE changes in mean temperature and precipitation. With this new system, we have run an adapted scenario of the IPCC SRES scenario family. We have chosen a single scenario with maximum land-use changes (SRES A2) to illustrate some important feedback issues. Even in this two-way coupled model set-up, land use in this scenario is mainly driven by demography and agricultural practices, which overpowers any potential influence of climate feedbacks on land-use patterns. This suggests that for scenarios in which socio-economically driven land-use change is very large, land-use changes can be incorporated in GCM simulations as a one-way driving force, without taking climate feedbacks into account. The dynamics of natural vegetation is more closely linked to climate, but the time-scale of changes is of the order of a century. Thus, the coupling between natural vegetation and climate could generate important feedbacks, but these effects are relevant mainly for multi-centennial simulations. (orig.)

  15. Simulating changes in the leaf unfolding time of 20 plant species in China over the twenty-first century

    Science.gov (United States)

    Ge, Quansheng; Wang, Huanjiong; Dai, Junhu

    2014-05-01

    Recent shifts in phenology reflect the biological response to current climate change. Aiming to enhance our understanding of phenological responses to climate change, we developed, calibrated and validated spatio-temporal models of first leaf date (FLD) for 20 broadleaved deciduous plants in China. Using daily meteorological data from the Chinese Meteorological Administration and Community Climate System Model, version 3 (CCSM3) output for three IPCC scenarios (A2, A1B and B1), we described the FLD time series of each species over the past 50 years, extrapolating from these results to simulate estimated FLD changes for each species during the twenty-first century. Model validation suggests that our spatio-temporal models can simulate FLD accurately, with R² (explained variance) > 0.60. Model simulations show that, from 1952 to 2007, the FLD in China advanced at a rate of -1.14 days per decade on average. Furthermore, changes in FLD showed noticeable variation between regions, with clearer advances observed in the north than in the south of the country. The model indicates that the advances in FLD observed from 1952-2007 in China will continue over the twenty-first century, although significant differences among species and climate scenarios are expected. The average trend of FLD advance in China during the twenty-first century is modeled as -1.92 days per decade under the A2 scenario, -1.10 days per decade under the A1B scenario and -0.74 days per decade under the B1 scenario. The spatial pattern of FLD change for the period 2011-2099 is modeled as broadly similar to, though somewhat different from, that of the 1952-2007 period. At the interspecific level, early-leafing species were found to show a greater advance in FLD, while species with larger distributions tended to show a weaker advance in FLD. These simulated changes in phenology may have significant implications for plant distribution as well as ecosystem structure and function.
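    A trend quoted in days per decade, as above, is simply ten times the slope of a least-squares linear fit to the annual FLD series. A minimal sketch with a synthetic series (the imposed rate and noise level are illustrative assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1952, 2008)

# Synthetic first-leaf-date series (day of year): an imposed advance of
# -1.14 days per decade plus interannual noise.
fld = 100.0 - 0.114 * (years - years[0]) + rng.normal(0.0, 2.0, years.size)

# Least-squares linear fit; the slope comes out in days per year.
slope_per_year, intercept = np.polyfit(years, fld, 1)
trend_per_decade = 10.0 * slope_per_year
print(f"FLD trend: {trend_per_decade:+.2f} days per decade")
```

    With realistic interannual noise the fitted rate scatters around the imposed one, which is why trend estimates over short records carry sizeable uncertainty.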

  16. New opportunities in space; Proceedings of the twenty-first space congress, Cocoa Beach, FL, April 24-26, 1984

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Various papers on space technology are presented. The general topics discussed include: international aerospace programs, machines to augment man, space communications, flight and ground operations, space station technology, innovative technology applications, future space transportation and missions, STS flight experiments, and commercialization of space. The wide scope of technologies that contribute to today's successes in space and point the way to future operations is emphasized.

  17. Evaluation of the twenty-first century RCM simulations driven by multiple GCMs over the Eastern Mediterranean-Black Sea region

    Science.gov (United States)

    Önol, Barış; Bozkurt, Deniz; Turuncoglu, Ufuk Utku; Sen, Omer Lutfi; Dalfes, H. Nuzhet

    2014-04-01

    In this study, human-induced climate change over the Eastern Mediterranean-Black Sea region has been analyzed for the twenty-first century by performing regional climate model simulations forced with large-scale fields from three different global circulation models (GCMs). Climate projections have been produced with the Special Report on Emissions Scenarios A2, A1FI and B1 scenarios, which provide greater diversity in climate information for the future period. Gradual temperature increases are widely apparent during the twenty-first century in each scenario simulation, but the ECHAM5-driven simulation generally has a weaker signal in all seasons than the CCSM3 simulations, except over the Fertile Crescent. The contrast in future temperature change between the winter and summer seasons is very strong, 4-5 °C, in the CCSM3-A2-driven and HadCM3-A2-driven simulations over the Carpathians and Balkans. In addition, winter runoff over the mountainous region of Turkey, which feeds many river systems including the Euphrates and Tigris, increases in the second half of the century as the snowmelt process accelerates where the elevation is higher than 1,500 m. Moreover, analysis of daily temperature outputs reveals a gradual decrease in daily minimum temperature variability for January during the twenty-first century over the Carpathians and Balkans. Analysis of daily precipitation extremes shows a clear positive trend during the last two decades of the twenty-first century over the Carpathians in both the CCSM3-driven and ECHAM5-driven simulations. Multiple-GCM-driven regional climate simulations help quantify the range of climate change over a region by allowing detailed comparisons between the simulations.

  18. Transient climate change scenario simulation of the Mediterranean Sea for the twenty-first century using a high-resolution ocean circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Somot, S.; Sevault, F.; Deque, M. [Centre National de Recherches Meteorologiques, Meteo-France, Toulouse cedex 1 (France)

    2006-12-15

    A twenty-first century scenario simulation of the Mediterranean Sea is performed based on an ocean modelling approach. A climate change IPCC-A2 scenario run with an atmospheric regional climate model is used to force a high-resolution Mediterranean Sea ocean model over the 1960-2099 period. For comparison, a control simulation of the same length as the scenario has also been carried out under present-climate fluxes. This control run shows air-sea fluxes in agreement with observations, stable temperature and salinity characteristics, and a realistic thermohaline circulation simulating the different intermediate and deep water masses described in the literature. During the scenario, warming and saltening are simulated for the surface (+3.1 °C and +0.48 psu for the Mediterranean Sea at the end of the twenty-first century) and for the deeper layers (+1.5 °C and +0.23 psu on average). These simulated trends are in agreement with observed trends for the Mediterranean Sea over the last decades. In addition, the Mediterranean thermohaline circulation (MTHC) is strongly weakened at the end of the twenty-first century. This behaviour is mainly due to the decrease in surface density and hence in winter deep-water formation. At the end of the twenty-first century, the MTHC weakening can be evaluated as -40% for the intermediate waters and -80% for the deep circulation with respect to present-climate conditions. The characteristics of the Mediterranean Outflow Waters flowing into the Atlantic Ocean are also strongly influenced during the scenario. (orig.)

  19. Space Science in the Twenty-First Century: Imperatives for the Decades 1995 to 2015. Mission to Planet Earth

    Science.gov (United States)

    1988-01-01

    A unified program is outlined for studying the Earth, from its deep interior to its fluid envelopes. A system of measuring devices is proposed, involving both space-based and in-situ observations, that can simultaneously accommodate a large range of scientific needs. The scientific objectives served by this integrated infrastructure are cast into a framework of four grand themes. In summary these are: to determine the composition, structure, dynamics, and evolution of the Earth's crust and deeper interior; to establish and understand the structure, dynamics, and chemistry of the oceans, atmosphere, and cryosphere, and their interaction with the solid Earth; to characterize the history and dynamics of living organisms and their interaction with the environment; and to monitor and understand the interaction of human activities with the natural environment. A focus on these grand themes will help to understand the origin and fate of the planet, and to place it in the context of the solar system.

  20. European climate in the late twenty-first century: regional simulations with two driving global models and two forcing scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Raeisaenen, J. [Division of Atmospheric Sciences, Department of Physical Sciences, University of Helsinki, Gustaf Haellstroemin katu 2, PO Box 64, 00014, Helsinki (Finland); Rossby Centre, Swedish Meteorological and Hydrological Institute, 60176, Norrkoeping (Sweden); Hansson, U.; Ullerstig, A.; Doescher, R.; Graham, L.P.; Jones, C.; Meier, H.E.M.; Samuelsson, P.; Willen, U. [Rossby Centre, Swedish Meteorological and Hydrological Institute, 60176, Norrkoeping (Sweden)

    2004-01-01

    A basic analysis is presented for a series of regional climate change simulations that were conducted by the Swedish Rossby Centre and contribute to the PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects) project. For each of the two driving global models, HadAM3H and ECHAM4/OPYC3, a 30-year control run and two 30-year scenario runs (based on the SRES A2 and B2 emission scenarios) were made with the regional model. In this way, four realizations of climate change from 1961-1990 to 2071-2100 were obtained. The simulated changes are larger for the A2 than for the B2 scenario (although with few qualitative differences) and, in most cases, larger in the ECHAM4/OPYC3-driven (RE) than in the HadAM3H-driven (RH) regional simulations. In all the scenario runs, the warming in northern Europe is largest in winter or late autumn. In central and southern Europe, the warming peaks in summer, when it locally reaches 10 °C in the RE-A2 simulation and 6-7 °C in the RH-A2 and RE-B2 simulations. The four simulations agree on a general increase in precipitation in northern Europe, especially in winter, and on a general decrease in precipitation in southern and central Europe in summer, but the magnitude and the geographical patterns of the change differ markedly between RH and RE. This reflects very different changes in the atmospheric circulation during the winter half-year, which also lead to quite different simulated changes in windiness. All four simulations show a large increase in the lowest minimum temperatures in northern, central and eastern Europe, most likely due to reduced snow cover. Extreme daily precipitation increases even in most of those areas where the mean annual precipitation decreases. (orig.)

  1. A transient climate change simulation with greenhouse gas and aerosol forcing: projected climate to the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Boer, G.J.; Flato, G.; Ramsden, D. [Canadian Centre for Climate Modelling and Analysis, Victoria, BC (Canada)

    2000-06-01

    The potential climatic consequences of increasing atmospheric greenhouse gas (GHG) concentration and sulfate aerosol loading are investigated for the years 1900 to 2100 based on five simulations with the CCCma coupled climate model. The five simulations comprise a control experiment without change in GHG or aerosol amount, three independent simulations with increasing GHG and aerosol forcing, and a simulation with increasing GHG forcing only. Climate warming accelerates from the present, with global mean temperatures simulated to increase by 1.7 °C to the year 2050 and by a further 2.7 °C by the year 2100. The warming is nonuniform as to hemisphere, season, and underlying surface. Changes in interannual variability of temperature show considerable structure and seasonal dependence. The effect of the comparatively localized negative radiative forcing associated with the aerosol is to retard and reduce the warming by about 0.9 °C at 2050 and 1.2 °C at 2100. Its primary effect on temperature is to counteract the global pattern of GHG-induced warming and only secondarily to affect local temperatures, suggesting that the first-order transient climate response of the system is determined by feedback processes and only secondarily by the local pattern of radiative forcing. The warming is accompanied by a more active hydrological cycle, with increases in precipitation and evaporation rates that are delayed by comparison with temperature increases. There is an "El Niño-like" shift in precipitation and an overall increase in the interannual variability of precipitation. The effect of the aerosol forcing is again primarily to delay and counteract the GHG-induced increase. Decreases in soil moisture are common but regionally dependent, and interannual variability changes show considerable structure. (orig.)

  2. An ENSO stability analysis. Part II: results from the twentieth and twenty-first century simulations of the CMIP3 models

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seon Tae [University of Hawaii at Manoa, Department of Meteorology, Honolulu, HI (United States); University of California, Department of Earth System Science, Irvine, CA (United States); Jin, Fei-Fei [University of Hawaii at Manoa, Department of Meteorology, Honolulu, HI (United States)

    2011-04-15

    In this study, a Bjerknes stability (BJ) index, proposed by Jin et al. (2006), is adopted to assess the overall stability of the El Niño-Southern Oscillation (ENSO) in state-of-the-art coupled models. The twentieth and twenty-first century simulations of 12 coupled models from the Coupled Model Intercomparison Project phase 3 (CMIP3), used in the Intergovernmental Panel on Climate Change Fourth Assessment Report, demonstrate a significant positive correlation between ENSO amplitude and ENSO stability as measured by the BJ index. The simulations also show a diversity of behavior regarding ENSO stability among the coupled models, which can be attributed to model-to-model differences in the mean state and in the sensitivity of the oceanic and atmospheric responses to wind and SST forcing. When the respective components of the BJ index obtained from the coupled models are compared with those from observations, it is revealed that most coupled models underestimate the thermodynamic damping effect and the positive effects of the zonal advective and thermocline feedbacks. Under the warmer climate induced by increased CO2, changes, relative to the twentieth century simulations, in the damping and feedback terms responsible for the ENSO stability measured by the BJ index can be linked to mean state changes and to associated changes in the sensitivity of the atmospheric and oceanic responses. There is a clear multi-model trend in the damping terms and in the positive zonal advective, thermocline, and Ekman feedback terms under enhanced greenhouse gas conditions. However, the varied behavior among the coupled models in the competition between the positive feedback and negative damping terms in the BJ index formula prevents a definitive conclusion regarding future projections of ENSO stability using the current coupled models. (orig.)
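    The reported amplitude-stability relationship can be illustrated with a toy cross-model calculation: ENSO amplitude is taken as the standard deviation of each model's Niño-3 SST anomaly series and correlated across the ensemble against a stability index. The BJ index values below are synthetic stand-ins constructed to covary with amplitude; all numbers are illustrative assumptions, not CMIP3 results:

```python
import numpy as np

rng = np.random.default_rng(2)
n_models = 12

# Synthetic stand-in: one monthly Nino-3 SST anomaly series per model,
# with model-to-model differences in ENSO amplitude (degC, illustrative).
true_amp = rng.uniform(0.5, 1.5, n_models)
nino3 = true_amp[:, None] * rng.standard_normal((n_models, 600))

# ENSO amplitude per model: standard deviation of its anomaly series.
amplitude = nino3.std(axis=1)

# Synthetic per-model "BJ index": constructed so that less-damped
# (larger-index) models have larger amplitude, plus scatter.
bj_index = 0.8 * true_amp - 1.5 + rng.normal(0.0, 0.1, n_models)

# Cross-model correlation between amplitude and the stability index.
r = np.corrcoef(amplitude, bj_index)[0, 1]
print(f"amplitude-BJ correlation across {n_models} models: r = {r:.2f}")
```

    With only a dozen ensemble members, such a correlation carries wide sampling uncertainty, which is one reason multi-model spread complicates definitive conclusions.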

  3. Twenty-first century vaccines

    Science.gov (United States)

    Rappuoli, Rino

    2011-01-01

    In the twentieth century, vaccination has been possibly the greatest revolution in health. Together with hygiene and antibiotics, vaccination led to the elimination of many childhood infectious diseases and contributed to the increase in disability-free life expectancy that in Western societies rose from 50 to 78–85 years (Crimmins, E. M. & Finch, C. E. 2006 Proc. Natl Acad. Sci. USA 103, 498–503; Kirkwood, T. B. 2008 Nat. Med 10, 1177–1185). In the twenty-first century, vaccination will be expected to eliminate the remaining childhood infectious diseases, such as meningococcal meningitis, respiratory syncytial virus, group A streptococcus, and will address the health challenges of this century such as those associated with ageing, antibiotic resistance, emerging infectious diseases and poverty. However, for this to happen, we need to increase the public trust in vaccination so that vaccines can be perceived as the best insurance against most diseases across all ages. PMID:21893537

  4. Twenty-first Century Space Science in The Urban High School Setting: The NASA/John Dewey High School Educational Outreach Partnership

    Science.gov (United States)

    Fried, B.; Levy, M.; Reyes, C.; Austin, S.

    2003-05-01

    A unique and innovative partnership has recently developed between NASA and John Dewey High School, infusing Space Science into the curriculum. This partnership builds on an existing relationship with MUSPIN/NASA and their regional center at the City University of New York based at Medgar Evers College. As an outgrowth of the success and popularity of our Remote Sensing Research Program, sponsored by the New York State Committee for the Advancement of Technology Education (NYSCATE), and the National Science Foundation and stimulated by MUSPIN-based faculty development workshops, our science department has branched out in a new direction - the establishment of a Space Science Academy. John Dewey High School, located in Brooklyn, New York, is an innovative inner city public school with students of a diverse multi-ethnic population and a variety of economic backgrounds. Students were recruited from this broad spectrum, which covers the range of learning styles and academic achievement. This collaboration includes students of high, average, and below average academic levels, emphasizing participation of students with learning disabilities. In this classroom without walls, students apply the strategies and methodologies of problem-based learning in solving complicated tasks. The cooperative learning approach simulates the NASA method of problem solving, as students work in teams, share research and results. Students learn to recognize the complexity of certain tasks as they apply Earth Science, Mathematics, Physics, Technology and Engineering to design solutions. Their path very much follows the NASA model as they design and build various devices. Our Space Science curriculum presently consists of a one-year sequence of elective classes taken in conjunction with Regents-level science classes. This sequence consists of Remote Sensing, Planetology, Mission to Mars (NASA sponsored research program), and Microbiology, where future projects will be astronomy related. This

  5. A Study on the Commercialization of Space-Based Remote Sensing in the Twenty-First Century and Its Implications to United States National Security

    Science.gov (United States)

    2011-06-01

    Japan’s JERS and India’s IRS-1C and 1D satellites are major remote sensing programs (O’Connell, 2001). According to Figure 2, the commercial remote... Sensing of the Earth From Outer Space; Yin, Lijie (2008). Encryption Techniques for Remote Sensing Images Based on EZW and Chaos, from The 9th

  6. Me acuerdo… ¿Te acuerdas?: Memory, Space and the Individualizing Transformation of the Subject in Twenty-First-Century Mexican Fiction

    OpenAIRE

    Elsa Treviño Ramírez

    2014-01-01

    During the second half of the twentieth century, Mexican fictions operated under a revisionist historical logic that employed national spaces to allegorize the relationship between the individual, society and the nation. Countering this trend, since the mid-nineties, Mexican literature has witnessed a departure from an interest in collectivizing discourses of identity, displaying instead a growing faith in individualism as a means to resist state-driven cultural visions. To analyze this empha...

  7. Capital in the Twenty-First Century

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2014-01-01

    Review essay on: Capital in the Twenty-First Century. By Thomas Piketty. Translated by Arthur Goldhammer. Cambridge, Mass.: The Belknap Press of Harvard University Press, 2014. viii + 685 pp.

  8. Me acuerdo… ¿Te acuerdas?: Memory, Space and the Individualizing Transformation of the Subject in Twenty-First-Century Mexican Fiction

    Directory of Open Access Journals (Sweden)

    Elsa Treviño Ramírez

    2014-12-01

    During the second half of the twentieth century, Mexican fictions operated under a revisionist historical logic that employed national spaces to allegorize the relationship between the individual, society and the nation. Countering this trend, since the mid-nineties, Mexican literature has witnessed a departure from an interest in collectivizing discourses of identity, displaying instead a growing faith in individualism as a means to resist state-driven cultural visions. To analyze this emphasis on individual emergence, this paper proposes a comparative reading of subject-formation in Álvaro Enrigue's Vidas perpendiculares (2008) and in José Emilio Pacheco’s canonical novella Las batallas en el desierto (1981). The publication of Vidas and Las batallas coincides with two moments of crisis and transformation in Mexico. Consequently, these novels of formation reflect the reconceptualization of the multiple relations between individuals, communities, and the state prompted by such changes. These coming-of-age fictions use the personal recollections of their protagonists to articulate the narration of their characters’ emergence into adulthood. Vidas and Las batallas present two highly divergent visions of the subject and her or his relationship to the social body; in the case of Vidas the individual takes primacy over the community. Following Ulrich Beck’s insights regarding individualization in industrial societies, and informed by theories of memory and nostalgia, this study explores how literary understandings of identity have transformed to reflect the experience of late modernity in Mexico. This paper argues that in recent Mexican fiction history is spatialized as a way of examining individual subjectivity outside the framework that views history in literature as a discourse directly linked to collective, often national, identity.

  9. Twenty-First Century Space Propulsion Study

    Science.gov (United States)

    1990-10-01

    “dependence of coherent radiation from crystals”, Physical Review Letters 58, 1176–1179 (23 March 1987). 11. Y. Aharonov, F.T. Avignone III, A. Casher, and... [table-of-contents excerpts: “The Weber Effect”; “2020 A.D. Technologies for AFAL”] ...most energetic fuel known. The most effective particle of antimatter for propulsion is the antiproton rather than the antielectron. To make a compact

  10. Intelligence in the Twenty-First Century

    OpenAIRE

    2000-01-01

    The author concludes that the world will most probably remain rife with conflict even in the twenty-first century and that the traditional role of intelligence will not only continue but will increase in importance. He characterizes the international situation as being "more of the same historically"; that is, the existence of several different centers of power and mutual conflicts based solely on national interests. In order to protect and promote one's national interests, sovereign states w...

  11. Servicing the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, D. [DTLR, London (United Kingdom)

    2002-04-01

    Twentieth century governments have committed themselves to the principle of sustainable development. Efforts to fulfil this goal offer an insight into changes in building services provision in the opening decades of the new century. Sustainable development indicators are used to identify possible trends. The analysis also forms the basis for some speculative conjectures as a basis for a research agenda for the twenty-first century. (Author)

  12. Twenty-first century learning in afterschool.

    Science.gov (United States)

    Schwarz, Eric; Stolow, David

    2006-01-01

    Twenty-first century skills increasingly represent the ticket to the middle class. Yet, the authors argue, in-school learning is simply not enough to help students develop these skills. The authors make the case that after-school (or out-of-school) learning programs are emerging as one of the nation's most promising strategies for preparing young people for the workforce and civic life. Most school systems have significant limitations for teaching twenty-first century skills. They have the limits of time: with only six hours per day there is barely enough time to teach even the basic skills, especially for those students starting already behind. They have the limits of structure: typical school buildings and classrooms are not physically set up for innovative learning. They have the limits of inertia and bureaucracy: school systems are notoriously resistant to change. And perhaps most important, they have the limits of priorities: especially with the onset of the No Child Left Behind Act, schools are laser-like in their focus on teaching the basics and therefore have less incentive to incorporate twenty-first century skills. Meanwhile, the authors argue that after-school programs are an untapped resource with three competitive advantages. First, they enable students to work collaboratively in small groups, a setup on which the modern economy will increasingly rely. Second, they are well suited to project-based learning and the development of mastery. Third, they allow students to learn in the real-world contexts that make sense. Yet the after-school sector is fraught with challenges. It lacks focus: is it child care, public safety, or homework tutoring? And it lacks rigorous results. The authors argue that the teaching of twenty-first century skills should become the new organizing principle for afterschool that will propel the field forward and more effectively bridge in-school and out-of-school learning.

  13. Earth observations in the twenty-first century

    Science.gov (United States)

    Geller, M. A.

    1986-01-01

    Some of the achievements of earth observations from past space missions are described, along with the achievements anticipated from currently approved and planned earth observation missions. Looking ahead, the paper discusses the expected objectives of earth observations from space, the capabilities that technology is expected to enable, and what the earth observing program may look like during the first part of the twenty-first century. It is concluded that a key part of this program will be long-term observations holistically viewing the earth system.

  14. Early twenty-first-century droughts during the warmest climate

    Directory of Open Access Journals (Sweden)

    Felix Kogan

    2016-01-01

    The first 13 years of the twenty-first century have begun with a series of widespread, long and intensive droughts around the world. Extreme and severe-to-extreme intensity droughts covered 2%–6% and 7%–16% of the world land, respectively, affecting environment, economies and humans. These droughts reduced agricultural production, leading to food shortages, human health deterioration, poverty, regional disturbances, population migration and death. This feature article is a travelogue of the twenty-first-century global and regional droughts during the warmest years of the past 100 years. These droughts were identified and monitored with the National Oceanic and Atmospheric Administration operational space technology, called vegetation health (VH), which has the longest period of observation and provides good data quality. The VH method was used for assessment of vegetation condition or health, including drought early detection and monitoring. The VH method is based on operational satellite data estimating both land surface greenness (NDVI) and thermal conditions. The twenty-first-century droughts in the USA, Russia, Australia and the Horn of Africa were intensive, long, covered large areas and caused huge losses in agricultural production, which affected food security and led to food riots in some countries. This research also investigates drought dynamics, presenting no definite conclusion about drought intensification and/or expansion during the time of the warmest globe.
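
    Kogan's vegetation health approach combines a greenness component (the Vegetation Condition Index, from NDVI) with a thermal component (the Temperature Condition Index, from brightness temperature). A minimal sketch of the standard VCI/TCI/VHI formulas — the equal weighting a = 0.5 is the commonly used default, and the pixel values below are illustrative, not satellite data:

```python
def vci(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index: NDVI scaled 0-100 against its
    multi-year climatological range for the same pixel and week."""
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt, bt_min, bt_max):
    """Temperature Condition Index: inverted scaling, since higher
    brightness temperature (BT) indicates thermal stress."""
    return 100.0 * (bt_max - bt) / (bt_max - bt_min)

def vhi(vci_val, tci_val, a=0.5):
    """Vegetation Health Index: weighted combination of VCI and TCI;
    values below roughly 40 are commonly read as vegetation stress."""
    return a * vci_val + (1.0 - a) * tci_val

# Illustrative pixel: low greenness and high thermal stress -> low VHI.
v = vci(ndvi=0.25, ndvi_min=0.20, ndvi_max=0.60)   # approx. 12.5
t = tci(bt=308.0, bt_min=290.0, bt_max=310.0)      # approx. 10.0
print(vhi(v, t))                                   # approx. 11.25 -> stressed
```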

  15. Twenty-first century power needs. Challenges, and supply options

    Energy Technology Data Exchange (ETDEWEB)

    Criswell, D.R. [Houston Univ., TX (United States). Solar Energy Lab.

    1997-11-01

    The challenge of providing adequate power to enable world prosperity in the twenty-first century to continue was discussed. It was estimated that by 2050, a prosperous world of 10 billion people will require 60 TWt of thermal power. Conventional power systems will not be able to provide the needed energy because of limited fuels, contamination of the biosphere and costs. A viable, cost-effective alternative will be solar energy that is captured in space and from facilities on the Moon, and that is imported to Earth by microwaves. Global electric power systems that use the Moon and deliver 1,000 TWe-Y of energy by 2070 were suggested as the most obvious alternative. Despite the huge initial cost of 20 to 100 trillion dollars, the long-term cost was said to be small compared to terrestrial and Earth-orbital options. 30 refs., 2 figs.

  16. Projection of drought hazards in China during twenty-first century

    Science.gov (United States)

    Liang, Yulian; Wang, Yongli; Yan, Xiaodong; Liu, Wenbin; Jin, Shaofei; Han, Mingchen

    2017-06-01

    Drought is occurring with increased frequency under climate warming. To understand the behavior of drought and its variation in the future, current and future drought over China in the twenty-first century is discussed. The drought frequency and the trend of drought intensity are assessed using the Palmer Drought Severity Index (PDSI), which is calculated from historical meteorological observations and outputs of the fifth Coupled Model Intercomparison Project (CMIP5) under three representative concentration pathway (RCP) scenarios. Simulated drought periods, defined by PDSI class, capture more than 90% of historical drought events. Projection results indicate that drought frequency will increase over China in the twenty-first century under the RCP4.5 and RCP8.5 scenarios. In the mid-twenty-first century (2021-2050), similar patterns of drought frequency are found under the three emission scenarios, and annual drought duration would last 3.5-4 months. At the end of the twenty-first century (2071-2100), annual drought duration could exceed 5 months in northwestern China as well as coastal areas of eastern and southern China under the RCP8.5 scenario. Drought is slightly reduced over the entire twenty-first century under the RCP2.6 scenario, whereas drought hazards will be more serious in most regions of China under the RCP8.5 scenario.
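
    Defining drought periods by PDSI class, as this abstract describes, amounts to thresholding a monthly PDSI series. A minimal sketch using the conventional Palmer class boundaries (-2 moderate, -3 severe, -4 extreme drought); the 12-month series is made up for illustration:

```python
# Classify monthly PDSI values with the conventional Palmer thresholds and
# derive annual drought duration (months at or below -2, i.e. moderate
# drought or worse).  The example series below is illustrative only.

def palmer_class(pdsi):
    if pdsi <= -4.0:
        return "extreme drought"
    if pdsi <= -3.0:
        return "severe drought"
    if pdsi <= -2.0:
        return "moderate drought"
    if pdsi < 2.0:
        return "near normal"
    return "wet spell"

def annual_drought_duration(monthly_pdsi, threshold=-2.0):
    """Number of months in the year at or below the drought threshold."""
    return sum(1 for x in monthly_pdsi if x <= threshold)

year = [-0.5, -1.2, -2.1, -3.4, -4.2, -3.8, -2.5, -1.9, -0.8, 0.3, 1.1, 0.6]
print(annual_drought_duration(year))  # 5 months of drought
print(palmer_class(-4.2))             # extreme drought
```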

  17. The Work Place of the Early Twenty-First Century.

    Science.gov (United States)

    Brown, James M.

    1991-01-01

    Major issues affecting the workplace of the twenty-first century include productivity growth, globalization, resistance to change, worker alienation, and telecommunications. Opposing views of technology are that (1) it will improve the economy and create jobs or (2) the majority of new jobs will not require high skills. (SK)

  18. Membership, Belonging, and Identity in the Twenty-First Century

    Science.gov (United States)

    Motteram, Gary

    2016-01-01

    This article takes a case study approach to exploring membership, belonging, and identity amongst English language teachers in the twenty-first century. It explores findings from two membership surveys conducted for the International Association of Teachers of English as a Foreign Language (IATEFL), and considers the impact of recommendations…

  19. Tall Fescue for the Twenty-first Century

    Science.gov (United States)

    Tall Fescue for the Twenty-first Century is a comprehensive monograph by experts from around the world about the science of tall fescue [Lolium arundinaceum (Schreb.) Darbysh. = Schedonorus arundinaceus (Schreb.) Dumort., formerly Festuca arundinacea Schreb. var. arundinacea] and its applications. ...

  20. Powering into the twenty-first century [Singapore Power Limited

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1997-07-01

    To meet the challenges of the twenty-first century power industry, Singapore Power was incorporated as a commercial entity in October 1995. As the leading energy company in Singapore, SP continues to invest heavily in infrastructure development to improve its service efficiency and reliability, and to maintain its reputation as one of the world's best power suppliers. (UK)

  1. Afterword: Victorian Sculpture for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    David J. Getsy

    2016-06-01

    Commenting on the directions proposed by this issue of '19', the afterword discusses the broad trends in twenty-first century studies of Victorian sculpture and the opportunity for debate arising from the first attempt at a comprehensive exhibition.

  2. Membership, Belonging, and Identity in the Twenty-First Century

    Science.gov (United States)

    Motteram, Gary

    2016-01-01

    This article takes a case study approach to exploring membership, belonging, and identity amongst English language teachers in the twenty-first century. It explores findings from two membership surveys conducted for the International Association of Teachers of English as a Foreign Language (IATEFL), and considers the impact of recommendations…

  3. The Presidential Platform on Twenty-First Century Education Goals

    Science.gov (United States)

    Tichnor-Wagner, Ariel; Socol, Allison Rose

    2016-01-01

    As social and economic problems change, so do the goals of education reformers. This content analysis of presidential debates transcripts, state of the union addresses, and education budgets from 2000 to 2015 reveals the ways in which presidents and presidential candidates have framed education goals thus far in the twenty-first century. Using…

  4. The Work Place of the Early Twenty-First Century.

    Science.gov (United States)

    Brown, James M.

    1991-01-01

    Major issues affecting the workplace of the twenty-first century include productivity growth, globalization, resistance to change, worker alienation, and telecommunications. Opposing views of technology are that (1) it will improve the economy and create jobs or (2) the majority of new jobs will not require high skills. (SK)

  5. Digital earth applications in the twenty-first century

    NARCIS (Netherlands)

    de By, R.A.; Georgiadou, P.Y.

    2014-01-01

    In these early years of the twenty-first century, we must look at how the truly cross-cutting information technology supports other innovations, and how it will fundamentally change the information positions of government, private sector and the scientific domain as well as the citizen. In those

  6. Decadal potential predictability of twenty-first century climate

    Energy Technology Data Exchange (ETDEWEB)

    Boer, George J. [Canadian Centre for Climate Modelling and Analysis, Environment Canada, PO Box 3065, Victoria, BC (Canada)

    2011-03-15

    Decadal prediction of the coupled climate system is potentially possible given enough information and knowledge. Predictability will reside in both externally forced and in long timescale internally generated variability. The “potential predictability” investigated here is characterized by the fraction of the total variability accounted for by these two components in the presence of short-timescale unpredictable “noise” variability. Potential predictability is not a classical measure of predictability nor a measure of forecast skill, but it does identify regions where long timescale variability is an appreciable fraction of the total and hence where prediction on these scales may be possible. A multi-model estimate of the potential predictability variance fraction (ppvf) as it evolves through the first part of the twenty-first century is obtained using simulation data from the CMIP3 archive. Two estimates of potential predictability are used which depend on the treatment of the forced component. The multi-decadal estimate considers the magnitude of the forced component as the change from the beginning of the century and so becomes largely a measure of climate change as the century progresses. The next-decade estimate considers the change in the forced component from the past decade and so is more pertinent to an actual forecast for the next decade. Long timescale internally generated variability provides additional potential predictability beyond that of the forced component. The ppvf may be expressed in terms of a signal-to-noise ratio and takes on values between 0 and 1. The largest values of the ppvf for temperature are found over tropical and mid-latitude oceans, with the exception of the equatorial Pacific, and some but not all tropical land areas. Overall the potential predictability for temperature generally declines with latitude and is relatively low over mid- to high-latitude land. Potential predictability for
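
    The ppvf described above is a variance fraction: the share of total variance carried by the slow (forced plus long-timescale internal) component in the presence of short-timescale noise. A minimal sketch of that decomposition and its equivalent signal-to-noise form (synthetic variances, not CMIP3 output):

```python
# Potential predictability variance fraction:
#   ppvf = s2_signal / (s2_signal + s2_noise)
# Equivalently, with S = s2_signal / s2_noise (signal-to-noise variance
# ratio), ppvf = S / (1 + S), so ppvf is bounded between 0 and 1.

def ppvf(s2_signal, s2_noise):
    """Fraction of total variance carried by the slow 'signal' component."""
    return s2_signal / (s2_signal + s2_noise)

def ppvf_from_snr(snr):
    """Same quantity expressed via the signal-to-noise variance ratio."""
    return snr / (1.0 + snr)

# Equal signal and noise variance gives ppvf = 0.5; a strong signal
# (S = 3) gives ppvf = 0.75.
print(ppvf(1.0, 1.0))      # 0.5
print(ppvf_from_snr(3.0))  # 0.75
```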

  7. NATO’s Relevance in the Twenty-First Century

    Science.gov (United States)

    2012-03-22

    rules of engagement for force protection. NATO Foreign Ministers authorized the Supreme Allied Commander Europe (SACEUR) to begin the next stage of... the mission on 9 December 2004. The activation order for this next stage was given by SACEUR on 16 December 2004. It allowed the deployment of 300... Christopher Coker, Globalisation and Insecurity in the Twenty-first Century: NATO and the Management of Risk (The International Institute for Strategic

  8. Proceedings of the twenty-first LAMPF users group meeting

    Energy Technology Data Exchange (ETDEWEB)

    1988-04-01

    The Twenty-First Annual LAMPF Users Group Meeting was held November 9-10, 1987, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities.

  9. About capital in the twenty-first century

    OpenAIRE

    2015-01-01

    In this article, I present three key facts about income and wealth inequality in the long run emerging from my book, Capital in the Twenty-First Century, and seek to sharpen and refocus the discussion about those trends. In particular, I clarify the role played by r > g in my analysis of wealth inequality. I also discuss some of the implications for optimal taxation, and the relation between capital-income ratios and capital shares.

  10. Technological sciences society of the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-04-15

    This book introduces the information-oriented society of the twenty-first century, connected by computer networks. Topics include: the memory of dreams (F-RAM); the information-oriented society and new media; ISDN, the communications network for the next generation (what is ISDN?); the development of the information service industry; from office automation to the intelligent building of the future; home shopping and home banking; and obstacles that hinder the information-oriented society.

  11. Twenty-first-century medical microbiology services in the UK.

    Science.gov (United States)

    Duerden, Brian

    2005-12-01

    With infection once again a high priority for the UK National Health Service (NHS), the medical microbiology and infection-control services require increased technology resources and more multidisciplinary staff. Clinical care and health protection need a coordinated network of microbiology services working to consistent standards, provided locally by NHS Trusts and supported by the regional expertise and national reference laboratories of the new Health Protection Agency. Here, I outline my thoughts on the need for these new resources and the ways in which clinical microbiology services in the UK can best meet the demands of the twenty-first century.

  12. Accelerators for the twenty-first century a review

    CERN Document Server

    Wilson, Edmund J N

    1990-01-01

    The development of the synchrotron, and later the storage ring, was based upon the electrical technology at the turn of this century, aided by the microwave radar techniques of World War II. This method of acceleration seems to have reached its limit. Even superconductivity is not likely to lead to devices that will satisfy physics needs into the twenty-first century. Unless a new principle for accelerating elementary particles is discovered soon, it is difficult to imagine that high-energy physics will continue to reach out to higher energies and luminosities.

  13. The Dialectics of Discrimination in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    John Stone

    2007-12-01

    This article explores some of the latest developments in the scholarship on race relations and nationalism that seek to address the impact of globalization and the changed geo-political relations of the first decade of the twenty-first century. New patterns of identification, some of which challenge existing group boundaries and others that reinforce them, can be seen to flow from the effects of global market changes and the political counter-movements against them. The impact of the “war on terrorism”, the limits of the utility of hard power, and the need for new mechanisms of inter-racial and inter-ethnic conflict resolution are evaluated to emphasize the complexity of these group relations in the new world disorder.

  14. Nuclear energy into the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, G.P. [Bath Univ. (United Kingdom). School of Mechanical Engineering

    1996-12-31

    The historical development of the civil nuclear power generation industry is examined in the light of the need to meet conflicting energy-supply and environmental pressures over recent decades. It is suggested that fission (thermal and fast) reactors will dominate the market up to the period 2010-2030, with fusion being relegated to the latter part of the twenty-first century. A number of issues affecting the use of nuclear electricity generation in Western Europe are considered including its cost, industrial strategy needs, and the public acceptability of nuclear power. The contribution of nuclear power stations to achieving CO2 targets aimed at relieving global warming is discussed in the context of alternative strategies for sustainable development, including renewable energy sources and energy-efficiency measures. Trends in the generation of nuclear electricity from fission reactors are finally considered in terms of the main geopolitical groupings that make up the world in the mid-1990s. Several recent, but somewhat conflicting, forecasts of the role of nuclear power in the fuel mix to about 2020 are reviewed. It is argued that the only major expansion in generating capacity will take place on the Asia-Pacific Rim and not in the developing countries generally. Nevertheless, the global nuclear industry overall will continue to be dominated by a small number of large nuclear electricity generating countries; principally the USA, France and Japan. (UK).

  15. New Bachelards?: Reveries, Elements and Twenty-First Century Materialisms

    Directory of Open Access Journals (Sweden)

    James L. Smith

    2012-10-01

    Recent years have seen an infusion of new ideas into material philosophy through the work of the so-called ‘new materialists’. Poignant examples appear within two recent books: the first, Vibrant Matter by Jane Bennett (2010), sets out to “enhance receptivity to the impersonal life that surrounds and infuses us” (2010: 4). The second, Elemental Philosophy by David Macauley (2010), advocates an anamnesis or recollection of the elements as imaginatively dynamic matter. Within his essays on the imagination of matter, Gaston Bachelard outlined an archetypal vision of the elements predicated upon the material imagination. He explored the manner in which the imagination inhabits the world, is triggered by the stimulus of material dynamism, and is formed from a co-constitution of subject and object. This article proposes that recent trends in materialist philosophy – as exemplified by the monographs of Bennett and Macauley – reinforce the ideas of Bachelard and take them in new directions. Bachelard provides us with a compelling argument for the rediscovery of material imagination, whereas New Materialism portrays a vision of matter filled with autonomous dynamism that lends itself to entering into a relationship with this imagination. Consequently, this article proposes that Gaston Bachelard has gained a new relevance as a result of contemporary trends in material philosophy, has taken on new possibilities through recent scholarship, and remains a force within the twenty-first century discursive landscape.

  16. Strategies for Teaching Maritime Archaeology in the Twenty First Century

    Science.gov (United States)

    Staniforth, Mark

    2008-12-01

    Maritime archaeology is a multi-faceted discipline that requires both theoretical learning and practical skills training. In the past most universities have approached the teaching of maritime archaeology as a full-time on-campus activity designed for ‘traditional’ graduate students; primarily those in their early twenties who have recently come from full-time undergraduate study and who are able to study on-campus. The needs of mature-age and other students who work and live in different places (or countries) and therefore cannot attend lectures on a regular basis (or at all) have largely been ignored. This paper provides a case study in the teaching of maritime archaeology from Australia that, in addition to ‘traditional’ on-campus teaching, includes four main components: (1) learning field methods through field schools; (2) skills training through the AIMA/NAS avocational training program; (3) distance learning topics available through CD-ROM and using the Internet; and (4) practicums, internships and fellowships. The author argues that programs to teach maritime archaeology in the twenty-first century need to be flexible and to address the diverse needs of students who do not fit the ‘traditional’ model. This involves collaborative partnerships with other universities as well as government underwater cultural heritage management agencies and museums, primarily through field schools, practicums and internships.

  17. Twenty-first workshop on geothermal reservoir engineering: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-01-26

    PREFACE The Twenty-First Workshop on Geothermal Reservoir Engineering was held at the Holiday Inn, Palo Alto on January 22-24, 1996. There were one hundred fifty-five registered participants. Participants came from twenty foreign countries: Argentina, Austria, Canada, Costa Rica, El Salvador, France, Iceland, Indonesia, Italy, Japan, Mexico, The Netherlands, New Zealand, Nicaragua, the Philippines, Romania, Russia, Switzerland, Turkey and the UK. The performance of many geothermal reservoirs outside the United States was described in several of the papers. Professor Roland N. Horne opened the meeting and welcomed visitors. The keynote speaker was Marshall Reed, who gave a brief overview of the Department of Energy's current plan. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions concerning: reservoir assessment, modeling, geology/geochemistry, fracture modeling, hot dry rock, geoscience, low enthalpy, injection, well testing, drilling, adsorption and stimulation. Session chairmen were major contributors to the workshop, and we thank: Ben Barker, Bobbie Bishop-Gollan, Tom Box, Jim Combs, John Counsil, Sabodh Garg, Malcolm Grant, Marcelo Lippmann, Jim Lovekin, John Pritchett, Marshall Reed, Joel Renner, Subir Sanyal, Mike Shook, Alfred Truesdell and Ken Williamson. Jim Lovekin gave the post-dinner speech at the banquet and highlighted the exciting developments in the geothermal field which are taking place worldwide. The Workshop was organized by the Stanford Geothermal Program faculty, staff, and graduate students. We wish to thank our students who operated the audiovisual equipment. Shaun D. Fitzgerald, Program Manager.

  18. Is the Classroom Obsolete in the Twenty-First Century?

    Science.gov (United States)

    Benade, Leon

    2017-01-01

    Lefebvre's triadic conception of "spatial practice, representations of space and representational spaces" provides the theoretical framework of this article, which recognises a productive relationship between space and social relations. Its writing stems from a current and ongoing qualitative study of innovative teaching and learning…

  19. Noise Management in Twenty-First Century Libraries: Case Studies of Four U.S. Academic Institutions

    Science.gov (United States)

    Franks, Janet E.; Asher, Darla C.

    2014-01-01

    University libraries have had to provide acceptable noise levels for many years and this pressure has not diminished in the twenty-first century. Library space has to be utilized to ensure noise levels are best managed. A study was undertaken across four university libraries in South Florida to determine how universities utilized their limited…

  20. Twenty-first century changes in snowfall climate in Northern Europe in ENSEMBLES regional climate models

    Science.gov (United States)

    Räisänen, Jouni

    2016-01-01

    Changes in snowfall in northern Europe (55-71°N, 5-35°E) are analysed from 12 regional model simulations of twenty-first century climate under the Special Report on Emissions Scenarios A1B scenario. As an ensemble mean, the models suggest a decrease in the winter total snowfall in nearly all of northern Europe. In the middle of the winter, however, snowfall generally increases in the coldest areas. The borderline between increasing and decreasing snowfall broadly coincides with the -11 °C isotherm in baseline (1980-2010) monthly mean temperature, although with variation between models and grid boxes. High extremes of daily snowfall remain nearly unchanged, except for decreases in the mildest areas, where snowfall as a whole becomes much less common. A smaller fraction of the snow in the simulated late twenty-first century climate falls on severely cold days and a larger fraction on days with near-zero temperatures. Not only do days with low temperatures become less common, but they also typically have more positive anomalies of sea level pressure and less snowfall for the same temperature than in the present-day climate.
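    The partitioning of precipitation into snowfall by near-surface temperature, which underlies diagnostics like the shifting share of snow falling on severely cold versus near-zero-temperature days, can be sketched with a simple daily threshold rule. This is an illustrative sketch only: the threshold value and data below are hypothetical, and the climate models' internal precipitation-phase treatment is more sophisticated.

```python
def snowfall_fraction(precip, temps, snow_threshold=0.0):
    """Fraction of total precipitation falling as snow, using a
    simple daily-mean-temperature threshold (deg C).
    precip, temps: equal-length sequences of daily values."""
    total = sum(precip)
    if total == 0:
        return 0.0
    snow = sum(p for p, t in zip(precip, temps) if t <= snow_threshold)
    return snow / total

# illustrative daily data (mm, deg C)
precip = [5.0, 2.0, 0.0, 3.0, 4.0]
temps = [-8.0, 1.5, -2.0, -0.5, 3.0]
frac = snowfall_fraction(precip, temps)  # 8/14, about 0.57
```

    Applying the same function to sub-samples of days binned by temperature would give the kind of cold-day versus near-zero-day snowfall shares discussed in the abstract.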

  1. Virtual Reality: Teaching Tool of the Twenty-First Century?

    Science.gov (United States)

    Hoffman, Helene; Vu, Dzung

    1997-01-01

    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  2. Simulating space and time

    CERN Document Server

    Whitworth, B

    2010-01-01

    This chapter asks if a virtual space-time could appear to those within it as our space-time does to us. A processing grid network is proposed to underlie not just matter and energy, but also space and time. The suggested "screen" for our familiar three dimensional world is a hyper-sphere surface simulated by a grid network. Light and matter then travel, or are transmitted, in the "directions" of the grid architecture. The processing sequences of grid nodes create time, as the static states of movies run together emulate events. Yet here what exists are not the static states, but the dynamic processing between them. Quantum collapse is the irreversible event that gives time its direction. In this model, empty space is null processing, directions are node links, time is processing cycles, light is a processing wave, objects are wave tangles and energy is the processing transfer rate. It describes a world where empty space is not empty, space warps, time dilates, and everything began when this virtual universe "...

  3. Microsurgery Training for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Simon Richard Myers

    2013-07-01

    Current educational interventions and training courses in microsurgery are often predicated on theories of skill acquisition and development that follow a ‘practice makes perfect’ model. Given the changing landscape of surgical training and advances in educational theories related to skill development, research is needed to assess current training tools in microsurgery education and devise alternative methods that would enhance training. Simulation is an increasingly important tool for educators because, whilst facilitating improved technical proficiency, it provides a way to reduce risks to both trainees and patients. The International Microsurgery Simulation Society was founded in 2012 in order to consolidate the global effort in promoting excellence in microsurgical training. The society’s aim of achieving standardisation of microsurgical training worldwide could be realised through the development of evidence-based educational interventions and the sharing of best practices.

  4. Microsurgery training for the twenty-first century.

    Science.gov (United States)

    Myers, Simon Richard; Froschauer, Stefan; Akelina, Yelena; Tos, Pierluigi; Kim, Jeong Tae; Ghanem, Ali M

    2013-07-01

    Current educational interventions and training courses in microsurgery are often predicated on theories of skill acquisition and development that follow a 'practice makes perfect' model. Given the changing landscape of surgical training and advances in educational theories related to skill development, research is needed to assess current training tools in microsurgery education and devise alternative methods that would enhance training. Simulation is an increasingly important tool for educators because, whilst facilitating improved technical proficiency, it provides a way to reduce risks to both trainees and patients. The International Microsurgery Simulation Society was founded in 2012 in order to consolidate the global effort in promoting excellence in microsurgical training. The society's aim of achieving standardisation of microsurgical training worldwide could be realised through the development of evidence-based educational interventions and the sharing of best practices.

  5. Drone Warfare: Twenty-First Century Empire and Communications

    Directory of Open Access Journals (Sweden)

    Kevin Howley

    2017-02-01

    This paper, part of a larger project that examines drones from a social-construction of technology perspective, considers drone warfare in light of Harold Innis’s seminal work on empire and communication. Leveraging leading-edge aeronautics with advanced optics, data processing, and networked communication, drones represent an archetypal “space-biased” technology. Indeed, by allowing remote operators and others to monitor, select, and strike targets from half a world away, and in real-time, these weapon systems epitomize the “pernicious neglect of time” Innis sought to identify and remedy in his later writing. With Innis’s time-space dialectic as a starting point, then, the paper considers drones in light of a longstanding paradox of American culture: the impulse to collapse the geographical distance between the United States and other parts of the globe, while simultaneously magnifying the cultural difference between Americans and other peoples and societies. In the midst of the worldwide proliferation of drones, this quintessentially sublime technology embodies this (dis)connect in important, profound, and ominous ways.

  6. Cyber Attacks and Terrorism: A Twenty-First Century Conundrum.

    Science.gov (United States)

    Albahar, Marwan

    2017-01-05

    In recent years, an alarming rise in the incidence of cyber attacks has made cyber security a major concern for nations across the globe. Given the current volatile socio-political environment and the massive increase in the incidence of terrorism, it is imperative that government agencies rapidly realize the possibility of cyber space exploitation by terrorist organizations and state players to disrupt the normal way of life. The threat level of cyber terrorism has never been as high as it is today, and this has created a lot of insecurity and fear. This study has focused on different aspects of cyber attacks and explored the reasons behind their increasing popularity among terrorist organizations and state players. This study proposes an empirical model that can be used to estimate the risk levels associated with different types of cyber attacks and thereby provide a road map to conceptualize and formulate highly effective countermeasures and cyber security policies.

  7. Analysis of the projected regional sea-ice changes in the Southern Ocean during the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Lefebvre, W.; Goosse, H. [Universite Catholique de Louvain, Institut d' Astronomie et de Geophysique Georges Lemaitre, Louvain-la-Neuve (Belgium)

    2008-01-15

    Using the set of simulations performed with atmosphere-ocean general circulation models (AOGCMs) for the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4), the projected regional distribution of sea ice for the twenty-first century has been investigated. Averaged over all those model simulations, the current climate is reasonably well reproduced. However, this averaging procedure hides the errors from individual models. Over the twentieth century, the multimodel average simulates a larger sea-ice concentration decrease around the Antarctic Peninsula compared to other regions, which is in qualitative agreement with observations. This is likely related to the positive trend in the Southern Annular Mode (SAM) index over the twentieth century, in both observations and in the multimodel average. Despite the simulated positive future trend in SAM, such a regional feature around the Antarctic Peninsula is absent in the projected sea-ice change for the end of the twenty-first century. The maximum decrease is indeed located over the central Weddell Sea and the Amundsen-Bellingshausen Seas. In most models, changes in the oceanic currents could play a role in the regional distribution of the sea ice, especially in the Ross Sea, where stronger southward currents could be responsible for a smaller sea-ice decrease during the twenty-first century. Finally, changes in the mixed layer depth can be found in some models, inducing locally strong changes in the sea-ice concentration. (orig.)

  8. A Critical Feminist and Race Critique of Thomas Piketty's "Capital in the Twenty-First Century"

    Science.gov (United States)

    Moeller, Kathryn

    2016-01-01

    Thomas Piketty's "Capital in the Twenty-first Century" documents the foreboding nature of rising wealth inequality in the twenty-first century. In an effort to promote a more just and democratic global society and rein in the unfettered accumulation of wealth by the few, Piketty calls for a global progressive annual tax on corporate…

  10. Slowmation: A Twenty-First Century Educational Tool for Science and Mathematics Pre-Service Teachers

    Science.gov (United States)

    Paige, Kathryn; Bentley, Brendan; Dobson, Stephen

    2016-01-01

    Slowmation is a twenty-first century digital literacy educational tool. This teaching and learning tool has been incorporated as an assessment strategy in the curriculum area of science and mathematics with pre-service teachers (PSTs). This paper explores two themes: developing twenty-first century digital literacy skills and modelling best…

  11. Why the twenty-first century tropical Pacific trend pattern cannot significantly influence ENSO amplitude?

    Science.gov (United States)

    An, Soon-Il; Choi, Jung

    2015-01-01

    Although the climate is highly expected to change due to global warming, it is unclear whether the El Niño-Southern Oscillation (ENSO) will be more or less active in the future. One may argue that this uncertainty is due to the intrinsic uncertainties in current climate models or the strong natural long-term modulation of ENSO. Here, we propose that the global warming trend cannot significantly modify ENSO amplitude due to weak feedback between the global warming induced tropical climate change and ENSO. By analyzing Coupled Model Intercomparison Project Phase 5 and observation data, we found that the zonal dipole pattern of sea surface temperature [SST; warming in the eastern Pacific and cooling in the western Pacific, or vice versa; the 'Pacific zonal mode' (PZM)] is highly correlated with change in ENSO amplitude. Additionally, this PZM is commonly identified in control experiments (pre-industrial conditions), twentieth century observations, and twenty-first century scenario experiments [representative concentration pathways 4.5 and 8.5 W m-2 (RCP 4.5, 8.5)]. PZM provides favorable conditions for the intensification of ENSO by strengthening air-sea coupling and modifying the ENSO pattern. On the other hand, the twenty-first century SST trend pattern, which is different from PZM, is not favorable towards changing ENSO amplitude. Furthermore, we performed intermediate ocean-atmosphere coupled model simulations, in which the SST trend pattern and PZM are imposed as an external anomalous heat flux or prescribed as a basic state. It was concluded that the SST trend pattern forcing changes ENSO amplitude insignificantly, while the PZM forcing intensifies ENSO amplitude.

  12. Projected Changes on the Global Surface Wave Drift Climate towards the END of the Twenty-First Century

    Science.gov (United States)

    Carrasco, Ana; Semedo, Alvaro; Behrens, Arno; Weisse, Ralf; Breivik, Øyvind; Saetra, Øyvind; Håkon Christensen, Kai

    2016-04-01

    The global wave-induced current (the Stokes drift, SD) is an important feature of the ocean surface, with mean values close to 10 cm/s along the extra-tropical storm tracks in both hemispheres. Besides the horizontal displacement of large volumes of water, the SD also plays an important role in the ocean mixed-layer turbulence structure, particularly in stormy or high wind speed areas. The role of the wave-induced currents in the ocean mixed layer and in the sea surface temperature (SST) is currently a hot topic of air-sea interaction research, from forecast to climate ranges. The impact of climate change on the global wave-induced current climate will be presented. The wave model WAM has been forced by the global climate model (GCM) ECHAM5 wind speed (at 10 m height) and ice, for present-day and potential future climate conditions towards the end of the twenty-first century, represented by the Intergovernmental Panel on Climate Change (IPCC) CMIP3 (Coupled Model Inter-comparison Project phase 3) A1B greenhouse gas emission scenario (usually referred to as a 'medium-high emissions' scenario). Several wave parameters were stored as output in the WAM model simulations, including the wave spectra. The wave spectra, at 6-hourly temporal and 0.5°×0.5° spatial resolution, were used to compute the SD global climate of two 32-yr periods, representative of the end of the twentieth (1959-1990) and twenty-first (2069-2100) centuries. Comparisons of the present climate run with the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-40 reanalysis are used to assess the capability of the WAM-ECHAM5 runs to produce realistic SD results. This study is part of the WRCP-JCOMM COWCLIP (Coordinated Ocean Wave Climate Project) effort.
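    In the deep-water limit of linear wave theory, the surface Stokes drift can be obtained from a one-dimensional frequency spectrum via u_s = (16π³/g) ∫ f³ E(f) df. The following is a minimal numerical sketch of that integral using the trapezoidal rule; the study's actual computation from full directional spectra is more involved, and the two-bin spectrum below is purely illustrative.

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def surface_stokes_drift(freqs, spectrum):
    """Deep-water surface Stokes drift speed (m/s) from a 1-D
    frequency spectrum E(f) [m^2/Hz], using
    u_s = (16*pi^3/g) * integral of f^3 * E(f) df,
    evaluated with the trapezoidal rule."""
    integral = 0.0
    for i in range(len(freqs) - 1):
        df = freqs[i + 1] - freqs[i]
        integral += 0.5 * (freqs[i] ** 3 * spectrum[i]
                           + freqs[i + 1] ** 3 * spectrum[i + 1]) * df
    return 16.0 * math.pi ** 3 / G * integral

# illustrative two-bin spectrum (frequencies in Hz, E(f) in m^2/Hz)
u_s = surface_stokes_drift([0.1, 0.2], [1.0, 1.0])  # ~0.023 m/s
```

    Because of the f³ weighting, the integral is dominated by the high-frequency (wind-sea) part of the spectrum, which is why the abstract notes the SD is mostly driven by wind-sea waves.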

  13. Mediterranean Sea response to climate change in an ensemble of twenty first century scenarios

    Science.gov (United States)

    Adloff, Fanny; Somot, Samuel; Sevault, Florence; Jordà, Gabriel; Aznar, Roland; Déqué, Michel; Herrmann, Marine; Marcos, Marta; Dubois, Clotilde; Padorno, Elena; Alvarez-Fanjul, Enrique; Gomis, Damià

    2015-11-01

    The Mediterranean climate is expected to become warmer and drier during the twenty-first century. Mediterranean Sea response to climate change could be modulated by the choice of the socio-economic scenario as well as the choice of the boundary conditions mainly the Atlantic hydrography, the river runoff and the atmospheric fluxes. To assess and quantify the sensitivity of the Mediterranean Sea to the twenty-first century climate change, a set of numerical experiments was carried out with the regional ocean model NEMOMED8 set up for the Mediterranean Sea. The model is forced by air-sea fluxes derived from the regional climate model ARPEGE-Climate at a 50-km horizontal resolution. Historical simulations representing the climate of the period 1961-2000 were run to obtain a reference state. From this baseline, various sensitivity experiments were performed for the period 2001-2099, following different socio-economic scenarios based on the Special Report on Emissions Scenarios. For the A2 scenario, the main three boundary forcings (river runoff, near-Atlantic water hydrography and air-sea fluxes) were changed one by one to better identify the role of each forcing in the way the ocean responds to climate change. In two additional simulations (A1B, B1), the scenario is changed, allowing to quantify the socio-economic uncertainty. Our 6-member scenario simulations display a warming and saltening of the Mediterranean. For the 2070-2099 period compared to 1961-1990, the sea surface temperature anomalies range from +1.73 to +2.97 °C and the SSS anomalies spread from +0.48 to +0.89. In most of the cases, we found that the future Mediterranean thermohaline circulation (MTHC) tends to reach a situation similar to the eastern Mediterranean Transient. However, this response is varying depending on the chosen boundary conditions and socio-economic scenarios. 
Our numerical experiments suggest that the choice of the near-Atlantic surface water evolution, which is very uncertain in

  14. Projected deglaciation of western Canada in the twenty-first century

    Science.gov (United States)

    Clarke, Garry K. C.; Jarosch, Alexander H.; Anslow, Faron S.; Radić, Valentina; Menounos, Brian

    2015-05-01

    Retreat of mountain glaciers is a significant contributor to sea-level rise and a potential threat to human populations through impacts on water availability and regional hydrology. Like most of Earth’s mountain glaciers, those in western North America are experiencing rapid mass loss. Projections of future large-scale mass change are based on surface mass balance models that are open to criticism, because they ignore or greatly simplify glacier physics. Here we use a high-resolution regional glaciation model, developed by coupling physics-based ice dynamics with a surface mass balance model, to project the fate of glaciers in western Canada. We use twenty-first-century climate scenarios from an ensemble of global climate models in our simulations; the results indicate that by 2100, the volume of glacier ice in western Canada will shrink by 70 ± 10% relative to 2005. According to our simulations, few glaciers will remain in the Interior and Rockies regions, but maritime glaciers, in particular those in northwestern British Columbia, will survive in a diminished state. We project the maximum rate of ice volume loss, corresponding to peak input of deglacial meltwater to streams and rivers, to occur around 2020-2040. Potential implications include impacts on aquatic ecosystems, agriculture, forestry, alpine tourism and water quality.

  15. Divergent trajectories of Antarctic surface melt under two twenty-first-century climate scenarios

    Science.gov (United States)

    Trusel, Luke D.; Frey, Karen E.; Das, Sarah B.; Karnauskas, Kristopher B.; Kuipers Munneke, Peter; van Meijgaard, Erik; van den Broeke, Michiel R.

    2015-12-01

    Ice shelves modulate Antarctic contributions to sea-level rise and thereby represent a critical, climate-sensitive interface between the Antarctic ice sheet and the global ocean. Following rapid atmospheric warming over the past decades, Antarctic Peninsula ice shelves have progressively retreated, at times catastrophically. This decay supports hypotheses of thermal limits of viability for ice shelves via surface melt forcing. Here we use a polar-adapted regional climate model and satellite observations to quantify the nonlinear relationship between surface melting and summer air temperature. Combining observations and multimodel simulations, we examine melt evolution and intensification before observed ice shelf collapse on the Antarctic Peninsula. We then assess the twenty-first-century evolution of surface melt across Antarctica under intermediate and high emissions climate scenarios. Our projections reveal a scenario-independent doubling of Antarctic-wide melt by 2050. Between 2050 and 2100, however, significant divergence in melt occurs between the two climate scenarios. Under the high emissions pathway by 2100, melt on several ice shelves approaches or surpasses intensities that have historically been associated with ice shelf collapse, at least on the northeast Antarctic Peninsula.

  16. Projections of glacier change in the Altai Mountains under twenty-first century climate scenarios

    Science.gov (United States)

    Zhang, Yong; Enomoto, Hiroyuki; Ohata, Tetsuo; Kitabata, Hideyuki; Kadota, Tsutomu; Hirabayashi, Yukiko

    2016-11-01

    We project glacier surface mass balances of the Altai Mountains over the period 2006-2100 for the representative concentration pathway (RCP) 4.5 and RCP8.5 scenarios using daily near-surface air temperature and precipitation from 12 global climate models in combination with a surface mass balance model. The results indicate that the Altai glaciers will undergo sustained mass loss throughout the twenty-first century for both RCPs and reveal the future fate of glaciers of different sizes. By 2100, glacier area in the region will shrink by 26 ± 10 % for RCP4.5, while it will shrink by 60 ± 15 % for RCP8.5. According to our simulations, most disappearing glaciers are located in the western part of the Altai Mountains. For RCP4.5, all glaciers disappearing in the twenty-first century have a present-day size smaller than 5.0 km2, while for RCP8.5, an additional 7 % of glaciers in the initial size class of 5.0-10.0 km2 also vanish. We project different trends in the total meltwater discharge of the region for the two RCPs, neither of which peaks before 2100, with important consequences for regional water availability, particularly for semi-arid and arid regions. This further highlights the potential implications of change in the Altai glaciers for regional hydrology and environment.
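    Surface mass balance models of the kind driven by daily temperature and precipitation, as described above, commonly combine accumulation from solid precipitation with melt from a positive-degree-day scheme. A minimal sketch under that assumption follows; the degree-day factor and snow/rain threshold are illustrative placeholders, not the study's calibrated values.

```python
def annual_mass_balance(daily_temp, daily_precip,
                        ddf=0.005, snow_threshold=1.0):
    """Annual surface mass balance (m water equivalent) from daily
    near-surface air temperature (deg C) and precipitation (m w.e.),
    using a positive-degree-day melt scheme.
    ddf: degree-day factor (m w.e. per deg C per day), illustrative."""
    # accumulation: precipitation on days cold enough to fall as snow
    accumulation = sum(p for t, p in zip(daily_temp, daily_precip)
                       if t <= snow_threshold)
    # melt: proportional to the sum of positive daily temperatures
    melt = ddf * sum(max(t, 0.0) for t in daily_temp)
    return accumulation - melt

# one cold snowy day, one warm melt day:
# accumulation = 0.02, melt = 0.005 * 4 = 0.02, balance near zero
b = annual_mass_balance([-5.0, 4.0], [0.02, 0.0])
```

    Running such a model forward with each GCM's daily series, then converting balance to area change through a glacier geometry scheme, is the general shape of the projection workflow summarized in the abstract.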

  17. Projections of glacier change in the Altai Mountains under twenty-first century climate scenarios

    Science.gov (United States)

    Zhang, Yong; Enomoto, Hiroyuki; Ohata, Tetsuo; Kitabata, Hideyuki; Kadota, Tsutomu; Hirabayashi, Yukiko

    2016-01-01

    We project glacier surface mass balances of the Altai Mountains over the period 2006-2100 for the representative concentration pathway (RCP) 4.5 and RCP8.5 scenarios using daily near-surface air temperature and precipitation from 12 global climate models in combination with a surface mass balance model. The results indicate that the Altai glaciers will undergo sustained mass loss throughout the twenty-first century for both RCPs and reveal the future fate of glaciers of different sizes. By 2100, glacier area in the region will shrink by 26 ± 10 % for RCP4.5, while it will shrink by 60 ± 15 % for RCP8.5. According to our simulations, most disappearing glaciers are located in the western part of the Altai Mountains. For RCP4.5, all glaciers disappearing in the twenty-first century have a present-day size smaller than 5.0 km2, while for RCP8.5, an additional ~7 % of glaciers in the initial size class of 5.0-10.0 km2 also vanish. We project different trends in the total meltwater discharge of the region for the two RCPs, neither of which peaks before 2100, with important consequences for regional water availability, particularly for semi-arid and arid regions. This further highlights the potential implications of change in the Altai glaciers for regional hydrology and environment.

  18. Sensitivity of discharge and flood frequency to twenty-first century and late Holocene changes in climate and land use (River Meuse, northwest Europe)

    NARCIS (Netherlands)

    Ward, P.J.; Renssen, H.; Aerts, J.C.J.H.; Verburg, P.H.

    2011-01-01

    We used a calibrated coupled climate–hydrological model to simulate Meuse discharge over the late Holocene (4000–3000 BP and 1000–2000 AD). We then used this model to simulate discharge in the twenty-first century under SRES emission scenarios A2 and B1, with and without future land use change. Mean

  19. Fusion energy from the Moon for the twenty-first century

    Science.gov (United States)

    Kulcinski, G. L.; Cameron, E. N.; Santarius, J. F.; Sviatoslavsky, I. N.; Wittenberg, L. J.; Schmitt, Harrison H.

    1992-01-01

    It is shown in this paper that the D-He-3 fusion fuel cycle is not only credible from a physics standpoint, but that its breakeven and ignition characteristics could be developed on roughly the same time schedule as the DT cycle. It was also shown that the extremely low fraction of power in neutrons, the lack of significant radioactivity in the reactants, and the potential for very high conversion efficiencies, can result in definite advantages for the D-He-3 cycle with respect to DT fusion and fission reactors in the twenty-first century. More specifically, the D-He-3 cycle can accomplish the following: (1) eliminate the need for deep geologic waste burial facilities and the wastes can qualify for Class A, near-surface land burial; (2) allow 'inherently safe' reactors to be built that, under the worst conceivable accident, cannot cause a civilian fatality or result in a significant (greater than 100 mrem) exposure to a member of the public; (3) reduce the radiation damage levels to a point where no scheduled replacement of reactor structural components is required, i.e., full reactor lifetimes (approximately 30 FPY) can be credibly claimed; (4) increase the reliability and availability of fusion reactors compared to DT systems because of the greatly reduced radioactivity, the low neutron damage, and the elimination of T breeding; and (5) greatly reduce the capital costs of fusion power plants (compared to DT systems) by as much as 50 percent and present the potential for a significant reduction in the COE. The concepts presented in this paper tie together two of the most ambitious high-technology endeavors of the twentieth century: the development of controlled thermonuclear fusion for civilian power applications and the utilization of outer space for the benefit of mankind on Earth.

  20. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo

    2016-06-01

    This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  2. Twenty-first century probabilistic projections of precipitation over Ontario, Canada through a regional climate model ensemble

    Science.gov (United States)

    Wang, Xiuquan; Huang, Guohe; Liu, Jinliang

    2016-06-01

    In this study, probabilistic projections of precipitation for the Province of Ontario are developed through a regional climate model ensemble to help investigate how global warming would affect its local climate. The PRECIS regional climate modeling system is employed to perform ensemble simulations, driven by a set of boundary conditions from a HadCM3-based perturbed-physics ensemble. The PRECIS ensemble simulations are fed into a Bayesian hierarchical model to quantify uncertain factors affecting the resulting projections of precipitation and thus generate probabilistic precipitation changes at grid point scales. Following that, reliable precipitation projections throughout the twenty-first century are developed for the entire province by applying the probabilistic changes to the observed precipitation. The results show that the vast majority of cities in Ontario are likely to experience increases in annual precipitation in the 2030s, 2050s, and 2080s in comparison to the baseline observations. This may suggest that the whole province is likely to gain more precipitation throughout the twenty-first century in response to global warming. The analyses of the projections of seasonal precipitation further demonstrate that the entire province is likely to receive more precipitation in winter, spring, and autumn throughout this century, while summer precipitation is only likely to increase slightly in the 2030s and would decrease gradually afterwards. However, because the magnitude of the projected decrease in summer precipitation is relatively small in comparison with the anticipated increases in the other three seasons, the annual precipitation over Ontario is likely to show a progressive increase throughout the twenty-first century (by 7.0 % in the 2030s, 9.5 % in the 2050s, and 12.6 % in the 2080s). Besides, the degree of uncertainty in the precipitation projections is analyzed. The results suggest that future changes in spring precipitation show a higher degree of uncertainty than other
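    The final step described above (applying projected percentage changes to the observed baseline precipitation) is the classic delta-change method. A minimal sketch follows; the baseline value is illustrative, while the percentages are the ensemble-mean annual changes quoted in the abstract.

```python
def apply_delta_change(observed_annual, pct_change):
    """Project future precipitation by scaling an observed baseline
    with a projected percentage change (delta-change method)."""
    return observed_annual * (1.0 + pct_change / 100.0)

baseline_mm = 900.0  # illustrative observed annual precipitation (mm)
projections = {period: apply_delta_change(baseline_mm, pct)
               for period, pct in
               [("2030s", 7.0), ("2050s", 9.5), ("2080s", 12.6)]}
# projections["2030s"] is about 963 mm
```

    In the probabilistic setting of the study, `pct_change` would be a distribution (e.g., percentiles from the Bayesian hierarchical model) rather than a single value, yielding a range of projected totals per grid point.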

  3. On the twenty-first-century wet season projections over the Southeastern United States

    Science.gov (United States)

    Selman, Christopher; Misra, Vasu; Stefanova, Lydia; Dinapoli, Steven; Smith, Thomas J.

    2013-01-01

    This paper reconciles the difference in the projections of the wet season over the Southeastern United States (SEUS) from a global climate model (the Community Climate System Model Version 3 [CCSM3]) and from a regional climate model (the Regional Spectral Model [RSM]) nested in the CCSM3. The CCSM3 projects a dipole in the summer precipitation anomaly: peninsular Florida dries in the future climate, and the remainder of the SEUS region becomes wetter. The RSM forced with CCSM3 projects a universal drying of the SEUS in the late twenty-first century relative to the corresponding twentieth-century summer. The CCSM3 pattern is attributed to the “upped-ante” mechanism, whereby the atmospheric boundary layer moisture required for convection increases in a warm, statically stable global tropical environment. This criterion becomes harder to meet along convective margins, which include peninsular Florida, resulting in its drying. CCSM3 also projects a southwestward expansion of the North Atlantic subtropical high that leads to further stabilizing of the atmosphere above Florida, inhibiting convection. The RSM, because of its high (10-km grid) resolution, simulates diurnal variations in summer rainfall over the SEUS reasonably well. The RSM improves upon CCSM3 through its depiction of the diurnal variance of precipitation, which according to observations accounts for up to 40 % of total seasonal precipitation variance. In the future climate, the RSM projects a significant reduction in the diurnal variability of convection. The reduction is attributed to large-scale stabilization of the atmosphere in the CCSM3 projections.
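    The diurnal-variance diagnostic mentioned above (the share of total precipitation variance carried by the mean daily cycle) can be estimated from any sub-daily series. This is a generic sketch of that decomposition, not the paper's exact diagnostic; the function and variable names are illustrative.

```python
import statistics

def diurnal_variance_fraction(values, steps_per_day):
    """Fraction of the total variance of a sub-daily series that is
    explained by its composite (mean) diurnal cycle."""
    n_days = len(values) // steps_per_day
    values = values[:n_days * steps_per_day]  # drop any partial day
    # composite diurnal cycle: average over all days at each step
    cycle = [statistics.mean(values[d * steps_per_day + s]
                             for d in range(n_days))
             for s in range(steps_per_day)]
    total_var = statistics.pvariance(values)
    return statistics.pvariance(cycle) / total_var if total_var else 0.0

# a series whose variability is purely diurnal:
# the mean cycle explains all of the variance (fraction = 1.0)
frac = diurnal_variance_fraction([0.0, 1.0, 0.0, 1.0], steps_per_day=2)
```

    Comparing this fraction between a twentieth-century run and a late-twenty-first-century run is one simple way to quantify the "reduction in diurnal variability" the abstract reports.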

  4. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students

    Science.gov (United States)

    Gokcek, Gigi; Howard, Alison

    2013-01-01

    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  5. Violating Pedagogy: Literary Theory in the Twenty-First Century College Classroom

    Science.gov (United States)

    Johnson, Heather G. S.

    2015-01-01

    "Violating Pedagogy: Literary Theory in the Twenty-first Century College Classroom" discusses the challenge of teaching literary theory to undergraduate and graduate students in a cultural atmosphere that may at times feel simultaneously anti-intellectual and overpopulated with competing scholarly concerns. Approaching theory as a…

  6. Culture, Power, and the University in the Twenty-First Century

    Science.gov (United States)

    Murphy, Peter

    2012-01-01

    Powerful nations have influential systems of higher education. The article explores the possible pattern of geopolitics in the twenty-first century, and the competing prospects of America and its rivals in higher education and research. Pressures on both the American and non-American worlds are evaluated, along with relative economic strengths,…

  7. A Comment on Class Productions in Elite Secondary Schools in Twenty-First-Century Global Context

    Science.gov (United States)

    Weis, Lois

    2014-01-01

    In this closing essay, Lois Weis offers a broad overview of the contributions of this Special Issue on class production in elite secondary schools in the twenty-first-century global context. Drawing upon her own research within US privileged secondary schools, Weis explores the contemporary social, economic and political landscape as connected to…

  8. How Do Students Value the Importance of Twenty-First Century Skills?

    Science.gov (United States)

    Ahonen, Arto Kalevi; Kinnunen, Päivi

    2015-01-01

    Frameworks of twenty-first century skills have attained a central role in school development and curriculum changes all over the world. There is a common understanding of the need for meta-skills such as problem solving, reasoning, collaboration, and self-regulation. This article presents results from a Finnish study, in which 718 school pupils…

  9. Yeast culture collections in the twenty-first century: new opportunities and challenges

    NARCIS (Netherlands)

    Boundy-Mills, Kyria L.; Glantschnig, Ewald; Roberts, Ian N.; Yurkov, Andrey; Casaregola, Serge; Daniel, Heide-Marie; Groenewald, Marizeth; Turchetti, Benedetta

    2016-01-01

    The twenty-first century has brought new opportunities and challenges to yeast culture collections, whether they are long-standing or recently established. Basic functions such as archiving, characterizing and distributing yeasts continue, but with expanded responsibilities and emerging opportunities…

  10. The Five Cs of Digital Curation: Supporting Twenty-First-Century Teaching and Learning

    Science.gov (United States)

    Deschaine, Mark E.; Sharma, Sue Ann

    2015-01-01

    Digital curation is a process that allows university professors to adapt and adopt resources from multidisciplinary fields to meet the educational needs of twenty-first-century learners. Looking through the lens of new media literacy studies (Vasquez, Harste, & Albers, 2010) and new literacies studies (Gee, 2010), we propose that university…

  12. Education for Future-Oriented Citizenship: Implications for the Education of Twenty-First Century Competencies

    Science.gov (United States)

    Lee, Wing On

    2012-01-01

    Globalization and the knowledge economy have opened up worldwide agendas for national development. Following this is the emphasis on the social dimension, otherwise known as social capital. Much of social capital includes "soft skills" and "twenty-first century skills", which broadly cover critical, creative and inventive…

  13. Knowledge and Educational Research in the Context of "Twenty-First Century Learning"

    Science.gov (United States)

    Benade, Leon

    2014-01-01

    Educational researchers and academics cannot ignore the ever-present call for education, and schooling in particular, to reflect the needs of the twenty-first century knowledge economy. Since the 1990s, national curricula and education systems have reflected this call in their focus on technology and shifting pedagogy to increasingly…

  14. Theoretical Contexts and Conceptual Frames for the Study of Twenty-First Century Capitalism

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Morgan, Glenn

    2012-01-01

    This chapter argues that the comparative institutionalist approach requires rethinking in the light of developments in the twenty-first century. The chapter emphasizes the following features of the new environment: first, the rise of the BRIC and the emerging economies; secondly, the changed...

  15. Way Forward in the Twenty-First Century in Content-Based Instruction: Moving towards Integration

    Science.gov (United States)

    Ruiz de Zarobe, Yolanda; Cenoz, Jasone

    2015-01-01

    The aim of this paper is to reflect on the theoretical and methodological underpinnings that provide the basis for an understanding of Content-Based Instruction/Content and Language Integrated Learning (CBI/CLIL) in the field and its relevance in education in the twenty-first century. It is argued that the agenda of CBI/CLIL needs to move towards…

  16. Humanities: The Unexpected Success Story of the Twenty-First Century

    Science.gov (United States)

    Davis, Virginia

    2012-01-01

    Humanities within universities faced challenges in the latter half of the twentieth century as their value in the modern world was questioned. This paper argues that there is strong potential for the humanities to thrive in the twenty-first century university sector. It outlines some of the managerial implications necessary to ensure that this…

  17. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century?

    Science.gov (United States)

    Bintz, Carol

    2016-01-01

    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  18. Thomas Piketty – The Adam Smith of the Twenty-First Century?

    Directory of Open Access Journals (Sweden)

    Jacob Dahl Rendtorff

    2014-11-01

    Piketty’s book, Capital in the Twenty-First Century (2014), has become a bestseller in the world. Two months after its publication, it had sold more than 200,000 copies, and this success will surely continue for a long time. Piketty has established a new platform to discuss political economy.

  19. EXOGENOUS CHALLENGES FOR THE TOURISM INDUSTRY IN THE BEGINNING OF THE TWENTY FIRST CENTURY

    Directory of Open Access Journals (Sweden)

    Akosz Ozan

    2009-05-01

    Tourism is one of the fastest growing industries in the world. Besides its sustained growth, the tourism industry has shown in the first years of the twenty-first century that it can deal with political, military and natural disasters. The present paper ac

  20. Critical Remarks on Piketty's 'Capital in the Twenty-first Century'

    OpenAIRE

    Homburg, Stefan

    2014-01-01

    This paper discusses the central macroeconomic claims that are made in Thomas Piketty's book 'Capital in the Twenty-first Century'. The paper aims to show that Piketty's contentions are not only logically flawed but also contradicted by his own data.

  1. Teaching Middle School Language Arts: Incorporating Twenty-First Century Literacies

    Science.gov (United States)

    Small Roseboro, Anna J.

    2010-01-01

    "Teaching Middle School Language Arts" is the first book on teaching middle school language arts for multiple intelligences and related twenty-first-century literacies in technologically and ethnically diverse communities. More than 670,000 middle school teachers (grades six through eight) are responsible for educating nearly 13 million students…

  2. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills

    Science.gov (United States)

    Morgan, Clara

    2016-01-01

    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  4. Twenty-first Semiannual Report of the Commission to the Congress, January 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    The document represents the twenty-first semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period July - December 1956. A special part two of this semiannual report addresses specifically Radiation Safety in Atomic Energy Activities.

  7. Why American business demands twenty-first century learning: A company perspective.

    Science.gov (United States)

    Knox, Allyson

    2006-01-01

    Microsoft is an innovative corporation demonstrating the kind and caliber of job skills needed in the twenty-first century. It demonstrates its commitment to twenty-first century skills by holding its employees accountable to a set of core competencies, enabling the company to run effectively. The author explores how Microsoft's core competencies parallel the Partnership for 21st Century Skills learning frameworks. Both require advanced problem-solving skills and a passion for technology, both expect individuals to be able to work in teams, both look for a love of learning, and both call for the self-confidence to honestly self-evaluate. Microsoft also works to cultivate twenty-first century skills among future workers, investing in education to help prepare young people for competitive futures. As the need for digital literacy has become imperative, technology companies have taken the lead in facilitating technology training by partnering with schools and communities. Microsoft is playing a direct role in preparing students for what lies ahead in their careers. To further twenty-first century skills, or core competencies, among the nation's youth, Microsoft has established Partners in Learning, a program that helps education organizations build partnerships that leverage technology to improve teaching and learning. One Partners in Learning grantee is Global Kids, a nonprofit organization that trains students to design online games focused on global social issues resonating with civic and global competencies. As Microsoft believes the challenges of competing in today's economy and teaching today's students are substantial but not insurmountable, such partnerships and investments demonstrate Microsoft's belief in and commitment to twenty-first century skills.

  8. Extension and intensification of the Meso-American mid-summer drought in the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Sara A.; Giorgi, Filippo [The Abdus Salam International Centre for Theoretical Physics, Earth System Physics Section, Trieste (Italy); Diffenbaugh, Noah S. [Purdue University, Purdue Climate Change Research Center and Department of Earth and Atmospheric Sciences, West Lafayette, IN (United States); Seth, Anji [University of Connecticut, Department of Geography, Storrs, CT (United States)

    2008-10-15

    Recent global-scale analyses of the CMIP3 model projections for the twenty-first century indicate a strong, coherent decreased precipitation response over Central America and the Intra-America Seas region. We explore this regional response and examine the models' skill in representing present-day climate over this region. For much of Central America, the annual cycle of precipitation is characterized by a rainy season that extends from May to October with a period of reduced precipitation in July and August called the mid-summer drought. A comparison of the climate of the twentieth century simulations (20c3m) with observations over the period 1961-1990 shows that nearly all models underestimate precipitation over Central America, due in part to an underestimation of sea surface temperatures over the tropical North Atlantic and an excessively smooth representation of regional topographical features. However, many of the models capture the mid-summer drought. Differences between the A1B scenario (2061-2090) and 20c3m (1961-1990) simulations show decreased precipitation in the future climate scenario, mostly in June and July, just before and during the onset of the mid-summer drought. We thus hypothesize that the simulated twenty-first century drying over Central America represents an early onset and intensification of the mid-summer drought. An analysis of circulation changes indicates that the westward expansion and intensification of the North Atlantic subtropical high associated with the mid-summer drought occurs earlier in the A1B simulations, along with stronger low-level easterlies. The eastern Pacific inter-tropical convergence zone is also located further southward in the scenario simulations. There are some indications that these changes could be forced by ENSO-like warming of the tropical eastern Pacific and increased land-ocean heating contrasts over the North American continent. (orig.)

  9. New Poetics of the Film Body: Docility, Molecular Fundamentalism and Twenty First Century Destiny

    Directory of Open Access Journals (Sweden)

    Flynn Susan

    2015-06-01

    Twenty-first-century film evokes a new topology of the body. Science and technology are the new century’s ‘sovereign power’, which enforces biopolitics through bodies that, by virtue of being seen at their most fundamental level, have become docile surfaces. The film body is at once manipulated and coerced into an ethos of optimization; a thoroughly scientific and ‘molecular’ optimization which proffers ‘normalization’ and intimately regulated bodies. In the film bodies of this millennium, bodily intervention results in surveillance becoming internalized. Now the body is both a means and an end of social control. This essay applies the philosophies of Michel Foucault and Nikolas Rose to twenty-first-century Hollywood film, elucidating a new tropos, a new film body/body of film.

  10. Proceedings of the twenty-first workshop on geothermal reservoir engineering

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This document contains the Proceedings of the Twenty-First Workshop on Geothermal Reservoir Engineering, held at Stanford University, Stanford, California, USA, January 22-24, 1996. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions, including reservoir assessment, modeling, geology/geochemistry, fracture modeling/hot dry rock, low enthalpy, fluid injection, well testing, drilling, adsorption, and well stimulation.

  11. Twenty-First Century Europe: Emergence of Baltic States into European Alliances

    Science.gov (United States)

    2003-04-07

    The contributions of Estonia, Latvia, and Lithuania ("the Baltic States") to the North Atlantic Treaty Organization (NATO), the European… Estonia is the only country in the world where oil shale is the primary source of energy, supplying over 75 percent of its total power.

  12. Taking Up Space: Museum Exploration in the Twenty-First Century

    Science.gov (United States)

    Sutton, Tiffany

    2007-01-01

    Museums have become a crucible for questions of the role that traditional art and art history should play in contemporary art. Friedrich Nietzsche argued in the nineteenth century that museums can be no more than mausoleums for effete (fine) art. Over the course of the twentieth century, however, curators dispelled such blanket pessimism by…

  14. Twenty first century climatic and hydrological changes over Upper Indus Basin of Himalayan region of Pakistan

    Science.gov (United States)

    Ali, Shaukat; Li, Dan; Congbin, Fu; Khan, Firdos

    2015-01-01

    This study examines both the recent and the predicted twenty-first-century climatic and hydrological changes over the mountainous Upper Indus Basin (UIB), which are influenced by snow and glacier melting. Conformal-Cubic Atmospheric Model (CCAM) data for the periods 1976-2005, 2006-2035, 2041-2070, and 2071-2100 with RCP4.5 and RCP8.5, and Regional Climate Model (RegCM) data for the periods 2041-2050 and 2071-2080 with RCP8.5, are used for climatic projection and, after bias correction, the same data are used as input to the University of British Columbia (UBC) hydrological model for river flow projections. The projections for all of the future periods were compared with the results for 1976-2005 and with each other. Projections of future changes show a consistent increase in air temperature and precipitation. However, the increase in temperature and precipitation is relatively slow during 2071-2100 in contrast with 2041-2070. Northern parts are more likely to experience an increase in precipitation and temperature than the southern parts. A larger increase in temperature is projected during spring and winter over southern parts and during summer over northern parts. Moreover, the increase in minimum temperature is larger in both scenarios for all future periods. Both models (CCAM and RegCM) project river flow to increase in the twenty-first century under both scenarios. However, the rate of increase is larger during the first half of the twenty-first century and relatively small in the second half under RCP4.5. The possible reason for high river flow during the first half of the twenty-first century is the large increase in temperature, which may cause faster melting of snow, while in the last half of the century there is a decreasing trend in river flow, precipitation, and temperature (2071-2100) in comparison to 2041-2070 under RCP4.5. Generally, for all future periods, the percentage of increased river flow is larger in winter than in

  15. Projected seasonal mean summer monsoon over India and adjoining regions for the twenty-first century

    Science.gov (United States)

    Dash, Sushil K.; Mishra, Saroj K.; Pattnayak, Kanhu C.; Mamgain, Ashu; Mariotti, Laura; Coppola, Erika; Giorgi, Filippo; Giuliani, Graziano

    2015-11-01

    In this study, we present the projected seasonal mean summer monsoon over India and adjoining regions for the twenty-first century under the representative concentration pathway (RCP) 4.5 and RCP 8.5 scenarios using the regional model RegCM4 driven by the global model GFDL-ESM2M. RegCM4 is integrated from 1970 to 2099 at 50 km horizontal resolution over the South Asia CORDEX domain. The simulated mean summer monsoon circulation and associated rainfall by RegCM4 are validated against observations in the reference period 1975 to 2004 based on the Global Precipitation Climatology Project (GPCP) and India Meteorological Department (IMD) data sets. Regional model results are also compared with those of the global model GFDL-ESM2M, which forces RegCM4, showing that the regional model in particular improves the simulation of precipitation trends during the reference period. Future projections are categorized as near future (2010-2039), mid future (2040-2069), and far future (2070-2099). Comparison of projected seasonal (June-September) mean rainfall from the different time slices indicates a gradual increase in the intensity of changes over some of the regions under both the RCP4.5 and RCP8.5 scenarios. RegCM4 projected rainfall decreases over most of the Indian land mass and the equatorial and northern Indian Ocean, while it increases over the Arabian Sea, northern Bay of Bengal, and the Himalayas. Results show that the monsoon circulation may become weaker in the future associated with a decrease in rainfall over Indian land points. The RegCM4 projected decrease in June, July, August, September (JJAS) rainfall under the RCP8.5 scenario over the central, eastern, and peninsular India by the end of the century is in the range of 25-40 % of their mean reference period values; it is significant at the 95 % confidence level and it is broadly in line with patterns of observed change in recent decades. Surface evaporation is projected to increase over the Indian Ocean, thereby

  16. Establishing the R&D agenda for twenty-first century learning.

    Science.gov (United States)

    Kay, Ken; Honey, Margaret

    2006-01-01

    An infusion of twenty-first century skills into American public education necessitates a plan for research and development to further such reform. While the nation agrees that students must obtain critical thinking, problem-solving, and communication skills to succeed in the current global marketplace, this chapter puts forth a long-term, proactive agenda to invest in targeted research to propel and sustain this shift in education. The authors examine the impact such an R&D agenda would have on pedagogy and assessment and the implications for institutions of higher education. As the United States struggles to maintain dominance in the international economy, it faces a great challenge in keeping up with European and Asian competitors' strategies for preparing youth for the global marketplace. The authors hope the global reality will help contextualize the debate around American education--the current trend toward basics and accountability needs to be broadened. Building on frameworks created by the Partnership for 21st Century Skills, this chapter proposes questions to guide research around teaching, professional development, and assessment significant to twenty-first century skills. Knowing that educational change depends on providing teachers with the tools, support, and training to make fundamental changes in their practice, the authors argue for extensive research around best practices. In addition, if assessments are created to measure the desired outcomes, such measuring tools can drive reform. Furthermore, large-scale changes in teacher preparation programs must take place to allow teachers to adequately employ twenty-first century teaching and assessment strategies.

  17. Why American business demands twenty-first century skills: an industry perspective.

    Science.gov (United States)

    Bruett, Karen

    2006-01-01

    Public education is the key to individual and business prosperity. With a vested stake in education, educators, employers, parents, policymakers, and the public should question how this nation's public education system is faring. Knowing that recent international assessments have shown little or no gains in American students' achievement, the author asserts the clear need for change. As both a large American corporate employer and a provider of technology for schools, Dell is concerned with ensuring that youth will thrive in their adult lives. Changing workplace expectations lead to a new list of skills students will need to acquire before completing their schooling. Through technology, Dell supports schools in meeting educational goals, striving to supply students with the necessary skills, referred to as twenty-first century skills. The Partnership for 21st Century Skills, of which Dell is a member, has led an initiative to define what twenty-first century learning should entail. Through extensive research, the partnership has built a framework outlining twenty-first century skills: analytical thinking, communication, collaboration, global awareness, and technological and economic literacy. Dell and the partnership are working state by state to promote the integration of these skills into curricula, professional development for teachers, and classroom environments. The authors describe two current initiatives, one in Virginia, the other in Texas, which both use technology to help student learning. All stakeholders can take part in preparing young people to compete in the global economy. Educators and administrators, legislators, parents, and employers must play their role in helping students be ready for what the workforce and the world has in store for them.

  18. Managing the twenty-first century reference department challenges and prospects

    CERN Document Server

    Katz, Linda S

    2014-01-01

    Learn the skills needed to update and manage a reference department that efficiently meets the needs of clients today and tomorrow! Managing the Twenty-First Century Reference Department: Challenges and Prospects provides librarians with the knowledge and skills they need to manage an effective reference service. Full of useful and practical ideas, this book presents successful methods for recruiting and retaining capable reference department staff and management, training new employees and adapting current services to an evolving field. Expert practitioners address the changing role of the r

  19. Digital images and art libraries in the twenty-first century

    CERN Document Server

    Wyngaard, Susan

    2013-01-01

    Increase your knowledge of the digital technology that is essential for art librarianship today! Digital Images and Art Libraries in the Twenty-First Century is your key to cutting-edge discourse on digital image databases and art libraries. Just as early photographers tried to capture the world to make it accessible, now information professionals in art libraries and art museums are creating and sharing digital collections to make them broadly accessible. This collection shares the experience and insight of art information managers who have taken advantage of digital technology to exp

  20. Mid-Twentieth Century Modern Dance in the Twenty-first Century

    OpenAIRE

    Puleio, Dante

    2017-01-01

    The relevance of modern dance being performed today has been a growing topic in the dance field as legacy companies age and the field of contemporary dance continues to expand. This thesis begins with critical response to mid-century modern dance in the work of well-known dance critics John Martin, Edwin Denby and Louis Horst and how they substantiated modern dance’s place in dance history. My interviews with dance critics Alastair...

  1. Authority in the Virtual Sangat: Sikhism, Ritual and Identity in the Twenty-First Century

    OpenAIRE

    Jakobsh, Doris R.

    2006-01-01

    In her paper Authority in the Virtual Sangat: Sikhism, Ritual and Identity in the Twenty-First Century, Doris Jakobsh analyses the change of authority based on her research on Sikhs on the Internet. She stresses the Web as a ‘third place’ of communication among the Sikhs, as well as the phenomenon of new authorities online. However, this does not imply the replacement of the traditional seats of authority, the Akal Takht, SGPC, or gurdwara managements, but one can recognize a significant shift ...

  2. A perspective of dental education in the twenty-first century.

    Science.gov (United States)

    Allen, D L; Finnegan, W N

    1989-04-01

    With the beginning of the twenty-first century only twelve years away, it is clearly not too early for dental educators to be planning for it. Comparing the practice of dentistry and the nature of dental education in 1888 with that of 1988, and then trying to project what they will be like in the year 2088, is a difficult task indeed. As Alvin Toffler has pointed out in his book Future Shock, we are in an ever-escalating period of change. Dentistry and dental education will change much more in the next one hundred years than they have in the last one hundred years.

  3. The economics of urologic practice in the twenty-first century.

    Science.gov (United States)

    Holtgrewe, H L

    1998-02-01

    Our nation's health care has been undergoing an economic revolution. The practice of urology has not escaped. Speculation regarding the economics of urologic practice in the twenty-first century must be based on the continuation of policy trends begun over the last decade by government and managed care coupled with the impact of changes in the science of urology, shifting population demographics, and changing social factors. The hallmarks of the years ahead include increased physician accountability, expanded use of clinical care guidelines, continued relentless penetration of managed care, and reduced reimbursement for surgical services.

  4. Rethinking Individual Authorship: Robert Burns, Oral Tradition, and the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Ruth Knezevich

    2011-10-01

    Full Text Available The songs of late-eighteenth-century Scottish poet Robert Burns provide a rich case study of literature that challenges existing notions of the author as an autonomous entity. Responding to twenty-first-century examples of contested issues of intellectual property and plagiarism in an age of digital media, this project illustrates the ways in which precepts of oral tradition can inform our thinking about cultural production within contexts seemingly permeated by ever-present literacy or text-based thinking in order to provide a new outlook on such situations of artistic borrowing or “plagiarism.”

  5. Ocean heat transport into the Arctic in the twentieth and twenty-first century in EC-Earth

    Science.gov (United States)

    Koenigk, Torben; Brodeau, Laurent

    2014-06-01

    The ocean heat transport into the Arctic and the heat budget of the Barents Sea are analyzed in an ensemble of historical and future climate simulations performed with the global coupled climate model EC-Earth. The zonally integrated northward heat flux in the ocean at 70°N is strongly enhanced and compensates for a reduction of its atmospheric counterpart in the twenty-first century. Although an increase in the northward heat transport occurs through all of Fram Strait, the Canadian Archipelago, Bering Strait and the Barents Sea Opening, it is the latter that dominates the increase in ocean heat transport into the Arctic. The increased temperature of the northward-transported Atlantic water masses is the main reason for the enhancement of the ocean heat transport. The natural variability in the heat transport into the Barents Sea is caused to the same extent by variations in temperature and volume transport. Large ocean heat transports lead to reduced ice and higher atmospheric temperature in the Barents Sea area and are related to the positive phase of the North Atlantic Oscillation. The net ocean heat transport into the Barents Sea grows until about year 2050. Thereafter, both heat and volume fluxes out of the Barents Sea through the section between Franz Josef Land and Novaya Zemlya are strongly enhanced and compensate for all further increase in the inflow through the Barents Sea Opening. Most of the heat transported by the ocean into the Barents Sea is passed to the atmosphere and contributes to warming of the atmosphere and Arctic temperature amplification. Latent and sensible heat fluxes are enhanced. Net surface long-wave and solar radiation are enhanced upward and downward, respectively, and almost compensate each other. We find that the changes in the surface heat fluxes are mainly caused by the vanishing sea ice in the twenty-first century. The increasing ocean heat transport leads to enhanced bottom ice melt and to an extension of the area with bottom ice

  6. Bits, Bytes and Dinosaurs: Using Levinas and Freire to Address the Concept of "Twenty-First Century Learning"

    Science.gov (United States)

    Benade, Leon

    2015-01-01

    The discourse of twenty-first century learning argues that education should prepare students for successful living in the twenty-first century workplace and society. It challenges all educators with the idea that contemporary education is unable to do so, as it is designed to replicate an industrial age model, essentially rear-focused, rather than…

  7. Long-term climate response to stabilized and overshoot anthropogenic forcings beyond the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Tsutsui, Junichi; Yoshida, Yoshikatsu; Kim, Dong-Hoon; Kitabata, Hideyuki; Nishizawa, Keiichi; Nakashiki, Norikazu; Maruyama, Koki [Central Research Institute of Electric Power Industry, Chiba (Japan)

    2007-02-15

    From multi-ensembles of climate simulations using the Community Climate System Model version 3, global climate changes have been investigated, focusing on long-term responses to stabilized anthropogenic forcings. In addition to the standard forcing scenarios for the current international assessment, an overshoot scenario, in which radiative forcings are decreased from one stabilized level to another, is also considered. The globally averaged annual surface air temperature increases during the twenty-first century by 2.58 and 1.56 C for increased forcings under the two future scenarios denoted A1B and B1, respectively. These changes continue, but at much slower rates, in later centuries under forcings stabilized at year 2100. The overshoot scenario provides a different pathway to the lower B1 level by way of the greater A1B level. This scenario results in a surface climate similar to that in the B1 scenario within 100 years after the forcing reaches the B1 level. In contrast to the surface changes, responses in the ocean are significantly delayed. It is estimated from linear response theory that temperature changes under stabilized forcings to a final equilibrium state in the A1B (B1) scenario are factors of 0.3-0.4, 0.9, and 17 (0.3, 0.6, and 11) of the changes during the twenty-first century for three ocean layers: the surface to 100 m, 100-500 m, and 500 m to the bottom. Although responses in the lower ocean layers imply nonlinear behavior, the ocean temperatures in the overshoot and B1 scenarios are likely to converge in their final equilibrium states. (orig.)

  8. Secondary batteries taking flight in the twenty-first century; 21seiki ni habataku nijidenchi

    Energy Technology Data Exchange (ETDEWEB)

    Takehara, Z. [kansai University, Osaka (Japan)

    1999-12-25

    Development of secondary batteries offering high safety, high power, and high energy density will advance in the twenty-first century, with both environmental and resource considerations in mind. High-voltage batteries will be advantageous, and their development will be led by the lithium secondary battery. Batteries using manganese-based compounds as the cathode active material were the first in practical use, and development will extend to iron-based compounds. For the electrolyte, metallic lithium is used as the negative-electrode active material, and an organic polymer thin film containing ion clusters active in non-aqueous solvents will be developed. A secondary battery that confines only the power-generation function inside the container and keeps the active material outside is suitable for large-scale, large-capacity storage; batteries of this configuration include the redox-flow battery, which holds its active material in a liquid electrolyte, and the fuel cell, which uses a gaseous active material. Considering the fields into which battery applications will expand in the twenty-first century, development of lithium secondary batteries using new materials will proceed in competition with development of fuel cells, redox-flow batteries, and the like. (NEDO)

  9. Northern African climate at the end of the twenty-first century: an integrated application of regional and global climate models

    Energy Technology Data Exchange (ETDEWEB)

    Patricola, Christina M. [Cornell University, Department of Earth and Atmospheric Sciences, Ithaca, NY (United States); Cook, Kerry H. [The University of Texas at Austin, Jackson School of Geosciences, Austin, TX (United States)

    2010-07-15

    A method for simulating future climate on regional space scales is developed and applied to northern Africa. Simulation with a regional model allows for the horizontal resolution needed to resolve the region's strong meridional gradients and the optimization of parameterizations and the land-surface model. The control simulation is constrained by reanalysis data and realistically represents the present-day climate. Atmosphere-ocean general circulation model (AOGCM) output provides SST and lateral boundary condition anomalies for 2081-2100 under a business-as-usual emissions scenario, and the atmospheric CO2 concentration is increased to 757 ppmv. A nine-member ensemble of future climate projections is generated by using output from nine AOGCMs. The consistency of precipitation projections for the end of the twenty-first century is much greater for the regional model ensemble than among the AOGCMs. More than 77% of ensemble members produce the same sign of rainfall anomaly over much of northern Africa. For West Africa, the regional model projects wetter conditions in spring, but a mid-summer drought develops during June and July, and the heat stroke risk increases across the Sahel. Wetter conditions resume in late summer, and the likelihood of flooding increases. The regional model generally projects wetter conditions over eastern Central Africa in June and drying during August through September. Severe drought impacts parts of East Africa in late summer. Conditions become wetter in October, but the enhanced rainfall does not compensate for the summertime deficit. The risk of heat stroke increases over this region, although the threat is not projected to be as great as in the Sahel. (orig.)

  10. Toward a Social Psychology of Race and Race Relations for the Twenty-First Century.

    Science.gov (United States)

    Richeson, Jennifer A; Sommers, Samuel R

    2016-01-01

    The United States, like many nations, continues to experience rapid growth in its racial minority population and is projected to attain so-called majority-minority status by 2050. Along with these demographic changes, staggering racial disparities persist in health, wealth, and overall well-being. In this article, we review the social psychological literature on race and race relations, beginning with the seemingly simple question: What is race? Drawing on research from different fields, we forward a model of race as dynamic, malleable, and socially constructed, shifting across time, place, perceiver, and target. We then use classic theoretical perspectives on intergroup relations to frame and then consider new questions regarding contemporary racial dynamics. We next consider research on racial diversity, focusing on its effects during interpersonal encounters and for groups. We close by highlighting emerging topics that should top the research agenda for the social psychology of race and race relations in the twenty-first century.

  11. Globalisation and Social Imaginaries: The Changing Ideological Landscape of the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Manfred B. Steger

    2009-01-01

    Full Text Available The proliferation of prefixes like ‘neo’ and ‘post’ that adorn conventional ‘isms’ has cast a long shadow on the contemporary relevance of traditional political belief systems like liberalism, conservatism, and Marxism. This article explores how the thickening of global consciousness finds its expression in the growing capability of today’s political ideologies to translate the rising global imaginary into concrete political programs and agendas. But these subjective dynamics of denationalization at the heart of globalisation have not yet dispensed with the declining national imaginary. The twenty-first century promises to be an ideational interregnum in which both the global and national stimulate people’s deep-seated understandings of community. The essay also offers a rough outline and basic features of a new classification scheme that divides contemporary political ideologies into ‘market globalism’, ‘justice globalism’, and ‘religious globalism’.

  12. A Farewell to Innocence? African Youth and Violence in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Charles Ugochukwu Ukeje

    2012-12-01

    Full Text Available This is a broad examination of the issue of youth violence in twenty-first-century Africa, looking at the context within which a youth culture of violence has evolved and attempting to understand the underlying discourses of hegemony and power that drive it. The article focuses specifically on youth violence as a political response to the dynamics of (dis)empowerment, exclusion, and economic crisis, and uses (post)conflict states like Liberia, Sierra Leone, and Nigeria to explain not just the overall challenge of youth violence but also the nature of responses that it has elicited from established structures of authority. Youth violence is in many ways an expression of youth agency in the context of a social and economic system that provides little opportunity.

  13. Global threats from invasive alien species in the twenty-first century and national response capacities

    Science.gov (United States)

    Early, Regan; Bradley, Bethany A.; Dukes, Jeffrey S.; Lawler, Joshua J.; Olden, Julian D.; Blumenthal, Dana M.; Gonzalez, Patrick; Grosholz, Edwin D.; Ibañez, Ines; Miller, Luke P.; Sorte, Cascade J. B.; Tatem, Andrew J.

    2016-01-01

    Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion. PMID:27549569

  14. Little Dorrit’s Fourth Volume. Twenty-first Century Remediation of a Victorian Classic

    Directory of Open Access Journals (Sweden)

    Simonetta Falchi

    2014-11-01

    Full Text Available The current interest in the Victorian period is particularly evident in the multitude of successful period dramas and cinema productions deriving their inspiration from Victorian history and culture. Dickens’s Little Dorrit, generated by the difficulties of understanding a ‘new’ world dominated by ever more complicated machines and markets, is possibly the ideal paradigm for exploring this ‘Victorianomania’, because the novel strikes us “no less forcefully today in its indictment of society's ability to destroy through greed and crushing self-interest” (Kirschner 2009). This study carries out a threefold analysis of Little Dorrit's remediation in the twenty-first century: visual remediation - Xue’s 2012 Little Dorrit; audio-visual remediation - the BBC series; and web remediation - fan fiction - in order to investigate Dickens’s appeal and longevity in contemporary media.

  16. The Worldwide Significance of Chinese Aesthetics in the Twenty-First Century

    Institute of Scientific and Technical Information of China (English)

    Liu Qingping

    2006-01-01

    Through comparisons between traditional Chinese and Western aesthetics, this article explains the worldwide significance of the Chinese aesthetic tradition in the twenty-first century. In contrast to the cognitive-rational spirit of traditional Western aesthetics and its tendency to distinguish subject from object, traditional Chinese aesthetics shows a distinctive practical-emotional spirit and a tendency to unite human beings harmoniously with nature. It holds that beauty is, first and foremost, a free state or way (Dao) of human life, and that the most important thing for human beings is how to make their own lives and existence beautiful. It therefore puts forward persuasive and valuable insights into beauty and art, playing an independent and constructive role in the intercultural aesthetic dialogues of the twenty-first century.

  17. Evolution and modulation of tropical heating from the last glacial maximum through the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Hoyos, Carlos D.; Webster, Peter J. [Georgia Institute of Technology, School of Earth and Atmospheric Sciences, Atlanta, GA (United States)

    2012-04-15

    Twentieth-century observations show that during the last 50 years the sea-surface temperature (SST) of the tropical oceans has increased by ~0.5 C, and the area of SST >26.5 and 28 C (arbitrarily referred to as the oceanic warm pool: OWP) by 15 and 50% respectively, in association with increasing greenhouse gas concentrations, with poorly understood natural variability, or with a combination of both. Based on CMIP3 projections, the OWP is projected to double during the twenty-first century in a moderate CO2 forcing scenario (IPCC A1B scenario). However, during the observational period the area of positive atmospheric heating (referred to as the dynamic warm pool, DWP) has remained constant. The threshold SST (T_H), which demarks the region of net heating and cooling, has increased from 26.6 C in the 1950s to 27.1 C in the last decade, and it is projected to increase to ~28.5 C by 2100. Based on climate model simulations, the area of the DWP is projected to remain constant during the twenty-first century. Analysis of the Paleoclimate Model Intercomparison Project (PMIP I and II) simulations for the Last Glacial Maximum and Mid-Holocene periods shows very similar behaviour, with a larger OWP in periods of elevated tropical SST and an almost constant DWP associated with a varying T_H. The constancy of the DWP area, despite shifts in the background SST, is shown to be the result of a near-exact match between increases in the integrated convective heating within the DWP and the integrated radiative cooling outside the DWP as SST changes. Although the area of the DWP remains constant, the total tropical atmospheric heating is a strong function of the SST. For example, the net heating has increased by about 10% from 1950 to 2000, and it is projected to increase by a further 20% by 2100. Such changes must be compensated by a more vigorous atmospheric circulation, with growth in convective heating within the warm pool, and an

  18. New and newer [The New Physics for the Twenty-First Century]

    Energy Technology Data Exchange (ETDEWEB)

    Clark, C. [Electron and Optical Physics Division, National Institute of Standards and Technology, MD (United States)]. E-mail: clark@mail.nist.gov

    2006-09-15

    Stephen Hawking's inaugural lecture as Lucasian Professor of Mathematics at Cambridge University in 1980 caused quite a stir. Its title - 'Is the end in sight for theoretical physics?' - raised the prospect of a unified 'theory of everything'. Hawking suggested that there was a good chance of resolving the remaining inconsistencies between the two big 'theories of something' - quantum mechanics and general relativity - before the turn of the century. My first impression on reading The New Physics for the Twenty-First Century, a collection of essays edited by science journalist Gordon Fraser, is that a theory of everything may still be attainable by the turn of the century. However, there is now 20 times more of everything in the universe than there was in the past century, 95% of which no-one has ever actually seen, or had even heard of until a few years ago - as summarized in articles by Wendy Freedman, Edward Kolb and Ronald Adler. Despite this, Michael Green describes amazing developments in string theory that could tie everything together, if one could just figure out which, if any, of the apparently infinite varieties of string theory applies to our world, and why. (U.K.)

  19. Indication to Open Anatrophic Nephrolithotomy in the Twenty-First Century: A Case Report

    Directory of Open Access Journals (Sweden)

    Alfredo Maria Bove

    2012-01-01

    Full Text Available Introduction. Advances in endourology have greatly reduced the indications for open surgery in the treatment of staghorn kidney stones. Nevertheless, in our experience open surgery still represents the treatment of choice in rare cases. Case Report. A 71-year-old morbidly obese female patient, complaining of occasional left flank pain and recurrent cystitis for many years, presented with bilateral staghorn kidney stones. Comorbidities were obesity (BMI 36.2), hypertension, type II diabetes, chronic obstructive pulmonary disease (COPD), and hyperlipidemia. Due to these comorbidities, endoscopic and laparoscopic approaches were not indicated. We offered the patient staged open anatrophic nephrolithotomy. Results. Operative time was 180 minutes. Blood loss was 500 cc, requiring one unit of packed red blood cells. Hospital stay was 7 days. The renal function was unaffected based on preoperative and postoperative serum creatinine levels. Stone-free status of the left kidney was confirmed after surgery with a CT scan. Conclusions. Open surgery can represent a valid alternative in the treatment of staghorn kidney stones in very selected cases. A discussion of the current indications in the twenty-first century is presented.

  20. Ukrainian Issues in Geopolitical thought of the Twentieth and Twenty-First Centuries

    Directory of Open Access Journals (Sweden)

    Reginia-Zacharski Jacek

    2016-12-01

    Full Text Available Ukrainian lands in the twentieth and twenty-first centuries have several times stood in the proximity of great geopolitical changes. During that time the Ukrainian nation, due to various factors, encountered a number of “windows of opportunity” for realizing its dreams of independence and national sovereignty. The author identifies four such “general moments” in the period considered, of which two ended successfully. The first of these occurred in 1990–1991, when for the first time in modern history Ukrainians managed to achieve a lasting and relatively stable independence. The second of the “moments”, still unresolved, comprises the events that began in the late autumn of 2013. The process, called the “Revolution of Dignity”, represents a new quality in the history of the Ukrainian nation, in that Ukrainians now have to defend the status quo (independence, territorial integrity, sovereignty, etc.) rather than seek to achieve an independent state. The analysis leads to the conclusion that the ability of Ukrainians to achieve and maintain independence is largely a function of the relative power of the Russian state, as measured against the shape and quality of international relations.

  1. Emerging Tick-Borne Viruses in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Karen L. Mansfield

    2017-07-01

    Full Text Available Ticks, as a group, are second only to mosquitoes as vectors of pathogens to humans, and are the primary vector for pathogens of livestock, companion animals, and wildlife. The role of ticks in the transmission of viruses has been known for over 100 years, and yet new pathogenic viruses are still being detected and known viruses are continually spreading to new geographic locations. Partly as a result of their novelty, understanding of tick-virus interactions is at an early stage; for some viruses, even the principal tick vector is not known. It is likely that tick-borne viruses will continue to emerge and challenge public and veterinary health long into the twenty-first century. However, studies focusing on tick saliva, a critical component of tick feeding and virus transmission and a target for control of ticks and tick-borne diseases, point toward solutions to emerging viruses. The aim of this review is to describe some currently emerging tick-borne diseases and their causative viruses, and to discuss research on virus-tick interactions. Through focus on this area, future protein targets for intervention and vaccine development may be identified.

  2. Effects of long-term variability on projections of twenty-first century dynamic sea level

    Science.gov (United States)

    Bordbar, Mohammad H.; Martin, Thomas; Latif, Mojib; Park, Wonsun

    2015-04-01

    Sea-level rise is one of the most pressing aspects of anthropogenic global warming, with far-reaching consequences for coastal societies. However, sea-level rise has varied, and will continue to vary, strongly from coast to coast. Here we investigate the effects of long-term internal variability on centennial projections of dynamic sea level (DSL), the local departure from the globally averaged sea level. A large ensemble of global warming integrations has been conducted with a climate model, where each realization was forced by an identical CO2 increase but started from different atmospheric and oceanic initial conditions. In large parts of the mid- and high latitudes, the ensemble spread of the projected centennial DSL trends is of the same order of magnitude as the globally averaged steric sea-level rise, suggesting that internal variability cannot be ignored when assessing twenty-first-century DSL trends. The ensemble spread is considerably reduced in the mid- to high latitudes when only the atmospheric initial conditions differ while the oceanic initial state is kept identical, indicating that centennial DSL projections are strongly dependent on ocean initial conditions.

  3. The Third Revolution: Philosophy into Practice in Twenty-first Century Psychiatry

    Directory of Open Access Journals (Sweden)

    KWM (Bill) Fulford

    2008-12-01

    Full Text Available Three revolutions in psychiatry characterised the closing decade of the twentieth century: (1) in the neurosciences, (2) in patient-centred models of service delivery, and (3) in the emergence of a rapidly expanding new cross-disciplinary field of philosophy and psychiatry. Starting with a case history, the paper illustrates the impact of this third revolution - the new philosophy of psychiatry - on day-to-day clinical practice through training programmes and policy developments in what has become known as values-based practice. Derived from philosophical value theory and phenomenology, values-based practice is a partner to evidence-based practice in supporting clinical decision-making in the highly complex environment of mental health care. The paper concludes by setting values-based practice in context with other potentially important practical areas of the new philosophy of psychiatry, arguing that all three revolutions need to be brought together if psychiatry is to meet the challenges of the twenty-first century.

  4. Waving or Drowning? Perceptions of Second Wave Feminism Through a Twenty-First Century Lens

    OpenAIRE

    Bralesford, Helen Margaret

    2006-01-01

    This dissertation sets out to explore some twenty-first century perceptions of second wave feminism with as little mediation from the academy as possible, by employing the lens of popular culture to tease out and examine some of the assumptions about the second wave that have become culturally embedded at a grass-roots level. The first chapter takes Betty Friedan's seminal text The Feminine Mysti...

  5. Civil engineering at the crossroads in the twenty-first century.

    Science.gov (United States)

    Ramírez, Francisco; Seco, Andres

    2012-12-01

    The twenty-first century presents a major challenge for civil engineering. The magnitude and future importance of some of the problems perceived by society are directly related to the field of the civil engineer, implying an inescapable burden of responsibility for a group whose technical soundness, rational approach and efficiency are highly valued and respected by the citizen. However, the substantial changes in society and in the way it perceives the problems that it considers important call for a thorough review of our structures, both professional and educational, so that our profession, with its undeniable historical prestige, may modernize certain approaches and attitudes in order to continue to be a reliable instrument in the service of society, giving priority from an ethical standpoint to its actions in pursuit of "the public good". It possesses important tools to facilitate this work (new technologies, the development of communications, the transmission of scientific thought…); but there is nevertheless a need for deep reflection on the very essence of civil engineering: what we want it to be in the future, and the ability and willingness to take the lead at a time when society needs disinterested messages, technically supported, reasonably presented and dispassionately transmitted.

  6. Santiago: Modernisation, segregation and urban identities in the twenty-first century

    Directory of Open Access Journals (Sweden)

    Francisca Márquez

    2011-01-01

    Full Text Available This paper discusses research carried out in Santiago, Chile, and addresses the origin and construction of urban identities in this segregated city of the twenty-first century. Based on sociological and ethnographic evidence, urban identity building processes are analysed by observing the occupation, use and appropriation of territory. The hypothesis is that, despite evidence of segregation, modernisation and globalisation, urban people reinvent lifestyles within their territories in order to harmonise their bonds of affection and belonging, by using distinguishing markings or “brands” and by adopting typical everyday habits. The modern, segregated and global city is filled with “islands” that convey imagery and desires for a friendlier urban life. This paper analyses areas with community identities, neo-community identities and border identities. It suggests that, just as community identities shelter nostalgia for a lost community (by finding refuge or reinventing ways to make the fringes of the city habitable in the background or on the “other side” of the Mapocho River, very near the historical centre of the city), border identities have also arisen and persisted; these subvert the orderly and hegemonic city, resulting in a diverse, heterogeneous and multicultural lifestyle. The result is a synthesis and an urban lifestyle.

  7. Waving or Drowning? Perceptions of Second Wave Feminism Through a Twenty-First Century Lens

    OpenAIRE

    Bralesford, Helen Margaret

    2006-01-01

    This dissertation sets out to explore some twenty-first century perceptions of second wave feminism with as little mediation from the academy as possible by employing the lens of popular culture to tease out and examine some of the assumptions about the second wave that have become culturally embedded at a grass roots level. The first chapter takes Betty Friedan's seminal text The Feminine Mysti...

  8. Diasporic Reparations: Repairing the Social Imaginaries of Central America in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Ana Patricia Rodríguez

    2013-06-01

    Full Text Available Contemporary Central American diasporic writers like Horacio Castellanos Moya, Francisco Goldman, Héctor Tobar, and Marcos McPeek Villatoro, in Senselessness (2008), The Art of Political Murder: Who Killed the Bishop? (2007), The Tattooed Soldier (1998), and the Romilia Chacón detective series, write in response to various forms of violence. They grapple with the image of Central America as a site of unsustainable violence, inhospitable material conditions, and unresolved historical issues that extend into the lives of Central Americans in the United States. The past is not easily dismissed, but lies at the core of transnational Central American subject formation. This essay examines how violence and impunity are closely tied in Central American diasporic texts and hold cognitive relevancy for Central Americans in and outside of the isthmus. While US Central Americans seek to understand the origins and conditions of their diaspora, writers reflect critically on Central American historiography, diaspora, and the construction of transnational “Centroamericanidades” in the twenty-first century. These writers engage in a literature of reparation that reveals the (im)possibility of repairing and re-writing or righting the past in societies where violence and impunity have been institutionalized.

  9. Between vanguard and exclusion- young people of the twenty-first century

    Directory of Open Access Journals (Sweden)

    Agnieszka Gil

    2011-12-01

    Full Text Available This study has been narrowed down to reveal a paradox. Here the vanguard of culture and civilization – regarded as the young people of the twenty-first century – is embroiled in a discourse of exclusion from economic, political and cultural life. In secondary school and high school programmes we do not find specific references and studies, grounded primarily in the needs of students, on the theory of popular culture and cultural education in the area of pop culture. The paradox of excluding mainstream culture from educational discourse is schizophrenic. As the political exclusion of young people of the twenty-first century I count all the disparaging scientific discourse that overlooks the actual media and communication competence of young people. Prosumers, the cognitariat, digital natives, the C-generation – for the modern “Silicon Valley” economy their market power is already impossible to exclude. In other areas it remains to be considered whether excluding young people from the cultural discourse will not deprive our future of the teachers and translators of the next civilizational revolution in social reality...

  10. SpaceNet: Modeling and Simulating Space Logistics

    Science.gov (United States)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.
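
    The campaign structure described above (missions composed of elements, with nesting and cargo sharing) can be sketched as a toy data model. This is an illustrative sketch only, not the actual SpaceNet API; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field

# Illustrative sketch (not the real SpaceNet API): a campaign as an ordered
# set of missions whose elements can nest other elements and carry cargo.
@dataclass
class Element:
    name: str
    dry_mass_kg: float
    cargo_capacity_kg: float = 0.0
    nested: list = field(default_factory=list)  # elements carried inside this one

    def total_mass(self) -> float:
        return self.dry_mass_kg + sum(e.total_mass() for e in self.nested)

@dataclass
class Mission:
    name: str
    elements: list

    def launch_mass(self) -> float:
        return sum(e.total_mass() for e in self.elements)

@dataclass
class Campaign:
    missions: list

    def total_launch_mass(self) -> float:
        return sum(m.launch_mass() for m in self.missions)

# Toy lunar example: a lander carrying one nested supply pallet.
lander = Element("lander", dry_mass_kg=2000.0, cargo_capacity_kg=500.0)
lander.nested.append(Element("supply_pallet", dry_mass_kg=300.0))
campaign = Campaign([Mission("lunar_sortie_1", [lander])])
print(campaign.total_launch_mass())  # 2300.0
```

    Modeling a campaign (rather than single missions) then amounts to aggregating over the mission list, which is what the nested-element upgrade in the paper enables.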

  12. Investigating the pace of temperature change and its implications over the twenty-first century

    Science.gov (United States)

    Chavaillaz, Y.; Joussaume, S.; Braconnot, P.; Vautard, R.

    2015-12-01

    In most studies, climate change is approached by focusing on the evolution between a fixed current baseline and the future, emphasizing stronger warming as we move further from the current climate. Under climate conditions that are continuously evolving, human systems might have to constantly adapt to a changing target. We propose here an alternative approach, and consider indicators of the pace of temperature change and its effects on temperature distributions estimated from projections of an ensemble of 18 General Circulation Models. The pace is represented by a rate defined by the difference between two subsequent 20-year periods. Under the strongest emission pathway (RCP 8.5), the warming rate strongly increases over the twenty-first century, with a maximum reached before 2080. Whilst northern high latitudes witness the highest temperature rise, all other latitudes show at least a doubling of the warming rate compared to the current period. The spatial extent of significant shifts in annual temperature distributions between two subsequent 20-year periods is projected to be at least four times larger than in the current period. They are mainly located in tropical areas, such as West Africa and South-East Asia. The fraction of the world population exposed to these shifts grows from 8% to 60% from around 2060 onwards, i.e. reaching 6 billion people. In contrast, low mitigation measures (RCP 6.0) are sufficient to keep the warming rate similar to current values. Under the medium mitigation pathway (RCP 4.5), population exposure to significant shifts drops to negligible values by the end of the century. Strong mitigation measures (RCP 2.6) are the only option that generates a global return to historical conditions regarding our indicators. Considering the pace of change can bring an alternative way to interact with climate impacts and adaptation communities.
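
    The pace indicator described above (the difference in mean temperature between two subsequent 20-year periods) can be sketched in a few lines. The function name and the synthetic linear trend are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Sketch of the pace indicator: warming rate as the difference in mean
# temperature between consecutive non-overlapping 20-year windows.
def warming_rate(annual_temps, window=20):
    """Return rates (K per window) between subsequent 20-year periods."""
    t = np.asarray(annual_temps, dtype=float)
    n = len(t) // window                      # number of complete windows
    means = t[: n * window].reshape(n, window).mean(axis=1)
    return np.diff(means)

# Synthetic example: a linear 0.04 K/yr trend gives ~0.8 K per 20-year step.
years = np.arange(2000, 2100)
temps = 14.0 + 0.04 * (years - 2000)
print(warming_rate(temps))  # approximately [0.8, 0.8, 0.8, 0.8]
```

    Applying this rate to each grid point of an ensemble member, rather than to a fixed baseline, is what distinguishes the pace approach from conventional warming maps.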

  13. Latvian Security and Defense Policy within the Twenty-First Century Security Environment

    Directory of Open Access Journals (Sweden)

    Rublovskis Raimonds

    2014-12-01

    Full Text Available The aim of this paper is to analyze the fundamental factors which form and profoundly shape the security and defense policy of the Republic of Latvia. One can argue that historical background, geographical location, a common institutional history within the former Soviet Union, the Russia factor, the relative smallness of the territory and population of the state, the ethnic composition of the population, the low density of the population and the rather limited financial and manpower resources available for the defense of the Republic of Latvia are the key factors influencing state security and defense policy. The core principles of the security and defense policy of Latvia are membership in the powerful global military alliance of NATO and bilateral strategic partnership with the United States. However, security and defense cooperation among the three Baltic States, as well as enhanced cooperation within the Baltic-Nordic framework, is seen as an important supplementary factor for the increased security of the Republic of Latvia. Latvia has developed a sustainable legal and institutional framework in order to contribute to state security and defense; however, security challenges and significant changes within the global security environment of the twenty-first century will further challenge the ability of the Republic of Latvia to sustain its current legal framework and, more importantly, the current institutional structure of Latvian security and defense architecture. Significant internal and external challenges will impact the fundamental pillars of Latvian security and defense policy, such as the American strategic shift to the Pacific and the lack of political will to increase defense budgets in the European part of NATO. It has to be clear that the very independence, security and defense of the Republic of Latvia depend on the ability of NATO to remain an effective organization with timely and efficient decision-making, and the ability of the United States to remain

  14. Yeast culture collections in the twenty-first century: new opportunities and challenges.

    Science.gov (United States)

    Boundy-Mills, Kyria L; Glantschnig, Ewald; Roberts, Ian N; Yurkov, Andrey; Casaregola, Serge; Daniel, Heide-Marie; Groenewald, Marizeth; Turchetti, Benedetta

    2016-07-01

    The twenty-first century has brought new opportunities and challenges to yeast culture collections, whether they are long-standing or recently established. Basic functions such as archiving, characterizing and distributing yeasts continue, but with expanded responsibilities and emerging opportunities. In addition to a number of well-known, large public repositories, there are dozens of smaller public collections that differ in the range of species and strains preserved, field of emphasis and services offered. Several collections have converted their catalogues to comprehensive databases and synchronize them continuously through public services, making it easier for users worldwide to locate a suitable source for specific yeast strains and the data associated with these yeasts. In-house research such as yeast taxonomy continues to be important at culture collections. Because yeast culture collections preserve a broad diversity of species and strains within a species, they are able to make discoveries in many other areas as well, such as biotechnology, functional, comparative and evolutionary genomics, bioprocesses and novel products. Due to the implementation of the Convention on Biological Diversity (CBD) and the Nagoya Protocol (NP), there are new requirements for both depositors and users to ensure that yeasts were collected following proper procedures and to guarantee that the country of origin will be considered if benefits arise from a yeast's utilization. Intellectual property rights (IPRs) are extremely relevant to the current access and benefit-sharing (ABS) mechanisms; most research and development involving genetic resources and associated traditional knowledge will be subject to this topic. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Gendering inequality: a note on Piketty's Capital in the twenty-first century.

    Science.gov (United States)

    Perrons, Diane

    2014-12-01

    Thomas Piketty's Capital in the Twenty-First Century is remarkable for moving inequality from the margins to mainstream debate through detailed analysis of longitudinal statistics and, for an economist, by advocating an interdisciplinary perspective and writing in a witty and accessible style. With reference to the post 1970 period, when wage increases are largely responsible for the increase in inequality, Piketty shows how patrimonial capitalists (elite managers) in the top decile and centile of the distribution appropriate a growing share of social wealth as a consequence of their 'power to set their own remuneration' in the context of tolerant social norms rather than through their productive contributions. Piketty raises but defers the question of where these social norms come from to other disciplines. A Feminist Economics perspective indicates that these questions are central to a more inclusive form of economic analysis and such an approach would enrich Piketty's analysis in two main ways. First, by paying greater attention to the processes and social norms through which inequalities are produced and justified and second by highlighting the ways in which inequality is experienced differently depending not only on class, but also on other aspects of identity including gender. This approach also suggests that it is necessary to supplement the ex-post redistributive policies recommended by Piketty: a global wealth tax and more steeply progressive income tax, with ex-ante measures to stop the rise in wage inequality in the first place, especially by bridging the huge gulf that exists between those who care for people and those who manage money.

  16. Preparation of European Public Health Professionals in the Twenty-first Century

    Science.gov (United States)

    Bjegovic-Mikanovic, Vesna; Otok, Robert

    2017-01-01

    The public health profession in Europe has a leadership role in ensuring Europeans' health in the twenty-first century and therefore must assume responsibility for advancing education for research and practice. Three fundamental questions are explored: (1) What are the main public health problems facing public health professionals? (2) What are their existing competencies after training? (3) What competencies do European employers expect? The European Schools of Public Health assessed their best success to be in the field of health promotion, followed by disease prevention including identification of priority health problems, and elimination of health hazards in the community. Conversely, they see the least success in dealing with preparedness and planning for public health emergencies. From an employer's perspective, significant gaps between current and desired levels of performance on the job exist for all Essential Public Health Operations of the World Health Organization. Based on prior research and recent European surveys of Schools and Departments of Public Health, the following recommendations are made, which emphasize the leadership role of the European public health community: (1) the preparation of public health professionals requires an interface between public health functions, competencies, and performance; (2) competence-based education is important and allows debates on the scope of the required education; (3) governments have to realize that the present lack of infrastructure and capacity is detrimental to the people's health; (4) as public health challenges are increasingly global, educational institutions have to look beyond national boundaries and participate in European and global networks for education, research, and practice. PMID:28261578

  17. Towards regional projections of twenty-first century sea-level change based on IPCC SRES scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Slangen, A.B.A.; Wal, R.S.W. van de [Utrecht University, Institute for Marine and Atmospheric research Utrecht, Utrecht (Netherlands); Katsman, C.A. [Royal Netherlands Meteorological Institute (KNMI), P.O. Box 201, De Bilt (Netherlands); Vermeersen, L.L.A. [TU Delft, Faculty of Aerospace Engineering, Delft (Netherlands); Riva, R.E.M. [TU Delft, Delft (Netherlands)

    2012-03-15

    Sea-level change is often considered to be globally uniform in sea-level projections. However, local relative sea-level (RSL) change can deviate substantially from the global mean. Here, we present maps of twenty-first century local RSL change estimates based on an ensemble of coupled climate model simulations for three emission scenarios. In the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4), the same model simulations were used for their projections of global mean sea-level rise. The contribution of the small glaciers and ice caps to local RSL change is calculated with a glacier model, based on a volume-area approach. The contributions of the Greenland and Antarctic ice sheets are obtained from IPCC AR4 estimates. The RSL distribution resulting from the land ice mass changes is then calculated by solving the sea-level equation for a rotating, elastic Earth model. Next, we add the pattern of steric RSL changes obtained from the coupled climate models and a model estimate for the effect of Glacial Isostatic Adjustment. The resulting ensemble mean RSL pattern reveals that many regions will experience RSL changes that differ substantially from the global mean. For the A1B ensemble, local RSL change values range from -3.91 to 0.79 m, with a global mean of 0.47 m. Although the RSL amplitude differs, the spatial patterns are similar for all three emission scenarios. The spread in the projections is dominated by the distribution of the steric contribution, at least for the processes included in this study. Extreme ice loss scenarios may alter this picture. For individual sites, we find a standard deviation for the combined contributions of approximately 10 cm, regardless of emission scenario. (orig.)
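
    The budget logic of the study (local relative sea-level change as the sum of land-ice, steric and GIA contributions, each with its own spatial pattern) can be illustrated with a toy grid. The field values below are invented, chosen only so that the toy global mean matches the 0.47 m A1B figure quoted above.

```python
import numpy as np

# Hedged sketch of the RSL budget described above (field names hypothetical):
# local relative sea-level change as the sum of spatially varying contributions.
def local_rsl(glacier, greenland, antarctica, steric, gia):
    """Each input is a 2-D field (lat x lon) in metres; returns their sum."""
    return glacier + greenland + antarctica + steric + gia

shape = (3, 4)  # toy lat x lon grid
# Invented uniform contributions (m); real fields vary strongly in space.
fields = [np.full(shape, v) for v in (0.10, 0.05, 0.04, 0.25, 0.03)]
rsl = local_rsl(*fields)
print(rsl.mean())  # approximately 0.47
```

    In the actual study each term carries its own pattern (gravitational fingerprints for the ice terms, model-derived steric fields, a GIA model), so the sum deviates regionally from the global mean even though the budget is a simple addition.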

  18. Twenty-first century science as a relational process: from eureka! to team science and a place for community psychology.

    Science.gov (United States)

    Tebes, Jacob Kraemer; Thai, Nghi D; Matlin, Samantha L

    2014-06-01

    In this paper we maintain that twenty-first century science is, fundamentally, a relational process in which knowledge is produced (or co-produced) through transactions among researchers or among researchers and public stakeholders. We offer an expanded perspective on the practice of twenty-first century science, the production of scientific knowledge, and what community psychology can contribute to these developments. We argue that: (1) trends in science show that research is increasingly being conducted in teams; (2) scientific teams, such as transdisciplinary teams of researchers or of researchers collaborating with various public stakeholders, are better able to address complex challenges; (3) transdisciplinary scientific teams are part of the larger, twenty-first century transformation in science; (4) the concept of heterarchy is a heuristic for team science aligned with this transformation; (5) a contemporary philosophy of science known as perspectivism provides an essential foundation to advance twenty-first century science; and (6) community psychology, through its core principles and practice competencies, offers theoretical and practical expertise for advancing team science and the transformation in science currently underway. We discuss the implications of these points and illustrate them briefly with two examples of transdisciplinary team science from our own work. We conclude that a new narrative is emerging for science in the twenty-first century that draws on interpersonal transactions in teams, and active engagement by researchers with the public to address critical accountabilities. Because of its core organizing principles and unique blend of expertise on the intersection of research and practice, community psychologists are well-prepared to help advance these developments, and thus have much to offer twenty-first century science.

  19. Agriculture in West Africa in the Twenty-First Century: Climate Change and Impacts Scenarios, and Potential for Adaptation.

    Science.gov (United States)

    Sultan, Benjamin; Gaetani, Marco

    2016-01-01

    West Africa is known to be particularly vulnerable to climate change due to high climate variability, high reliance on rain-fed agriculture, and limited economic and institutional capacity to respond to climate variability and change. In this context, better knowledge of how climate will change in West Africa and how such changes will impact crop productivity is crucial to inform policies that may counteract the adverse effects. This review paper provides a comprehensive overview of climate change impacts on agriculture in West Africa based on the recent scientific literature. West Africa is currently experiencing rapid climate change, characterized by widespread warming, a recovery of the monsoonal precipitation, and an increase in the occurrence of climate extremes. The observed climate tendencies are also projected to continue in the twenty-first century under moderate and high emission scenarios, although large uncertainties still affect simulations of the future West African climate, especially regarding the summer precipitation. However, despite diverging future projections of the monsoonal rainfall, which is essential for rain-fed agriculture, robust evidence of yield loss in West Africa emerges. This yield loss is mainly driven by increased mean temperature, while potentially wetter or drier conditions as well as elevated CO2 concentrations can modulate this effect. Potential for adaptation is illustrated for major crops in West Africa through a selection of studies based on process-based crop models to adjust cropping systems (change in varieties, sowing dates and density, irrigation, fertilizer management) to future climate. Results of the cited studies are crop and region specific and no clear conclusions can be made regarding the most effective adaptation options. Further efforts are needed to improve modeling of the monsoon system and to better quantify the uncertainty in its changes under a warmer climate, in the response of the crops to such

  20. Uncertainty in twenty-first century projections of the Atlantic Meridional Overturning Circulation in CMIP3 and CMIP5 models

    Science.gov (United States)

    Reintges, Annika; Martin, Thomas; Latif, Mojib; Keenlyside, Noel S.

    2017-09-01

    Uncertainty in the strength of the Atlantic Meridional Overturning Circulation (AMOC) is analyzed in the Coupled Model Intercomparison Project Phase 3 (CMIP3) and Phase 5 (CMIP5) projections for the twenty-first century; and the different sources of uncertainty (scenario, internal and model) are quantified. Although the uncertainty in future projections of the AMOC index at 30°N is larger in CMIP5 than in CMIP3, the signal-to-noise ratio is comparable during the second half of the century and even larger in CMIP5 during the first half. This is due to a stronger AMOC reduction in CMIP5. At lead times longer than a few decades, model uncertainty dominates uncertainty in future projections of AMOC strength in both the CMIP3 and CMIP5 model ensembles. Internal variability significantly contributes only during the first few decades, while scenario uncertainty is relatively small at all lead times. Model uncertainty in future changes in AMOC strength arises mostly from uncertainty in density, as uncertainty arising from wind stress (Ekman transport) is negligible. Finally, the uncertainty in changes in the density originates mostly from the simulation of salinity, rather than temperature. High-latitude freshwater flux and the subpolar gyre projections were also analyzed, because these quantities are thought to play an important role for the future AMOC changes. The freshwater input in high latitudes is projected to increase and the subpolar gyre is projected to weaken. Both the freshening and the gyre weakening likely influence the AMOC by causing anomalous salinity advection into the regions of deep water formation. While the high model uncertainty in both parameters may explain the uncertainty in the AMOC projection, deeper insight into the mechanisms for AMOC is required to reach a more quantitative conclusion.
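
    The scenario/internal/model decomposition described above can be sketched with a toy ensemble. The array layout and all numbers below are invented for illustration; the actual study works with CMIP multi-model output.

```python
import numpy as np

# Toy partition of projection uncertainty into scenario, model and internal
# components, in the spirit of the decomposition described above.
rng = np.random.default_rng(0)
# amoc[scenario, model, realization]: projected AMOC change (Sv) at one lead time
amoc = (rng.normal(-4.0, 1.0, size=(3, 1, 1))     # scenario-dependent signal
        + rng.normal(0.0, 2.0, size=(1, 10, 1))   # model-dependent offset
        + rng.normal(0.0, 0.5, size=(3, 10, 5)))  # internal variability

model_mean = amoc.mean(axis=2)                # average out internal variability
scenario_var = model_mean.mean(axis=1).var()  # spread across scenarios
model_var = model_mean.var(axis=1).mean()     # spread across models, per scenario
internal_var = amoc.var(axis=2).mean()        # spread across realizations
total = scenario_var + model_var + internal_var
print({k: round(v / total, 2)
       for k, v in {"scenario": scenario_var, "model": model_var,
                    "internal": internal_var}.items()})
```

    With the invented spreads above, the model term dominates, mirroring the paper's finding that model uncertainty dominates AMOC projections beyond the first few decades.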

  2. 78 FR 52605 - Announcing the Twenty First Public Meeting of the Crash Injury Research and Engineering Network...

    Science.gov (United States)

    2013-08-23

    .... Researchers can review data and share expertise, which may lead to a better understanding of crash injury... National Highway Traffic Safety Administration Announcing the Twenty First Public Meeting of the Crash... Meeting of members of the Crash Injury Research and Engineering Network. CIREN is a collaborative...

  3. Transformative Pedagogy, Leadership and School Organisation for the Twenty-First-Century Knowledge-Based Economy: The Case of Singapore

    Science.gov (United States)

    Dimmock, Clive; Goh, Jonathan W. P.

    2011-01-01

    Singapore has a high performing school system; its students top international tests in maths and science. Yet while the Singapore government cherishes its world class "brand", it realises that in a globally competitive world, its schools need to prepare students for the twenty-first-century knowledge-based economy (KBE). Accordingly,…

  4. Insights into Finnish First-Year Pre-Service Teachers' Twenty-First Century Skills

    Science.gov (United States)

    Valtonen, Teemu; Sointu, Erkko Tapio; Kukkonen, Jari; Häkkinen, Päivi; Järvelä, Sanna; Ahonen, Arto; Näykki, Piia; Pöysä-Tarhonen, Johanna; Mäkitalo-Siegl, Kati

    2017-01-01

    This study focuses on Finnish pre-service teachers' perceptions of their twenty-first century skills, especially their learning strategies, collaboration and teamwork, as well as knowledge and attitudes related to ICT in education. The target group consist of 263 first-year pre-service teachers from three universities. The results outline how…

  5. From School to Cafe and Back Again: Responding to the Learning Demands of the Twenty-First Century

    Science.gov (United States)

    McWilliam, Erica

    2011-01-01

    This paper traces the historical origins of formal and informal lifelong learning to argue that optimal twenty-first-century education can and should draw on the traditions of both the school and the coffee house or cafe. For some time now, educational policy documents and glossy school brochures have come wrapped in the mantle of lifelong…

  6. A Shift to Inquiry: The Heart of Effective Teaching and Professional Development for the Twenty-First Century

    Science.gov (United States)

    Bartolini, Vicki; Worth, Karen; Jensen LaConte, Judy E.

    2014-01-01

    This article explores how an experienced teacher navigates the demands of curriculum to implement her inquiry-centered teaching and learning philosophy, and how administrators along the way supported her during this change. Interviews with this classroom teacher surface suggestions for twenty-first-century professional development and support,…

  7. Towards regional projections of twenty-first century sea-level change based on IPCC SRES scenarios

    NARCIS (Netherlands)

    Slangen, A.B.A.; Katsman, C.A.; Van de Wal, R.S.W.; Vermeersen, L.L.A.; Riva, R.E.M.

    2012-01-01

    Sea-level change is often considered to be globally uniform in sea-level projections. However, local relative sea-level (RSL) change can deviate substantially from the global mean. Here, we present maps of twenty-first century local RSL change estimates based on an ensemble of coupled climate model

  8. Thinking Like Twenty-First Century Learners: An Exploration of Blog Use in a Skills-Based Counselor Education Course

    Science.gov (United States)

    Buono, Lisa L.

    2011-01-01

    Twenty-first century learners and millennial generation students have integrated technology into their personal lives; there is a growing expectation for technology to be integrated into their classroom experiences as well. Incorporating technology, including the use of blogs, into teaching and learning is receiving attention in the literature.…

  9. The Twenty-First Century and Legal Studies in Business: Preparing Students to Perform in a Globally Competitive Environment

    Science.gov (United States)

    Burke, Debra D.; Johnson, Ronald A.; Kemp, Deborah J.

    2010-01-01

    This article first examines the dynamic role business education must play in a flat world economy. Second, it explains how legal courses in the business curricula already equip students with portable twenty-first-century skills and relevant academic content. The article then advocates the acceptance of the Boyer Model of Scholarship, which defines…

  10. Teaching and Learning for the Twenty-First Century: Educational Goals, Policies, and Curricula from Six Nations

    Science.gov (United States)

    Reimers, Fernando M., Ed.; Chung, Connie K., Ed.

    2016-01-01

    This book describes how different nations have defined the core competencies and skills that young people will need in order to thrive in the twenty-first-century, and how those nations have fashioned educational policies and curricula meant to promote those skills. The book examines six countries--Chile, China, India, Mexico, Singapore, and the…

  11. Science Teacher Education in the Twenty-First Century: A Pedagogical Framework for Technology-Integrated Social Constructivism

    Science.gov (United States)

    Barak, Miri

    2017-01-01

    Changes in our global world have shifted the skill demands from acquisition of structured knowledge to mastery of skills, often referred to as twenty-first century competencies. Given these changes, a sequential explanatory mixed methods study was undertaken to (a) examine predominant instructional methods and technologies used by teacher…

  12. Pressure effects on regional mean sea level trends in the German Bight in the twenty-first century

    Science.gov (United States)

    Albrecht, Frauke; Weisse, Ralf

    2014-05-01

    The effect of large-scale atmospheric pressure changes on regional mean sea level projections in the German Bight in the twenty-first century is considered. A statistical model is developed and applied to climate model data of sea level pressure for the twenty-first century to assess the potential contribution of large-scale atmospheric changes to future sea level changes in the German Bight. Using 78 experiments, an ensemble mean rise in regional mean sea level of 1.4 cm is estimated by the end of the twenty-first century. Changes are somewhat higher for realisations of the special report on emission scenarios (SRES) A1B and A2, but generally do not exceed a few centimeters. This is considerably smaller than the changes expected from steric and self-gravitational effects. Large-scale changes in sea level pressure are thus not expected to provide a substantial contribution to twenty-first century sea level changes in the German Bight.
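
    The statistical-model approach described above (fit a relation between large-scale sea level pressure and regional mean sea level, then apply it to projected pressure fields) can be sketched as a simple least-squares regression. All data here are synthetic, and the two-predictor setup is an assumption for illustration, not the paper's actual model.

```python
import numpy as np

# Sketch of statistical downscaling: regress regional mean sea level (MSL)
# on large-scale sea-level-pressure (SLP) predictors, then apply the fit
# to a projected pressure anomaly. Everything here is synthetic.
rng = np.random.default_rng(1)
slp = rng.normal(0.0, 5.0, size=(50, 2))          # two SLP predictor indices, hPa
coef_true = np.array([0.4, -0.2])                 # cm per hPa (invented)
msl = slp @ coef_true + rng.normal(0.0, 0.3, 50)  # "observed" regional MSL, cm

coef, *_ = np.linalg.lstsq(slp, msl, rcond=None)  # fit the statistical model
future_slp = np.array([[2.0, -1.0]])              # projected pressure anomaly
print(future_slp @ coef)                          # projected MSL contribution, cm
```

    The paper's small ensemble-mean contribution (about 1.4 cm) reflects the fact that the projected pressure anomalies feeding such a regression are themselves modest.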

  13. Impact of the Twenty-First Century Afterschool Program on Student Achievement in Mathematics and Language Arts

    Science.gov (United States)

    Venzen, Marc A.

    2011-01-01

    The purpose of this quantitative study was to examine the academic impacts of the Twenty-First Century Community Learning Centers on students who participated in this program. The following research questions guided the study: (a) are there significant differences between the Criterion Reference Competency Test English Language Arts scores of…

  14. Thinking Like Twenty-First Century Learners: An Exploration of Blog Use in a Skills-Based Counselor Education Course

    Science.gov (United States)

    Buono, Lisa L.

    2011-01-01

    Twenty-first century learners and millennial generation students have integrated technology into their personal lives; there is a growing expectation for technology to be integrated into their classroom experiences as well. Incorporating technology, including the use of blogs, into teaching and learning is receiving attention in the literature.…

  15. Essential Soft Skills for Success in the Twenty-First Century Workforce as Perceived by Business Educators

    Science.gov (United States)

    Mitchell, Geana W.; Skinner, Leane B.; White, Bonnie J.

    2010-01-01

    Background: Soft skills describe career attributes that individuals should possess, such as team skills, communication skills, ethics, time-management skills, and an appreciation for diversity. In the twenty-first century workforce, soft skills are important in every business sector. However, employers in business continuously report that new…

  16. Science Teacher Education in the Twenty-First Century: a Pedagogical Framework for Technology-Integrated Social Constructivism

    Science.gov (United States)

    Barak, Miri

    2016-01-01

    Changes in our global world have shifted the skill demands from acquisition of structured knowledge to mastery of skills, often referred to as twenty-first century competencies. Given these changes, a sequential explanatory mixed methods study was undertaken to (a) examine predominant instructional methods and technologies used by teacher educators, (b) identify attributes for learning and teaching in the twenty-first century, and (c) develop a pedagogical framework for promoting meaningful usage of advanced technologies. Quantitative and qualitative data were collected via an online survey, personal interviews, and written reflections with science teacher educators and student teachers. Findings indicated that teacher educators do not provide sufficient models for the promotion of reform-based practice via web 2.0 environments, such as Wikis, blogs, social networks, or other cloud technologies. Findings also indicated four attributes for teaching and learning in the twenty-first century: (a) adapting to frequent changes and uncertain situations, (b) collaborating and communicating in decentralized environments, (c) generating data and managing information, and (d) releasing control by encouraging exploration. Guided by social constructivist paradigms and twenty-first century teaching attributes, this study suggests a pedagogical framework for fostering meaningful usage of advanced technologies in science teacher education courses.

  17. A Model for Reform. Two-Year Colleges in the Twenty-First Century: Breaking Down Barriers (TYC21).

    Science.gov (United States)

    Palmer, James C., Ed.

    This book describes the TYC21 project (Two-Year Colleges in the Twenty-First Century: Breaking Down Barriers), which provided a framework to implement reform in science, engineering, and physics education at two-year colleges via the cooperative efforts of faculty in cross-educational activities. The project sought to increase the quality of…

  18. Transformative Pedagogy, Leadership and School Organisation for the Twenty-First-Century Knowledge-Based Economy: The Case of Singapore

    Science.gov (United States)

    Dimmock, Clive; Goh, Jonathan W. P.

    2011-01-01

    Singapore has a high performing school system; its students top international tests in maths and science. Yet while the Singapore government cherishes its world class "brand", it realises that in a globally competitive world, its schools need to prepare students for the twenty-first-century knowledge-based economy (KBE). Accordingly, over the past…

  19. Index to the Twenty-first Semiannual Report of the Commission to the Congress. July 1956 - December 1956

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    This volume contains a name and subject index for the twenty-first semiannual report of the United States Atomic Energy Commission to Congress. The full semiannual report covers the major unclassified activities of the Commission from July 1956 through December 1956.

  20. Predicting climate change impacts on native and invasive tree species using radial growth and twenty-first century climate scenarios

    NARCIS (Netherlands)

    González-Muñoz, N.; Linares, J.C.; Castro-Díez, P.; Sass-Klaassen, U.G.W.

    2014-01-01

    The climatic conditions predicted for the twenty-first century may aggravate the extent and impacts of plant invasions, by favouring those invaders more adapted to altered conditions or by hampering the native flora. We aim to predict the fate of native and invasive tree species in the oak forests o

  1. Science Teacher Education in the Twenty-First Century: a Pedagogical Framework for Technology-Integrated Social Constructivism

    Science.gov (United States)

    Barak, Miri

    2017-04-01

    Changes in our global world have shifted the skill demands from acquisition of structured knowledge to mastery of skills, often referred to as twenty-first century competencies. Given these changes, a sequential explanatory mixed methods study was undertaken to (a) examine predominant instructional methods and technologies used by teacher educators, (b) identify attributes for learning and teaching in the twenty-first century, and (c) develop a pedagogical framework for promoting meaningful usage of advanced technologies. Quantitative and qualitative data were collected via an online survey, personal interviews, and written reflections with science teacher educators and student teachers. Findings indicated that teacher educators do not provide sufficient models for the promotion of reform-based practice via web 2.0 environments, such as Wikis, blogs, social networks, or other cloud technologies. Findings also indicated four attributes for teaching and learning in the twenty-first century: (a) adapting to frequent changes and uncertain situations, (b) collaborating and communicating in decentralized environments, (c) generating data and managing information, and (d) releasing control by encouraging exploration. Guided by social constructivist paradigms and twenty-first century teaching attributes, this study suggests a pedagogical framework for fostering meaningful usage of advanced technologies in science teacher education courses.

  2. Predicting climate change impacts on native and invasive tree species using radial growth and twenty-first century climate scenarios

    NARCIS (Netherlands)

    González-Muñoz, N.; Linares, J.C.; Castro-Díez, P.; Sass-Klaassen, U.G.W.

    2014-01-01

    The climatic conditions predicted for the twenty-first century may aggravate the extent and impacts of plant invasions, by favouring those invaders more adapted to altered conditions or by hampering the native flora. We aim to predict the fate of native and invasive tree species in the oak forests

  3. From School to Cafe and Back Again: Responding to the Learning Demands of the Twenty-First Century

    Science.gov (United States)

    McWilliam, Erica

    2011-01-01

    This paper traces the historical origins of formal and informal lifelong learning to argue that optimal twenty-first-century education can and should draw on the traditions of both the school and the coffee house or cafe. For some time now, educational policy documents and glossy school brochures have come wrapped in the mantle of lifelong…

  4. The Art of Negotiation: What the Twenty-First Century Business Student Should Know

    Science.gov (United States)

    McClendon, Bill; Burke, Debra D.; Willey, Lorrie

    2010-01-01

    Negotiation skills are vital for concluding international treaties on subjects ranging from arms agreements and rights in outer space to trade agreements. Yet the importance of being able to negotiate effectively is not limited to international treaties or crisis situations. Using negotiation exercises represents a student-centered approach to…

  6. Spinning Straw into Gold: A Community College Library's Twenty First Century Transformation

    Science.gov (United States)

    McKay, Devin

    2011-01-01

    This article describes a library renovation project in a community college that involved using existing space and reorganizing it to support the way students learn both individually and collaboratively. The article also looks at the importance of the academic library as place.

  8. Engineers and maintenance managers in the Australian coal-mining industry: are we ready for the twenty-first century?

    Energy Technology Data Exchange (ETDEWEB)

    Clark, D. [Terotechnology Services Pty Ltd., Cessnock, NSW (Australia)

    2000-10-01

    Coal mines in the twenty-first century will require engineers to respond to arguably the greatest challenge in the history of the Australian coal industry. The article discusses the roles, skills, qualifications and experience required of mine maintenance managers. The author draws on a 1990 survey of the skills and experience perceived to be required of maintenance managers, and on a 1996 analysis of the requirements for such personnel drawn from situations-vacant advertisements. 4 refs., 1 tab.

  9. Twenty-first century learning in school systems: the case of the Metropolitan School District of Lawrence Township, Indianapolis, Indiana.

    Science.gov (United States)

    Capuano, Marcia; Knoderer, Troy

    2006-01-01

    To empower students with skills such as information and technological literacy, global awareness and cultural competence, self-direction, and sound reasoning, teachers must master these skills themselves. This chapter examines how the Digital Age Literacy Initiative of the Metropolitan School District of Lawrence Township in Indianapolis, Indiana, which is funded by the Lilly Endowment, incorporated twenty-first century learning through a systemic approach involving teacher training and the use of data. The authors explain the district's content, process, and context goals toward accomplishing its mission of empowering students with the necessary twenty-first century skills to succeed in the digital age. The district places a strong emphasis on professional development for teachers. To support the necessary teacher learning and therefore sustain the work of the initiative, the district has adopted action research, self-assessment, and an online professional development network. To support teachers in implementing new strategies, master teachers serve as digital age literacy coaches. The chapter discusses the initiative's focus on evidence of progress. Through a partnership with the Metiri Group of California, the district has built a range of assessments including online inventories and twenty-first century skill rubrics. For example, the Mankato Survey collected teacher and student data around access, ability, and use of technology in the classroom in 2001 and then in 2004. This research showed significant gains in some technologies across all grade levels and consistent gains in nearly all technologies for middle and high school students. As it moves into the next phase of implementing the Digital Age Literacy Initiative, the district embraces the systemic shifts in school culture necessary to institutionalize twenty-first century learning.

  10. Transnational Circulations of "Laban" Methods: Gender, Power Relations, and Negotiated Meanings in Early Twenty-First Century South Korea's Modernity

    OpenAIRE

    Hwang, Hye-Won

    2013-01-01

    This dissertation investigates western-developed "Laban" methods that middle-class Korean female Laban specialists transported to South Korea and, there, tactically adapted to South Korean contexts during the 1990s and the early twenty-first century. It particularly focuses on how these Korean women's repurposings of "Laban" methods intersect with conditions of global capitalism and specific South Korean cultural politics, job markets, and dance instruction and employment networks. I claim th...

  11. Gravity's ghost and big dog scientific discovery and social analysis in the twenty-first century

    CERN Document Server

    Collins, Harry

    2013-01-01

    Gravity's Ghost and Big Dog brings to life science's efforts to detect cosmic gravitational waves. These ripples in space-time are predicted by general relativity, and their discovery will not only demonstrate the truth of Einstein's theories but also transform astronomy. Although no gravitational wave has ever been directly detected, the previous five years have been an especially exciting period in the field. Here sociologist Harry Collins offers readers an unprecedented view of gravitational wave research and explains what it means for an analyst to do work of this kind.

  12. A Painter's View of the Cosmos In the Twenty-first Century

    Science.gov (United States)

    Cro-Ken, K.

    2016-01-01

    I am an ecosystem artist who uses paint to bring nature's “invisible forces” into view. My eco-sensitive palette recreates the push-pull forces that shape and mold all things. As a result, I create microscopic and telescopic views of earth and places scattered throughout our universe. Self-similarity led me to realize that if I want my mind to wonder into the far reaches of the universe, I must draw closer to nature. I show how space looks and appears and, more importantly, how it moves. My speed element palette is a portal through which I peer into the universe at scales great and small using paint as my lens. Microscopes, telescopes, the Internet, and even eyeglasses are portals through which technology affords us the ability to see that which is unseen to the unaided eye. Rather than see the world and then paint, the opposite is true for me. My work is revelatory, not representational and, as such, seeks similar occurrences in nature. Just as a planet's surface is a visual record of past events, so too do speed element experiments reveal traces of the past. It would be more accurate to call a painting that comes to rest a “painted.” It is video that captures images that eluded capture by the canvas and could more accurately be called a “painting.” Simply put, I manipulate space, time, and matter—and the matter is never just paint.

  13. Numerical Propulsion System Simulation for Space Transportation

    Science.gov (United States)

    Owen, Karl

    2000-01-01

    Current system simulations are mature, difficult to modify, and poorly documented. Probabilistic life prediction techniques for space applications are in their early application stage. Many parts of the full system, variable fidelity simulation, have been demonstrated individually or technology is available from aeronautical applications. A 20% reduction in time to design with improvements in performance and risk reduction is anticipated. GRC software development will proceed with similar development efforts in aeronautical simulations. Where appropriate, parallel efforts will be encouraged/tracked in high risk areas until success is assured.

  14. Ditching Simulation of Air and Space Vehicles

    Science.gov (United States)

    Patel, Mahesh; Mouillet, Jean-Baptiste; Burkhalter, Drew; Robert, Adrien; Schwoertzig, Thierry

    2014-06-01

    The impact on water of an aircraft or a re-entry space vehicle is a very complex event and is considered an important issue for the air and space industry. To ensure the safety of the crew and to limit the risks of loss of the vehicle, a prediction of its structural behaviour under various ditching configurations must be performed. Structural tests are very costly and must be limited in scale or number, so numerical simulations may be of great help for this purpose. Numerical simulations aim to predict the trajectory of the vehicle under impact, the pressure repartition on the body, structural stress, possible damage to the structure, and occupant 'g' levels during impact. Physically, two types of configurations involving different phenomena can be identified: vertical impacts, and impacts with high horizontal components, where air entrapment, ventilation and cavitation can be the dimensioning factors. The purpose of this paper is to give an overview of the features of ALE (Arbitrary Lagrangian-Eulerian) transient dynamic explicit simulation methods used to perform such simulations. The paper details analysis of the critical simulation parameters, fluid dynamic calculations, CPU and model size reduction techniques, and fluid-structure contact modelling, with examples of such simulations and correlation to physical tests using the explicit finite element based code RADIOSS from Altair Engineering. Two examples of re-entry and ditching numerical simulation are discussed in this paper, with comparisons to physical test data.

  15. An experimental public: heterogeneous groups defining embodiment in the early twenty-first century.

    Science.gov (United States)

    Laki, Julia

    2014-01-01

    In this paper, I take a look at certain forms of contemporary art as practices that allow meanings within biomedical science and medical practice to emerge in novel ways. I claim that conceptual art and biological art are two unique spaces within which the understanding of embodiment and disease comes to be shaped actively and reflexively, sometimes on the very level of the materiality of the body, sometimes through the articulation and representation of medical images and technologies. I link these developments to Paul Rabinow's notion of biosociality and argue that the molecularization and geneticization of the medical gaze, conjoined with certain social and cultural shifts, results in the formation of an experimental public of artists, scientists and lay people, all invested in actively shaping the conceptualization of bodies and diseases. This will take me to a consideration of the intertwining of art and medicine beyond the domain of the visual.

  16. How we made professionalism relevant to twenty-first century residents.

    Science.gov (United States)

    Khandelwal, Aditi; Nugus, Peter; Elkoushy, Mohamed A; Cruess, Richard L; Cruess, Sylvia R; Smilovitch, Mark; Andonian, Sero

    2015-01-01

    The complexity of the current medical trainee work environment, including the impact of social media participation, is underappreciated. Despite rapid adoption of social media by residents and the introduction of social media guidelines targeted at medical professionals, there is a paucity of data evaluating practical methods to incorporate social media into professionalism teaching curricula. We developed a flipped classroom program focusing on the application of professionalism principles to challenging real-life scenarios, including social media-related issues. The pre-workshop evaluation showed that the participants had a good understanding of basic professionalism concepts. A post-workshop survey assessed residents' comfort level with professionalism concepts and revealed that the postgraduate trainees perceived significant improvement in their understanding of professionalism. The flipped classroom format, in combination with simulation-based sessions, allows easy incorporation of contemporary professionalism issues surrounding social media.

  17. Impact of climate change on mid-twenty-first century growing seasons in Africa

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kerry H.; Vizy, Edward K. [The University of Texas at Austin, Department of Geological Sciences, Jackson School of Geosciences, Austin, TX (United States)

    2012-12-15

    Changes in growing seasons for 2041-2060 across Africa are projected using a regional climate model at 90-km resolution, and confidence in the predictions is evaluated. The response is highly regional over West Africa, with decreases in growing season days up to 20% in the western Guinean coast and some regions to the east experiencing 5-10% increases. A longer growing season up to 30% in the central and eastern Sahel is predicted, with shorter seasons in parts of the western Sahel. In East Africa, the short rains (boreal fall) growing season is extended as the Indian Ocean warms, but anomalous mid-tropospheric moisture divergence and a northward shift of Sahel rainfall severely curtails the long rains (boreal spring) season. Enhanced rainfall in January and February increases the growing season in the Congo basin by 5-15% in association with enhanced southwesterly moisture transport from the tropical Atlantic. In Angola and the southern Congo basin, 40-80% reductions in austral spring growing season days are associated with reduced precipitation and increased evapotranspiration. Large simulated reductions in growing season over southeastern Africa are judged to be inaccurate because they occur due to a reduction in rainfall in winter which is over-produced in the model. Only small decreases in the actual growing season are simulated when evapotranspiration increases in the warmer climate. The continent-wide changes in growing season are primarily the result of increased evapotranspiration over the warmed land, changes in the intensity and seasonal cycle of the thermal low, and warming of the Indian Ocean. (orig.)
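
    The study reports changes as percentages of growing-season days; as a toy illustration of how such a count might be made (the P > 0.5·PET criterion is an FAO-style convention used here as an assumption, and all data are synthetic, not the study's), one could count days where precipitation exceeds half of potential evapotranspiration:

```python
import numpy as np

rng = np.random.default_rng(3)

def growing_season_days(precip, pet):
    """Count days where precipitation exceeds half of potential
    evapotranspiration (an FAO-style criterion, used here illustratively)."""
    return int(np.sum(precip > 0.5 * pet))

days = 365
pet = np.full(days, 4.0)                                  # mm/day, constant for simplicity
precip_hist = rng.gamma(shape=0.8, scale=4.0, size=days)  # synthetic "current" climate
precip_fut = precip_hist * 0.8                            # hypothetical 20% drying

gs_hist = growing_season_days(precip_hist, pet)
gs_fut = growing_season_days(precip_fut, pet)
change_pct = 100.0 * (gs_fut - gs_hist) / gs_hist
print(f"growing season: {gs_hist} -> {gs_fut} days ({change_pct:+.1f}%)")
```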

  18. Navigation simulator for the Space Tug vehicle

    Science.gov (United States)

    Colburn, B. K.; Boland, J. S., III; Peters, E. G.

    1977-01-01

    A general simulation program (GSP) for state estimation of a nonlinear space vehicle flight navigation system is developed and used as a basis for evaluating the performance of a Space Tug navigation system. An explanation of the iterative guidance mode (IGM) guidance law and derivations of the dynamics, coordinate frames, and state estimation routines are given in order to clarify the assumptions and approximations made. A number of simulation and analytical studies are used to demonstrate the operation of the Tug system. Included in the simulation studies are (1) an initial offset vector parameter study; (2) propagation time vs accuracy; (3) a measurement noise parametric study; and (4) reduction in the computational burden of an on-board implementable scheme. From the results of these studies, conclusions and recommendations concerning future areas of practical and theoretical work are presented.
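
    The GSP's estimation routines are not reproduced in the abstract; as a generic illustration of the kind of state estimation such a navigation simulator performs (the 1-D constant-velocity model and every number below are assumptions, not the Space Tug formulation), a linear Kalman filter can be sketched as:

```python
import numpy as np

# 1-D constant-velocity model: state x = [position, velocity]
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition matrix
H = np.array([[1.0, 0.0]])             # position-only measurement
Q = 1e-4 * np.eye(2)                   # process noise covariance
R = np.array([[0.25]])                 # measurement noise covariance

x = np.array([0.0, 0.0])               # initial state estimate
P = np.eye(2)                          # initial estimate covariance

rng = np.random.default_rng(1)
true_pos, true_vel = 0.0, 1.0
for _ in range(50):
    true_pos += true_vel * dt
    z = true_pos + rng.normal(0.0, 0.5)  # noisy position measurement
    # Predict step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"estimated position: {x[0]:.2f}, estimated velocity: {x[1]:.2f}")
```

The studies the abstract lists (initial offset, propagation time, measurement noise) correspond to varying the initial state, the prediction horizon, and R in a filter of this general shape.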

  19. Exploring Space Physics Concepts Using Simulation Results

    Science.gov (United States)

    Gross, N. A.

    2008-05-01

    The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the National Science Foundation, has the goal of developing a suite of integrated physics-based computer models of the space environment that can follow the evolution of a space weather event from the Sun to the Earth. In addition to its research goals, CISM is also committed to training the next generation of space weather professionals who are imbued with a system view of space weather. This view should include an understanding of both heliospheric and geospace phenomena. To this end, CISM offers a yearly Space Weather Summer School targeted at first-year graduate students, although advanced undergraduates and space weather professionals have also attended. This summer school uses a number of innovative pedagogical techniques, including devoting each afternoon to a computer lab exercise that uses results from research-quality simulations and visualization techniques, along with ground-based and satellite data, to explore concepts introduced during the morning lectures. These labs are suitable for use in a wide variety of educational settings, from formal classroom instruction to outreach programs. The goal of this poster is to outline the goals and content of the lab materials so that instructors may evaluate their potential use in the classroom or other settings.

  20. Lunar-based optical telescopes: Planning astronomical tools of the twenty-first century

    Science.gov (United States)

    Hilchey, J. D.; Nein, M. E.

    1995-01-01

    A succession of optical telescopes, ranging in aperture from 1 to 16 m or more, can be deployed and operated on the lunar surface over the next half-century. These candidates to succeed NASA's Great Observatories would capitalize on the unique observational advantages offered by the Moon. The Lunar Telescope Working Group and the LUTE Task Team of the George C. Marshall Space Flight Center (MSFC) have assessed the feasibility of developing and deploying these facilities. Studies include the 16-m Large Lunar Telescope (LLT); the Lunar Cluster Telescope Experiment (LCTE), a 4-m precursor to the LLT; the 2-m Lunar Transit Telescope (LTT); and its precursor, the 1-m Lunar Ultraviolet Telescope Experiment (LUTE). The feasibility of developing and deploying each telescope was assessed and system requirements and options for supporting technologies, subsystems, transportation, and operations were detailed. Influences of lunar environment factors and site selection on telescope design and operation were evaluated, and design approaches and key tradeoffs were established. This paper provides an overview of the study results. Design concepts and brief system descriptions are provided, including subsystem and mission options selected for the concepts.

  1. Energy and environmental issues on building services engineering in the life-style of twenty-first century; 21 seiki no kankyo energy to life style no arikata

    Energy Technology Data Exchange (ETDEWEB)

    Nakahara, N.

    2001-01-05

    The author discusses building services technologies that will characterize the twenty-first century in view of energy, the environment, and life-style. Natural energy systems shall take a principal role in this century from the energy and environmental point of view. Among building engineering issues, IT technologies such as BEMS and BOFD, connected to the internet and backed by simulation technologies, will become most important, together with establishment of the commissioning process. The home office will become the new life-style of the century, and home environmental conditioning systems shall be further developed. As important social system issues, international relations and language matters, the education system for environmental system engineering, and the licensing of professional engineers are discussed. Maintaining a philosophy of seeking the truth hidden behind the shadow was concluded to be most important for environmental engineers. (author)

  2. Baltic Sea climate in the late twenty-first century: a dynamical downscaling approach using two global models and two emission scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Meier, H.E.M. [Swedish Meteorological and Hydrological Institute, Rossby Centre, Norrkoeping (Sweden)

    2006-07-15

    A regional ocean circulation model was used to project Baltic Sea climate at the end of the twenty-first century. A set of four scenario simulations was performed utilizing two global models and two forcing scenarios. To reduce model biases and to spin up future salinity, the so-called δ-change approach was applied. Using a regional coupled atmosphere-ocean model, 30-year climatological monthly mean changes of atmospheric surface data and river discharge into the Baltic Sea were calculated from previously conducted time slice experiments. These changes were added to reconstructed atmospheric surface fields and runoff for the period 1903-1998. The total freshwater supply (runoff and net precipitation) is projected to increase between 0 and 21%. Due to increased westerlies in winter, the annual mean wind speed will be between 2 and 13% larger compared to present climate. Both changes will cause a reduction of the average salinity of the Baltic Sea between 8 and 50%. Although salinity in the entire Baltic might be significantly lower at the end of the twenty-first century, deep water ventilation will very likely change only slightly. The largest change is projected for the secondary maximum of sea water age within the halocline. Further, the average temperature will increase between 1.9 and 3.2 °C. The temperature response to atmospheric changes lags several months. Future annual maximum sea ice extent will decrease between 46 and 77%, in accordance with earlier studies. However, in contrast to earlier results, in the warmest scenario simulation one ice-free winter out of 96 seasons was found. Although wind speed changes are uniform, extreme sea levels may increase more than the mean sea level. In two out of four projections, significant changes of 100-year surge heights were found. (orig.)
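
    The δ-change step described in the abstract (30-year climatological monthly mean changes added to a long reconstructed forcing record) can be sketched as follows; the synthetic temperature data, the uniform 3 °C warming, and the array shapes are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly surface temperature forcing (°C):
# 30 years of control climate and 30 years of scenario climate
months = np.arange(30 * 12)
control = 10.0 + 8.0 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 30 * 12)
scenario = control + 3.0 + rng.normal(0, 1, 30 * 12)  # assumed ~3 °C warmer

# Step 1: climatological monthly mean change (the "delta"), one value per month
delta = (scenario.reshape(30, 12).mean(axis=0)
         - control.reshape(30, 12).mean(axis=0))

# Step 2: add the monthly deltas to a long reconstructed historical series
# (96 years here, echoing the 1903-1998 reconstruction period)
t = np.arange(96 * 12)
reconstructed = (10.0 + 8.0 * np.sin(2 * np.pi * t / 12)
                 + rng.normal(0, 1, 96 * 12))
future_forcing = reconstructed + np.tile(delta, 96)

print(f"mean delta applied: {delta.mean():.2f} °C")
```

The appeal of the approach, as the abstract notes, is that only the climatological change signal is taken from the climate model, so biases in the model's absolute fields do not propagate into the forcing.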

  3. Neutral Buoyancy Simulator- NB38 -Space Telescope

    Science.gov (United States)

    1980-01-01

    The Hubble Space Telescope (HST) is a cooperative program of the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA) to operate a long-lived space-based observatory. It was the flagship mission of NASA's Great Observatories program. The HST program began as an astronomical dream in the 1940s. During the 1970s and 1980s, the HST was finally designed and built becoming operational in the 1990s. The HST was deployed into a low-Earth orbit on April 25, 1990 from the cargo bay of the Space Shuttle Discovery (STS-31). The design of the HST took into consideration its length of service and the necessity of repairs and equipment replacement by making the body modular. In doing so, subsequent shuttle missions could recover the HST, replace faulty or obsolete parts and be re-released. Pictured is MSFC's Neutral Buoyancy Simulator (NBS) that served as the test center for shuttle astronauts training for Hubble related missions. Shown are astronauts Bruce McCandless and Shannon Lucid being fitted for their space suits prior to entering the NBS to begin training on the space telescope axial scientific instrument changeout.

  5. Twenty-first century skills for students: hands-on learning after school builds school and life success.

    Science.gov (United States)

    Cabral, Leide

    2006-01-01

    At the core of the movement for twenty-first century skills are students. The growing efforts to increase programs leveraging out-of-school time are focused on giving American youth everything they need to compete in this increasingly complex world. The author is one of many students who have been well served by initiatives imparting twenty-first century skills during after-school hours. Now a senior at Boston Latin School, the author has been helped along the way by Citizen Schools, an after-school education program focused on hands-on learning apprenticeships and homework help. While enrolled in the program as a middle school student, the author took part in projects that exemplified hands-on, inquiry-based learning that helped her develop twenty-first century skills. For example, along with dozens of other students, she advanced her data analysis skills by analyzing statistics about Boston Public high schools, which also helped her select and enroll in one of the city's premier exam schools. Also, she and her peers worked with corporate attorneys who served as writing coaches and whose expertise the author drew from in producing a published essay and greatly improving her writing skills. The author now finds that the public speaking, leadership, organizational, social, and management abilities she built through her participation in Citizen Schools are a great asset to her in high school. The confidence with which she tackles her responsibilities can also be traced back to her experiences in the program. As she looks toward college, the author reflects and realizes that being actively involved in a quality after-school program put her on track for a successful future.

  6. Space shuttle main engine hardware simulation

    Science.gov (United States)

    Vick, H. G.; Hampton, P. W.

    1985-01-01

    The Huntsville Simulation Laboratory (HSL) provides a simulation facility to test and verify the space shuttle main engine (SSME) avionics and software system using a maximum complement of flight type hardware. The HSL permits evaluations and analyses of the SSME avionics hardware, software, control system, and mathematical models. The laboratory has performed a wide spectrum of tests and verified operational procedures to ensure system component compatibility under all operating conditions. It is a test bed for integration of hardware/software/hydraulics. The HSL is and has been an invaluable tool in the design and development of the SSME.

  7. Redesigning healthcare systems to meet the health challenges associated with climate change in the twenty-first century.

    Science.gov (United States)

    Phua, Kai-Lit

    2015-01-01

    In the twenty-first century, climate change is emerging as a significant threat to the health and well-being of the public through links to the following: extreme weather events, sea level rise, temperature-related illnesses, air pollution patterns, water security, food security, vector-borne infectious diseases, and mental health effects (as a result of extreme weather events and climate change-induced population displacement). This article discusses how national healthcare systems can be redesigned through changes in their components, such as human resources, facilities and technology, health information systems, and health policy, to meet these challenges.

  8. Preparing Teacher-Students for Twenty-First-Century Learning Practices (PREP 21): A Framework for Enhancing Collaborative Problem-Solving and Strategic Learning Skills

    Science.gov (United States)

    Häkkinen, Päivi; Järvelä, Sanna; Mäkitalo-Siegl, Kati; Ahonen, Arto; Näykki, Piia; Valtonen, Teemu

    2017-01-01

    With regard to the growing interest in developing teacher education to match the twenty-first-century skills, while many assumptions have been made, there has been less theoretical elaboration and empirical research on this topic. The aim of this article is to present our pedagogical framework for the twenty-first-century learning practices in…

  10. Public Heath in Colonial and Post-Colonial Ghana: Lesson-Drawing for The Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Adu-Gyamfi, Samuel

    2017-06-01

    Public health in twenty-first-century Ghana is mired in several issues, ranging from the inadequacy of public health facilities, improper settlement planning, and insanitary conditions to the inadequacy of laws and their implementation. This situation stands in direct contrast to the colonial era. Developments from the pre-colonial era into the colonial era made the prevention of diseases a priority of the colonial administration, beginning with the establishment of the health branch in 1909 as a response to the bubonic plague that was fast spreading in the colony. From there, public health policies and strategies were enacted to support disease prevention. Various public health boards, the medical research institute or laboratory branch, the waste management department, the use of preventive medicine, and the maintenance of good settlement planning and sanitation were public health measures of the colonial era. This research analyses the public health system of the colonial era so as to draw basic lessons for twenty-first-century Ghana. Archival data and other secondary sources are reviewed and analysed to help draw these lessons. Richard Rose's lesson-drawing approach was used to draw the lessons.

  11. A needs assessment for DOE's packaging and transportation activities - a look into the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Pope, R. [Oak Ridge National Lab., TN (United States); Turi, G.; Brancato, R.; Blalock, L. [Department of Energy, Germantown, MD (United States); Merrill, O. [Scientific Applications International Corp., Gaithersburg, MD (United States)

    1995-12-31

    The U.S. Department of Energy (DOE) has performed a department-wide scoping of its packaging and transportation needs and has arrived at a projection of these needs for well into the twenty-first century. The assessment, known as the Transportation Needs Assessment (TNA), was initiated during August 1994 and completed in December 1994. The TNA will allow DOE to better prepare for changes in its transportation requirements in the future. The TNA focused on projected, quantified shipping needs based on forecasts of inventories of materials which will ultimately require transport by the DOE for storage, treatment, and/or disposal. In addition, experts provided input on the growing needs throughout DOE resulting from changes in regulations, in DOE's mission, and in the sociopolitical structure of the United States. Through the assessment, DOE's transportation needs have been identified for a time period extending from the present through the first three decades of the twenty-first century. The needs assessment was accomplished in three phases: (1) defining current packaging, shipping, resource utilization, and methods of managing packaging and transportation activities; (2) establishing the inventory of materials which DOE will need to transport into the next century and scenarios which project when, from where, and to where these materials will need to be transported; and (3) developing requirements and projected changes for DOE to accomplish the necessary transport safely and economically.

  12. Simulating Autonomous Telecommunication Networks for Space Exploration

    Science.gov (United States)

    Segui, John S.; Jennings, Esther H.

    2008-01-01

    Currently, most interplanetary telecommunication systems require human intervention for command and control. However, considering the range from near-Earth to deep-space missions, combined with the increase in the number of nodes and advancements in processing capabilities, the benefits from communication autonomy will be immense. Likewise, greater mission science autonomy brings the need for unscheduled, unpredictable communication and network routing. While the terrestrial Internet protocols are highly developed, their suitability for space exploration has been questioned. JPL has developed the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to help characterize network designs and protocols. The results will allow future mission planners to better understand the trade-offs of communication protocols. This paper discusses various issues with interplanetary networks and presents simulation results for interplanetary networking protocols.
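
    The kind of trade-off such simulations explore, long propagation delays plus intermittent contact windows, can be illustrated with a toy store-and-forward model (a hypothetical sketch for intuition only; this is not MACHETE or any of its APIs):

    ```python
    # Toy model only (NOT MACHETE): store-and-forward message delivery over
    # an interplanetary relay chain, where each link has a long propagation
    # delay and an intermittent contact window.
    def simulate_delivery(hops, send_time=0.0):
        """hops: list of (link_delay_s, contact_start_s) per relay link.
        The message waits at each node until the contact window opens,
        then incurs the link's propagation delay. Returns arrival time."""
        t = send_time
        for delay, contact_start in hops:
            t = max(t, contact_start)  # store until the next contact
            t += delay                 # forward over the link
        return t

    # Earth -> relay (2 s), relay -> lander (480 s delay, contact at t = 600 s)
    print(simulate_delivery([(2.0, 0.0), (480.0, 600.0)]))  # 1080.0
    ```

    Even this toy version shows why end-to-end terrestrial protocols struggle: most of the delivery time is spent waiting for contacts, not transmitting.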

  14. Deconstructing adolescent same-sex attraction and sexual behavior in the twenty-first century: perspectives for the clinician.

    Science.gov (United States)

    Sison, Antonio C; Greydanus, Donald E

    2007-06-01

    The adolescent with same-sex attraction in the twenty-first century straddles ambivalent cultural and religious attitudes regarding gay, lesbian, bisexual, or transgender (GLBT) issues; rapid technologic advances that provide easy access to information on sex and sex partners; and the clinician's sensitivity about GLBT issues and his or her awareness of how adolescents can use technology for sex-seeking behavior. It is necessary to deconstruct these factors into defined frameworks. Three checklists, the Clinician's Framework Guide Questions for the GLBT Adolescent, Clinician Reaction to GLBT Issues Checklist, and Global GLBT Checklist for Biopsychosocial Risk Factors, may aid the clinician in acquiring an appreciation of the global dynamics between the gay adolescent, the clinician, and the impact of current social realities.

  15. End-Permian Mass Extinction in the Oceans: An Ancient Analog for the Twenty-First Century?

    Science.gov (United States)

    Payne, Jonathan L.; Clapham, Matthew E.

    2012-05-01

    The greatest loss of biodiversity in the history of animal life occurred at the end of the Permian Period (~252 million years ago). This biotic catastrophe coincided with an interval of widespread ocean anoxia and the eruption of one of Earth's largest continental flood basalt provinces, the Siberian Traps. Volatile release from basaltic magma and sedimentary strata during emplacement of the Siberian Traps can account for most end-Permian paleontological and geochemical observations. Climate change and, perhaps, destruction of the ozone layer can explain extinctions on land, whereas changes in ocean oxygen levels, CO2, pH, and temperature can account for extinction selectivity across marine animals. These emerging insights from geology, geochemistry, and paleobiology suggest that the end-Permian extinction may serve as an important ancient analog for twenty-first century oceans.

  16. The Renaissance of Word-of-Mouth Marketing: A ‘New’ Standard in Twenty-First Century Marketing Management?!

    Directory of Open Access Journals (Sweden)

    Norbert H. Meiners

    2010-12-01

    In this paper the importance of word of mouth for marketing management in the twenty-first century will be discussed. After a short introduction, there will be a focus on the demarcations and problems of traditional marketing. Then, in the third section, word of mouth (WOM) and word-of-mouth marketing (WOMM) as a 'new' standard in modern marketing are described. The fourth section broaches the importance of word of mouth and word-of-mouth marketing from the point of view of business and consumers, and then in the fifth section their importance for the Internet is considered. Finally, in section six evangelism marketing is discussed as the most effective form of word-of-mouth marketing. Section seven concludes the paper with a short summary. The paper focuses on scholarly articles and current research so as to keep theory as close as possible to reality.

  17. Us, them, and others: reflections on Canadian multiculturalism and national identity at the turn of the twenty-first century.

    Science.gov (United States)

    Winter, Elke

    2014-05-01

    The John Porter Lecture at the annual meeting of the Canadian Sociological Association in Victoria 2013 draws upon my book Us, Them, and Others: Pluralism and National Identity in Diverse Societies. Incorporating the findings from an analysis of Canadian English-language newspaper discourses during the 1990s into a theoretical framework inspired by Weberian sociology, the book argues that pluralism is best understood as a dynamic set of triangular relations where the compromise between unequal groups--"us" and "others"--is rendered meaningful through the confrontation with real or imagined outsiders ("them"). The lecture summarizes the theoretical contribution and explains how multiculturalism became consolidated in dominant Canadian discourses in the late 1990s. The lecture then discusses changes to Canadian multicultural identity at the beginning of the twenty-first century.

  18. Sub-Saharan Northern African climate at the end of the twenty-first century: forcing factors and climate change processes

    Energy Technology Data Exchange (ETDEWEB)

    Patricola, C.M. [Cornell University, Department of Earth and Atmospheric Sciences, Ithaca, NY (United States); Texas A&M University, Department of Atmospheric Sciences, College Station, TX (United States); Cook, K.H. [The University of Texas at Austin, Department of Geological Sciences, Jackson School of Geosciences, Austin, TX (United States)

    2011-09-15

    A regional climate model, the Weather Research and Forecasting (WRF) Model, is forced with increased atmospheric CO2 and anomalous SSTs and lateral boundary conditions derived from nine coupled atmosphere-ocean general circulation models to produce an ensemble set of nine future climate simulations for northern Africa at the end of the twenty-first century. A well validated control simulation, agreement among ensemble members, and a physical understanding of the future climate change enhance confidence in the predictions. The regional model ensembles produce consistent precipitation projections over much of northern tropical Africa. A moisture budget analysis is used to identify the circulation changes that support future precipitation anomalies. The projected midsummer drought over the Guinean Coast region is related partly to weakened monsoon flow. Since the rainfall maximum demonstrates a southward bias in the control simulation in July-August, this may be indicative of future summer drying over the Sahel. Wetter conditions in late summer over the Sahel are associated with enhanced moisture transport by the West African westerly jet, a strengthening of the jet itself, and moisture transport from the Mediterranean. Severe drought in East Africa during August and September is accompanied by a weakened Indian monsoon and Somali jet. Simulations with projected and idealized SST forcing suggest that overall SST warming in part supports this regional model ensemble agreement, although changes in SST gradients are important over West Africa in spring and fall. Simulations which isolate the role of individual climate forcings suggest that the spatial distribution of the rainfall predictions is controlled by the anomalous SST and lateral boundary conditions, while CO2 forcing within the regional model domain plays an important secondary role and generally produces wetter conditions. (orig.)
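
    The ensemble agreement emphasized in this abstract can be illustrated with a toy calculation (hypothetical data and function name, not the nine-member WRF ensemble itself): a multi-model mean of precipitation anomalies plus the fraction of members sharing the mean's sign at each point.

    ```python
    import numpy as np

    # Toy illustration (not the study's code): ensemble mean of projected
    # precipitation anomalies across members, plus a simple agreement
    # measure, the fraction of members sharing the ensemble-mean sign.
    def ensemble_summary(anomalies):
        """anomalies: array of shape (n_members, n_gridpoints)."""
        mean = anomalies.mean(axis=0)
        agree = (np.sign(anomalies) == np.sign(mean)).mean(axis=0)
        return mean, agree

    # 3 toy members at 2 grid points: all agree at point 0, 2 of 3 at point 1.
    members = np.array([[1.0, -0.2],
                        [0.8,  0.4],
                        [1.2, -0.6]])
    mean, agree = ensemble_summary(members)
    print(agree)  # per-point fraction of members agreeing with the mean sign
    ```

    High sign agreement across members is one simple way an ensemble projection is called "consistent".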

  19. Estimates of herbicide use for the twenty-first through the fortieth most-used herbicides in the conterminous United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This coverage contains estimates of herbicide use for the twenty-first through the fortieth most-used herbicides in the conterminous United States as reported in...

  20. The Significance, for Readers in the Twenty-first Century, of the Character of Safie in Mary Shelley's Frankenstein

    Directory of Open Access Journals (Sweden)

    Esther Katheu Mbithi

    2017-02-01

    The Significance, for Readers in the Twenty-first Century, of the Character of Safie in Mary Shelley's Frankenstein. Abstract: This paper presents a critical look at one of the characters in Mary Shelley's Frankenstein, Safie, through the lenses of a female African scholar in the twenty-first century. A close look at the narrative structure leads to the gradual peeling off of the first two layers, to the core of the narration. The paper looks keenly at a minor character in this core, in the light of feminist literary criticism and against the concept of globalisation. The analysis of the character of Safie, carried out in full consciousness of the fact that Frankenstein was written two hundred years ago, involves a look at the words used to describe her; comparison between her and other characters, particularly other female characters; as well as a general overview of her circumstances and how she reacts in response to them. Keywords: literature, English, Gothic, Romanticism, feminism, Frankenstein. // The significance of the character of Safie in Mary Shelley's Frankenstein for twenty-first-century readers. Abstract: This article critically examines Safie, one of the characters in Mary Shelley's Frankenstein, from the perspective of an African researcher. A close examination of the novel's narrative structure allows its two outer layers to be peeled away, reaching the core of the narration. This work examines a secondary character of this core in the light of feminist literary criticism and against the concept of globalisation. The analysis of the character of Safie, carried out in full awareness of the fact that Frankenstein was written two hundred years ago, requires an examination of the terms used to describe her; a comparison between her and other characters, especially other female characters; as well as a contextualisation of her circumstances and of how Safie reacts to them. Keywords

  1. An estimate of the effects of climate change on the rainfall of Mediterranean Spain by the late twenty first century

    Energy Technology Data Exchange (ETDEWEB)

    Sumner, G.N. [Centre for Geography, University of Wales, Lampeter, Ceredigion, Wales (United Kingdom); Romero, R.; Homar, V.; Ramis, C.; Alonso, S. [Departament de Fisica, Universitat de les Illes Balears, Palma de Mallorca (Spain); Zorita, E. [Institut fuer Gewaesserphysik GKSS, Geesthacht (Germany)

    2003-05-01

    The study uses a GCM (ECHAM-OPYC3) and the association between the atmospheric circulation at 925 and 500 hPa and the distribution of daily precipitation for Mediterranean Spain (from earlier analyses) to give estimates of the probable annual precipitation for the late twenty-first century. A down-scaling technique is used which involves matching daily circulation output from the model for a sequence of years in the late twentieth century (1971-90) and for a corresponding period in the late twenty-first century (2080-99) to derive probable regional atmospheric pattern (AP) frequencies for this latter period, and thence to estimate likely changes in annual precipitation. Model days are classified by searching for the closest analogue amongst 19 previously identified APs from an earlier study. The future annual precipitation distribution is derived using previously established relationships between circulation type and daily precipitation distribution. Predicted AP frequencies and precipitation amounts and distributions are compensated by comparing model output with ECMWF data for a decade (1984-93) within the 1971-90 sequence, so that the analysis also provides a verification of the performance of the model. In general the agreement between model output and actual AP frequencies is very good for the present day, though for this southerly region the model appears slightly to under-estimate the frequency of easterly type circulations, many of which yield some of the most significant autumn severe-storm rainfalls along the Mediterranean coast. The model tends to over-estimate the frequency of westerly type situations. The study utilises a 'moving window' technique in an attempt to derive measures of inter-decadal variability within the two 20-year periods. This avoids use of data from outside the periods, which would incorporate changing AP frequencies during a period of sustained climate change. Quite pronounced changes in frequency are
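
    The analogue step described in this abstract, assigning each model day to the closest of the 19 previously identified APs, can be sketched as follows (toy random fields and a hypothetical function name, not the study's 925/500 hPa data or code):

    ```python
    import numpy as np

    # Hypothetical sketch of analogue matching: each model day's circulation
    # field is assigned to the closest of K atmospheric patterns (APs) by
    # Euclidean distance; AP frequencies then follow from a histogram of labels.
    def classify_days(model_days, patterns):
        """model_days: (n_days, n_grid); patterns: (K, n_grid).
        Returns the index of the nearest AP for each day."""
        d2 = ((model_days[:, None, :] - patterns[None, :, :]) ** 2).sum(axis=2)
        return d2.argmin(axis=1)

    rng = np.random.default_rng(0)
    patterns = rng.normal(size=(19, 50))               # 19 APs on a toy 50-point grid
    days = patterns[[3, 3, 7]] + 0.01 * rng.normal(size=(3, 50))  # noisy copies
    print(classify_days(days, patterns))               # [3 3 7]
    ```

    Comparing the label histograms for 1971-90 and 2080-99 is, in miniature, how changed AP frequencies are derived before mapping them to precipitation.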

  2. Evaluating CMIP5 models using GPS radio occultation COSMIC temperature in UTLS region during 2006-2013: twenty-first century projection and trends

    Science.gov (United States)

    Kishore, P.; Basha, Ghouse; Venkat Ratnam, M.; Velicogna, Isabella; Ouarda, T. B. M. J.; Narayana Rao, D.

    2016-11-01

    This paper provides a first overview of the performance of global climate models participating in the Coupled Model Inter-Comparison Project phase 5 (CMIP5) in simulating upper-troposphere and lower-stratosphere (UTLS) temperatures. Temperature from the CMIP5 models is evaluated against high-resolution Global Positioning System radio occultation (GPSRO) data from the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) during the period July 2006-December 2013. Future projections of 17 CMIP5 models based on the Representative Concentration Pathway (RCP) 8.5 scenario are utilized to assess model performance and to identify biases in the temperature in the UTLS region at eight different pressure levels. The evaluations were carried out vertically, regionally, and globally to understand the temperature uncertainties in the CMIP5 models. It is found that the CMIP5 models successfully reproduce the general features of the temperature structure in terms of vertical, annual, and inter-annual variation. The ensemble mean of the CMIP5 models compares well with the COSMIC GPSRO data, with a mean difference of ±1 K. In the tropical region, temperature biases vary from one model to another. The spatial difference between COSMIC and the ensemble mean reveals that at 100 hPa the models show a bias of about ±2 K. With increasing altitude the bias decreases and turns into a cold bias over the tropical and Antarctic regions. Future projections of the CMIP5 models are presented for 2006-2099 under the RCP 8.5 scenario. The projections show a warming trend at the 300, 200, and 100 hPa levels over a wide region from 60°N to 45°S. The warming decreases rapidly and becomes a cooling with increasing altitude by the end of the twenty-first century; significant cooling is observed at the 30, 20, and 10 hPa levels. At 300/10 hPa, the temperature trend increases/decreases by 0.82/0.88 K/decade by the end of the twenty-first century under the RCP 8.5 scenario.
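
    The two quantities this abstract reports, a model-minus-observation bias in K and a linear trend in K/decade, can be sketched with a minimal calculation (toy series and a hypothetical function name; not COSMIC or CMIP5 data):

    ```python
    import numpy as np

    # Illustrative sketch: mean bias (model minus observation) and a
    # least-squares temperature trend converted from K/yr to K/decade.
    def bias_and_trend(model_t, obs_t, years):
        bias = float(np.mean(model_t - obs_t))
        slope_per_year = np.polyfit(years, model_t, 1)[0]
        return bias, slope_per_year * 10.0  # K per decade

    years = np.arange(2006, 2014, dtype=float)    # yearly samples, 2006-2013
    obs = 200.0 + 0.05 * (years - 2006.0)         # toy 0.5 K/decade warming
    model = obs + 1.0                             # constant +1 K warm bias
    b, tr = bias_and_trend(model, obs, years)
    print(round(b, 2), round(tr, 2))              # 1.0 0.5
    ```

    The same bias/trend split is what lets an evaluation separate a systematic offset (bias) from a projected rate of change (trend).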

  3. The era of the wandering mind? Twenty-first century research on self-generated mental activity.

    Science.gov (United States)

    Callard, Felicity; Smallwood, Jonathan; Golchert, Johannes; Margulies, Daniel S

    2013-01-01

    The first decade of the twenty-first century was characterized by renewed scientific interest in self-generated mental activity (activity largely generated by the individual, rather than in direct response to experimenters' instructions or specific external sensory inputs). To understand this renewal of interest, we interrogated the peer-reviewed literature from 2003 to 2012 (i) to explore recent changes in use of terms for self-generated mental activity; (ii) to investigate changes in the topics on which mind wandering research, specifically, focuses; and (iii) to visualize co-citation communities amongst researchers working on self-generated mental activity. Our analyses demonstrated that there has been a dramatic increase in the term "mind wandering" from 2006, and a significant crossing-over of psychological investigations of mind wandering into cognitive neuroscience (particularly in relation to research on the default mode and default mode network). If our article concludes that this might, indeed, be the "era of the wandering mind," it also calls for more explicit reflection to be given by researchers in this field to the terms they use, the topics and brain regions they focus on, and the research literatures that they implicitly foreground or ignore.

  4. Twenty-first century projected summer mean climate in the Mediterranean interpreted through the monsoon-desert mechanism

    Science.gov (United States)

    Cherchi, Annalisa; Annamalai, H.; Masina, Simona; Navarra, Antonio; Alessandri, Andrea

    2016-10-01

    The term "monsoon-desert mechanism" indicates the relationship between the diabatic heating associated with the South Asian summer monsoon rainfall and the remote response in the western sub-tropics, where long Rossby waves anchor strong descent with high subsidence. In CMIP5 twenty-first century climate scenarios, the precipitation over South Asia is projected to increase. This study investigates how this change could affect the summer climate projections in the Mediterranean region. In a linear framework, the monsoon-desert mechanism in the context of climate change would imply that the change in subsidence over the Mediterranean should be strongly linked with the changes in South Asian monsoon precipitation. The steady-state solution from a linear model forced with the CMIP5-projected precipitation change over South Asia shows a broad region of descent over the Mediterranean, while the CMIP5 projections themselves differ, showing increased descent mostly in the western sector but also decreased descent in parts of the eastern sector. Local changes in circulation, particularly the meridional wind, promote cold air advection that anchors the descent, but the barotropic Rossby-wave nature of the wind anomalies, consisting of alternating northerlies/southerlies, favors alternating descent/ascent locations. In fact, the local mid-tropospheric meridional wind changes have the strongest correlation with the regions where the difference in subsidence is largest. There, decreased rainfall is mostly balanced by changes in moisture, omega, and the horizontal advection of moisture.

  5. The era of the wandering mind? Twenty-first century research on self-generated mental activity

    Directory of Open Access Journals (Sweden)

    Felicity eCallard

    2013-12-01

    The first decade of the twenty-first century was characterized by renewed scientific interest in self-generated mental activity (activity largely generated by the individual, rather than in response to experimenters' instructions or specific external sensory inputs). To understand this renewal of interest, we interrogated the peer-reviewed literature from 2003-2012 (i) to explore recent changes in use of terms for self-generated mental activity; (ii) to investigate changes in the topics on which mind wandering research, specifically, focuses; and (iii) to visualize co-citation communities amongst researchers working on self-generated mental activity. Our analyses demonstrated that there has been a dramatic increase in the term mind wandering, and a significant crossing-over of psychological investigations of mind wandering, specifically, into cognitive neuroscience. If this is, indeed, the 'era of the wandering mind', our paper calls for more explicit reflection to be given by mind wandering researchers to the terms they use, the topics and brain regions they focus on, and the research literatures that they implicitly foreground or ignore as not relevant.

  6. Dialogues with Tradition: Feminist-Queer Encounters in German Crime Stories at the Turn of the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Faye Stewart

    2011-01-01

    Pieke Biermann's feminist crime collection Mit Zorn, Charme, und Methode (1992) and Lisa Kuppler's gay and lesbian anthology Queer Crime (2002) engage in a common project: the rewriting of a popular genre to give voice to previously marginalized identities and perspectives. This article investigates the ways in which each volume negotiates the gendered conventions of crime fiction and its subcategories, feminist and queer crime. A comparative analysis of three mysteries from each collection demonstrates the converging and diverging tendencies of feminist and queer representation in turn-of-the-twenty-first-century crime narratives. Feminist mysteries by Edith Kneifl, Birgit Rabisch, and Barbara Neuhaus shift generic conventions by aligning narrative perspectives with a feminist world view that destabilizes male-dominated structures through the intervention of a strong female figure who successfully closes the case. By contrast, queer mysteries by Thea Dorn, Ursula Steck, and Susanne Billig destabilize generic conventions and structures of identity altogether, highlighting misreadings and unsolved mysteries through parody, double entendre, and open endings.

  7. Semi-empirical versus process-based sea-level projections for the twenty-first century

    Science.gov (United States)

    Orlić, Mirko; Pasarić, Zoran

    2013-08-01

    Two dynamical methods are presently used to project sea-level changes during the next century. The process-based method relies on coupled atmosphere-ocean models to estimate the effects of thermal expansion and on sea-level models combined with certain empirical relationships to determine the influence of land-ice mass changes. The semi-empirical method uses various physically motivated relationships between temperature and sea level, with parameters determined from the data, to project total sea level. However, semi-empirical projections far exceed process-based projections. Here, we test the robustness of semi-empirical projections to the underlying assumptions about the inertial and equilibrium responses of sea level to temperature forcing and the impacts of groundwater depletion and dam retention during the twentieth century. Our results show that these projections are sensitive to the dynamics considered and the terrestrial-water corrections applied. For B1, which is a moderate climate-change scenario, the lowest semi-empirical projection of sea-level rise over the twenty-first century equals 62 ± 14 cm. The average value is substantially smaller than previously published semi-empirical projections and is therefore closer to the corresponding process-based values. The standard deviation is larger than the uncertainties of process-based estimates.
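
    The semi-empirical family of relationships mentioned above can be sketched in its simplest form, a sea-level rate proportional to warming above an equilibrium temperature, dH/dt = a * (T - T0), integrated over the century. All parameter values and the function name below are purely illustrative, not fitted values from any study:

    ```python
    # Minimal semi-empirical sketch (illustrative parameters, NOT fitted):
    # the rate of sea-level rise is proportional to the warming above an
    # equilibrium temperature T0, then integrated over time.
    def semi_empirical_rise(temps, a=3.4, t0=-0.5, dt=1.0):
        """temps: annual temperature anomalies (degC); a in mm/yr per degC.
        Returns the accumulated rise in cm."""
        rise_mm = sum(a * (t - t0) * dt for t in temps)
        return rise_mm / 10.0

    # Toy scenario: anomaly ramps linearly from 0.5 to 2.5 degC over 100 years.
    temps = [0.5 + 2.0 * i / 99.0 for i in range(100)]
    print(round(semi_empirical_rise(temps), 1))  # 68.0
    ```

    Because the projection scales directly with the assumed response parameters and the temperature path, small changes in either (the sensitivity the abstract tests) shift the century total substantially.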

  8. Building Interdisciplinary Leadership Skills among Health Practitioners in the Twenty-First Century: An Innovative Training Model.

    Science.gov (United States)

    Negandhi, Preeti; Negandhi, Himanshu; Tiwari, Ritika; Sharma, Kavya; Zodpey, Sanjay P; Quazi, Zahiruddin; Gaidhane, Abhay; Jayalakshmi N; Gijare, Meenakshi; Yeravdekar, Rajiv

    2015-01-01

    Transformational learning is the focus of twenty-first century global educational reforms. In India, there is a need to amalgamate the skills and knowledge of medical, nursing, and public health practitioners and to develop robust leadership competencies among them. This initiative proposed to identify interdisciplinary leadership competencies among Indian health practitioners and to develop a training program for interdisciplinary leadership skills through an Innovation Collaborative. Medical, nursing, and public health institutions partnered in this endeavor. An exhaustive literature search was undertaken to identify leadership competencies in these three professions. Published evidence was utilized in searching for the need for interdisciplinary training of health practitioners, including current scenarios in interprofessional health education and the key competencies required. The interdisciplinary leadership competencies identified were self-awareness, vision, self-regulation, motivation, decisiveness, integrity, interpersonal communication skills, strategic planning, team building, innovation, and being an effective change agent. Subsequently, a training program was developed, and three training sessions were piloted with 66 participants. Each cohort comprised a mix of participants from different disciplines. The pilot training guided the development of a training model for building interdisciplinary leadership skills and organizing interdisciplinary leadership workshops. The need for interdisciplinary leadership competencies is recognized. The long-term objective of the training model is integration into the regular medical, nursing, and public health curricula, with the aim of developing interdisciplinary leadership skills among them. Although challenging, formal incorporation of leadership skills into health professional education is possible within the interdisciplinary classroom setting using principles of transformative learning.

  9. Twenty First Century Science: Insights from the Design and Implementation of a Scientific Literacy Approach in School Science

    Science.gov (United States)

    Millar, Robin

    2006-10-01

    Although the term “scientific literacy” has been increasingly used in recent years to characterise the aim of school science education, there is still considerable uncertainty about its meaning and implications for the curriculum. A major national project in England, Twenty First Century Science, is evaluating the feasibility of a more flexible science curriculum structure for 15-year-old and 16-year-old students, centring around a core course for all students with a scientific literacy emphasis. Over 12,000 students in 78 schools have followed this course since September 2003. The development of a detailed teaching programme is an important means of clarifying the meanings and implications of a “scientific literacy” approach. Questionnaire data from teachers at the end of the first and second years of the project (N = 40 and N = 51) show a strongly positive evaluation of the central features of the course design. Teachers perceive the scientific literacy emphasis as markedly increasing student interest and engagement. Key challenges identified are the language and reasoning demands in looking critically at public accounts of science, and the classroom management of more open discussion about science-related issues.

  10. The era of the wandering mind? Twenty-first century research on self-generated mental activity

    Science.gov (United States)

    Callard, Felicity; Smallwood, Jonathan; Golchert, Johannes; Margulies, Daniel S.

    2013-01-01

    The first decade of the twenty-first century was characterized by renewed scientific interest in self-generated mental activity (activity largely generated by the individual, rather than in direct response to experimenters’ instructions or specific external sensory inputs). To understand this renewal of interest, we interrogated the peer-reviewed literature from 2003 to 2012 (i) to explore recent changes in use of terms for self-generated mental activity; (ii) to investigate changes in the topics on which mind wandering research, specifically, focuses; and (iii) to visualize co-citation communities amongst researchers working on self-generated mental activity. Our analyses demonstrated that there has been a dramatic increase in the term “mind wandering” from 2006, and a significant crossing-over of psychological investigations of mind wandering into cognitive neuroscience (particularly in relation to research on the default mode and default mode network). If our article concludes that this might, indeed, be the “era of the wandering mind,” it also calls for more explicit reflection to be given by researchers in this field to the terms they use, the topics and brain regions they focus on, and the research literatures that they implicitly foreground or ignore. PMID:24391606

  11. Animal models of human cerebellar ataxias: a cornerstone for the therapies of the twenty-first century.

    Science.gov (United States)

    Manto, Mario; Marmolino, Daniele

    2009-09-01

    Cerebellar ataxias represent a group of disabling neurological disorders. Our understanding of the pathogenesis of cerebellar ataxias is continuously expanding. A considerable number of laboratory animals with neurological mutations have been reported, and numerous relevant animal models mimicking the phenotype of cerebellar ataxias are becoming available. These models greatly help in dissecting the numerous mechanisms of cerebellar dysfunction, a major step for the assessment of therapeutics targeting a given deleterious pathway and for the screening of old or newly synthesized chemical compounds. Nevertheless, differences between animal models and human disorders should not be overlooked, and difficulties of characterization should not be concealed. The identification of the mutations underlying many hereditary ataxias, the development of valuable animal models, and the recent identification of the molecular mechanisms underlying cerebellar disorders represent a combination of key factors for the development of innovative anti-ataxic therapies. It is anticipated that the twenty-first century will be the century of effective therapies in the field of cerebellar ataxias. The animal models are a cornerstone to reach this goal.

  12. Inherited Behaviour in Wilkie Collins's The Legacy of Cain: Victorian Studies and Twenty-First-Century Science Policy

    Directory of Open Access Journals (Sweden)

    Jay Clayton

    2008-10-01

    Full Text Available 'The Legacy of Cain' (1888, the last novel Wilkie Collins published before his death, is structured as a case study of the respective influences of nature and nurture. The central question is whether the daughter of a murderess will reveal a 'hereditary taint' or whether a loving and religious environment will prove the stronger influence on the child's character. The Victorians knew nothing about genetics, but scientists and novelists alike shared a vigorous discourse about the hereditary transmission of behaviour and whether 'character' was heritable. In the wake of genetic and epigenetic discoveries, we find ourselves faced with a situation comparable to that Collins encountered in the 1880s, when evolutionary theory was unsettling many things Victorians held dear. Exploring how novelists and scientists in the late-nineteenth century attempted to cope with notions of inherited behaviour without genetics sheds an interesting light on twenty-first-century reactions to the news that acquired characteristics and behavioural traits may be passed on to future generations through mechanisms other than the gene. The emergence of an influential, semi-autonomous zone of activity known as the policy arena, which occupies an intermediate position between the disciplinary specialist and the public sphere, enables humanists to participate in science policy today in ways comparable to the contributions made by Victorian literary figures such as Wilkie Collins, George Eliot, Matthew Arnold and Samuel Butler.

  13. Projected impact of twenty-first century ENSO changes on rainfall over Central America and northwest South America from CMIP5 AOGCMs

    Science.gov (United States)

    Steinhoff, Daniel F.; Monaghan, Andrew J.; Clark, Martyn P.

    2015-03-01

    Due to the importance that the El Niño-Southern Oscillation (ENSO) has on rainfall over the tropical Americas, future changes in ENSO characteristics and teleconnections are important for regional hydroclimate. Projected changes to the ENSO mean state and characteristics, and the resulting impacts on rainfall anomalies over Central America, Colombia, and Ecuador during the twenty-first century are explored for several forcing scenarios using a suite of coupled atmosphere-ocean global climate models (AOGCMs) from the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Mean-state warming of eastern tropical Pacific sea surface temperatures, drying of Central America and northern Colombia, and wetting of southwest Colombia and Ecuador are consistent with previous studies that used earlier versions of the AOGCMs. Current and projected future characteristics of ENSO (frequency, duration, amplitude) show a wide range of values across the various AOGCMs. The magnitude of ENSO-related rainfall anomalies is currently underestimated by most of the models, but the model ensembles generally simulate the correct sign of the anomalies across the seasons around the peak ENSO effects. While the models capture the broad present-day ENSO-related rainfall anomalies, there is not a clear sense of projected future changes in the precipitation anomalies.

  14. Deep Space Navigation and Timing Architecture and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm will develop a deep space navigation and timing architecture and associated simulation, incorporating state-of-the art radiometric, x-ray pulsar, and laser...

  15. A scenario of European climate change for the late twenty-first century: seasonal means and interannual variability

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Hadley Centre for Climate Prediction and Research, Exeter (United Kingdom)

    2005-12-01

    A scenario of European climate change for the late twenty-first century is described, using a high-resolution state-of-the-art model. A time-slice approach is used, whereby the atmospheric general circulation model, HadAM3P, was integrated for two periods, 1960-1990 and 2070-2100, using the SRES A2 scenario. For the first time an ensemble of such experiments was produced, along with appropriate statistical tests for assessing significance. The focus is on changes to the statistics of seasonal means, and includes analysis of both multi-year means and interannual variance. All four seasons are assessed, and anomalies are mapped for surface air temperature, precipitation and snow mass. Mechanisms are proposed where these are dominated by straightforward local processes. In winter, the largest warming occurs over eastern Europe, up to 7 °C, mean snow mass is reduced by at least 80% except over Scandinavia, and precipitation increases over all but the southernmost parts of Europe. In summer, temperatures rise by 6-9 °C south of about 50°N, and mean rainfall is substantially reduced over the same area. In spring and autumn, anomalies tend to be weaker, but often display patterns similar to the preceding season, reflecting the inertia of the land surface component of the climate system. Changes in interannual variance are substantial in the solsticial seasons for many regions (note that for precipitation, variance estimates are scaled by the square of the mean). In winter, interannual variability of near-surface air temperature is considerably reduced over much of Europe, and the relative variability of precipitation is reduced north of about 50°N. In summer, the (relative) interannual variance of both variables increases over much of the continent. (orig.)
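The parenthetical note on precipitation variance refers to the standard relative-variability measure: variance divided by the square of the mean, i.e. the squared coefficient of variation. A small sketch with invented rainfall series (the numbers are not from the study):

```python
def relative_variance(samples):
    """Interannual variance scaled by the square of the mean
    (the squared coefficient of variation). Uses the unbiased
    sample variance with an n-1 denominator."""
    n = len(samples)
    m = sum(samples) / n
    var = sum((x - m) ** 2 for x in samples) / (n - 1)
    return var / m ** 2

# Two hypothetical rainfall series with the same absolute variance:
# the wetter region has the smaller *relative* variability.
wet = relative_variance([90, 100, 110])   # mean 100 mm
dry = relative_variance([40, 50, 60])     # mean 50 mm
```

Scaling by the squared mean matters in climate comparisons because drying a region (reducing the mean) can raise relative variability even when absolute variance is unchanged.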

  16. The traditional commons of England and Wales in the twenty-first century: meeting new and old challenges

    Directory of Open Access Journals (Sweden)

    Chris Short

    2008-07-01

    Full Text Available The commons literature makes much of the changes within the traditional land use sectors of developed countries. This largely focuses on the decline of the economic function of commons that threaten their existence, the emergence of multiple use patterns, and the resilience and policy adaptation needed to continue. The situation in England and Wales is used to illustrate that commons are increasingly important to a number of ‘new’ rural functions and that the associated policy developments may hold an important message for progress towards sustainable multifunctional land management more generally. This article reviews and updates what is meant by the term common land within England and Wales, while outlining its current importance and threats. The commons literature is investigated to see if the approach is useful in revealing the current issues associated with the incorporation of new stakeholders and functions within a traditional structure. Recent changes and developments surrounding the Commons Act 2006 are assessed to see if they are likely to assist in sustaining these commons through the twenty-first century. The article argues that any new approach requires long term planning and a commitment to support local participation among commoners and others who are involved in the governance and management of these areas of land. In order for these challenges to be met there needs to be an understanding of the functions and cultural traditions of common land as well as of the changes in society associated with the decline in traditional agrarian management in developed countries. Such challenges can rarely if ever be achieved through legislation and policy developments, requiring an investment in developing locally based solutions.

  17. Dietary guidelines to nourish humanity and the planet in the twenty-first century. A blueprint from Brazil.

    Science.gov (United States)

    Monteiro, Carlos Augusto; Cannon, Geoffrey; Moubarac, Jean-Claude; Martins, Ana Paula Bortoletto; Martins, Carla Adriano; Garzillo, Josefa; Canella, Daniela Silva; Baraldi, Larissa Galastri; Barciotte, Maluh; Louzada, Maria Laura da Costa; Levy, Renata Bertazzi; Claro, Rafael Moreira; Jaime, Patrícia Constante

    2015-09-01

    To present and discuss the dietary guidelines issued by the Brazilian government in 2014. The present paper describes the aims of the guidelines, their shaping principles and the approach used in the development of recommendations. The main recommendations are outlined, their significance for the cultural, socio-economic and environmental aspects of sustainability is discussed, and their application to other countries is considered. Brazil in the twenty-first century. All people in Brazil, now and in future. The food- and meal-based Brazilian Dietary Guidelines address dietary patterns as a whole and so are different from nutrient-based guidelines, even those with some recommendations on specific foods or food groups. The guidelines are based on explicit principles. They take mental and emotional well-being into account, as well as physical health and disease prevention. They identify diet as having cultural, socio-economic and environmental as well as biological and behavioural dimensions. They emphasize the benefits of dietary patterns based on a variety of natural or minimally processed foods, mostly plants, and freshly prepared meals eaten in company, for health, well-being and all relevant aspects of sustainability, as well as the multiple negative effects of ready-to-consume ultra-processed food and drink products. The guidelines' recommendations are designed to be sustainable personally, culturally, socially, economically and environmentally, and thus fit to face this century. They are for foods, meals and dietary patterns of types that are already established in Brazil, which can be adapted to suit the climate, terrain and customs of all countries.

  18. Deep Space Navigation and Timing Architecture and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Microcosm team will complete the simulation tool architecture early in Phase II, and in parallel begin to develop the simulation. The tool is architected for...

  19. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  20. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
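A comparison test of the kind described reduces to quantifying how far two simulations' propagated states diverge at common epochs. A hypothetical sketch using the RMS of the position separation; the function name and the toy trajectories are assumptions for illustration, not one of the paper's 10 tests:

```python
import math

def rms_position_difference(traj_a, traj_b):
    """RMS of the Euclidean separation between two position time
    series (lists of (x, y, z) tuples, km, sampled at the same
    epochs by two different simulators)."""
    assert len(traj_a) == len(traj_b), "trajectories must share epochs"
    sq = 0.0
    for (xa, ya, za), (xb, yb, zb) in zip(traj_a, traj_b):
        sq += (xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2
    return math.sqrt(sq / len(traj_a))

# Two simulators reporting nearly identical circular-orbit samples
a = [(7000.0, 0.0, 0.0), (0.0, 7000.0, 0.0)]
b = [(7000.1, 0.0, 0.0), (0.0, 7000.1, 0.0)]
separation = rms_position_difference(a, b)
```

A metric like this, trended over the propagation interval, exposes whether divergence comes from modeling differences (a steady drift) or from latency and interpolation artifacts (epoch misalignment).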

  1. Recognized simulation of space locomotive target based on sky background

    Science.gov (United States)

    Zhang, Han; Ma, Jianhong

    2017-01-01

    Space moving object recognition and tracking is an important research topic in computer vision. It has broad application prospects in space exploration, traffic-flow detection, the military field, automatic control, and other areas. This paper proposes a new space-target recognition algorithm and uses it to identify the simulated motion trajectory of an object in space.

  2. Space Science Investigation: NASA ISS Stowage Simulator

    Science.gov (United States)

    Crawford, Gary

    2017-01-01

    During this internship the opportunity was granted to work with the Integrated Graphics, Operations and Analysis Laboratory (IGOAL) team. The main assignment was to create 12 achievement patches for the Space Station training simulator called the "NASA ISS Stowage Training Game." This project was built using software previously developed by IGOAL. To accomplish this task, Adobe Photoshop and Adobe Illustrator were used to craft the badges and other required elements. Blender, a 3D modeling package, was used to make the required 3D elements. Blender was a useful tool for assets such as the CTB bag in the "No More Bob" patch, which shows a gentleman kicking a CTB bag into the distance. It was also used to pose characters in the positions that were optimal for their patches, as in the "Station Sanitation" patch, which portrays an astronaut waving on a U.S. module on a truck. Adobe Illustrator was the main piece of software for this task; it was used to craft the badges and upload them when they were completed. The style of the badges was flat, meaning they should not look three-dimensional in any way, shape, or form. Adobe Photoshop was used when any pictures needed brightening, and was where the texture for the CTB bag was made. For the patches to be ready for the game's next major release, they had to undergo critical review, revision, and re-editing to ensure the other artists and the rest of the staff were satisfied with the final products. Many patches were created and revamped to meet the flat style and incorporate suggestions from the IGOAL team. After the three processes were completed, the badges were implemented into the game (reference fig1 for badges). After a month of designing badges, the finished products were placed into the final game build by the programmers. The art was the final piece in showcasing the latest build to the public for testing. Comments from the testers were often exceptional and the feedback on the badges were

  3. Sea-level rise and its possible impacts given a ‘beyond 4°C world’ in the twenty-first century

    NARCIS (Netherlands)

    Nicholls, R.; Marinova, N.A.; Lowe, J.; Brown, S.; Vellinga, P.

    2011-01-01

    The range of future climate-induced sea-level rise remains highly uncertain with continued concern that large increases in the twenty-first century cannot be ruled out. The biggest source of uncertainty is the response of the large ice sheets of Greenland and west Antarctica. Based on our analysis,

  4. Bruce's Magnificent Quartet: Inquiry, Community, Technology and Literacy--Implications for Renewing Qualitative Research in the Twenty-First Century

    Science.gov (United States)

    Davidson, Judith

    2014-01-01

    Bruce and Bishop's community informatics work brings forward four critical concepts: inquiry, community, technology, and literacy. These four terms serve as the basis for a discussion of qualitative research in the twenty-first century--what is lacking and what is needed. The author suggests that to resolve the tensions or challenges…

  5. Towards a Common Ground: Arab versus Western Views about Challenges of Islamic Religious Education Curriculum of the Twenty-First Century

    Science.gov (United States)

    Rashed, Hazem

    2015-01-01

    The Islamic religious education curriculum of the twenty-first century is a cornerstone in a hot debate about necessary educational reforms in the Islamic World. This study aimed at investigating the depth of agreement/disagreement between Arab and Western educational views about challenges of this curriculum through reviewing academic…

  8. The Challenges of Teaching and Learning about Science in the Twenty-First Century: Exploring the Abilities and Constraints of Adolescent Learners

    Science.gov (United States)

    Anderman, Eric M.; Sinatra, Gale M.; Gray, DeLeon L.

    2012-01-01

    In this article, we critically examine skills that are necessary for the effective learning of science in adolescent populations. We argue that a focus on twenty-first-century skills among adolescents within the context of science instruction must be considered in light of research on cognitive and social development. We first review adolescents'…

  9. Solving the problems we face: the United States Environmental Protection Agency, sustainability, and the challenges of the twenty-first century

    Science.gov (United States)

    Addressing the problems of the twenty-first century will require new initiatives that complement traditional regulatory activities. Existing regulations, such as the Clean Air Act and Clean Water Act are important safety nets in the United States for protecting human health and t...

  10. Measuring Twenty-First Century Skills: Development and Validation of a Scale for In-Service and Pre-Service Teachers

    Science.gov (United States)

    Jia, Yueming; Oh, Youn Joo; Sibuma, Bernadette; LaBanca, Frank; Lorentson, Mhora

    2016-01-01

    A self-report scale that measures teachers' confidence in teaching students about twenty-first century skills was developed and validated with pre-service and in-service teachers. First, 16 items were created to measure teaching confidence in six areas: information literacy, collaboration, communication, innovation and creativity, problem solving,…
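Validation of a self-report scale like this one typically reports internal-consistency reliability. A sketch of Cronbach's alpha computed from raw item scores; the item data below are invented for illustration and are not from the study:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list
    per item, one entry per respondent). alpha = k/(k-1) *
    (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var / var(totals))

# Three hypothetical, highly consistent Likert items, five respondents
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 3, 4, 1],
]
alpha = cronbach_alpha(items)
```

Values near or above 0.7 are conventionally read as acceptable internal consistency for a subscale such as "information literacy" or "collaboration."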

  11. Status Report of Simulated Space Radiation Environment Facility

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Phil Hyun; Nho, Young Chang; Jeun, Joon Pyo; Choi, Jae Hak; Lim, Youn Mook; Jung, Chan Hee; Jeon, Young Kyu

    2007-11-15

    The technology for testing and improving the performance of materials that must endure the space environment is militarily sensitive, and it is closely held and strictly regulated in advanced countries such as the US and Russia. This core technology cannot easily be transferred to other countries either. It is therefore the most fundamental and necessary research area for the successful establishment of a space environment system. Since evaluating the effects of space radiation on space materials and components plays an important role in extending satellite lifetimes and reducing failure rates, it is necessary to establish a simulated space radiation facility and a systematic testing procedure. This report addresses the status of technology for simulating space environment effects, including the effect of space radiation on space materials. The information presented, such as fundamental knowledge of the space environment and the state of research in various countries on simulating space environment effects on space materials, will be useful for research on the radiation hardness of materials. Furthermore, it will help space-material developers make better material choices, reduce design cycle time, and improve safety.

  12. The Use of Microgravity Simulators for Space Research

    Science.gov (United States)

    Zhang, Ye; Richards, Stephanie E.; Wade, Randall I.; Richards, Jeffrey T.; Fritsche, Ralph F.; Levine, Howard G.

    2016-01-01

    The spaceflight environment is known to influence biological processes ranging from stimulation of cellular metabolism to possible impacts on cellular damage repair, suppression of immune functions, and bone loss in astronauts. Microgravity is one of the most significant stress factors experienced by living organisms during spaceflight, and therefore, understanding cellular responses to altered gravity at the physiological and molecular level is critical for expanding our knowledge of life in space. Since opportunities to conduct experiments in space are scarce, various microgravity simulators and analogues have been widely used in space biology ground studies. Even though simulated microgravity conditions have produced some, but not all of the biological effects observed in the true microgravity environment, they provide test beds that are effective, affordable, and readily available to facilitate microgravity research. A Micro-g Simulator Center is being developed at Kennedy Space Center (KSC) to offer a variety of microgravity simulators and platforms for Space Biology investigators. Assistance will be provided by both KSC and external experts in molecular biology, microgravity simulation, and engineering. Comparisons between the physical differences in microgravity simulators, examples of experiments using the simulators, and scientific questions regarding the use of microgravity simulators will be discussed.

  13. Wandering crowd simulation based on space syntax theory

    Institute of Scientific and Technical Information of China (English)

    ZHENG Liping; SUN Chen; LIU Li; WANG Lin

    2012-01-01

    Space syntax has shown that a fundamental process appears to inform human and social usage of an environment, and that the effects of spatial configuration on movement patterns are consistent with a model of individual decision behavior. Introducing space syntax into crowd simulation lets the spatial structure guide the random movement of a crowd with no specific targets. This paper proposes a simulation method for wandering crowds, which calculates the crowd distribution corresponding to the space through space syntax and uses a hybrid path-planning algorithm to dynamically navigate the crowd so that it conforms to the distribution. Experiments show the presented method obtains reasonable and visually realistic simulation results.
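One way to realize the idea: precompute a space-syntax integration value per cell, then bias each wandering agent's next step toward better-integrated cells, so that long-run occupancy matches the target distribution. A hedged sketch under that assumption (the weighted-choice scheme and all names are illustrative, not the paper's hybrid path-planning algorithm):

```python
import random

def wander_step(current, neighbors, integration, rng=random):
    """Choose the next cell among `neighbors` with probability
    proportional to its space-syntax integration value, so that
    aimless agents drift toward well-integrated space."""
    weights = [integration[n] for n in neighbors]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for cell, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return cell
    return neighbors[-1]  # guard against floating-point round-off

# Toy layout: cell 'b' is far better integrated than 'a' or 'c'
integration = {'a': 0.1, 'b': 0.8, 'c': 0.1}
counts = {'a': 0, 'b': 0, 'c': 0}
rng = random.Random(42)  # seeded for reproducibility
for _ in range(10000):
    counts[wander_step('a', ['a', 'b', 'c'], integration, rng)] += 1
```

Repeated over many agents and steps, the visit counts converge toward the integration-derived distribution, which is the behavior the paper's experiments check against real movement patterns.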

  14. Twenty-First Century Technology and the Global Environment: Developing a Cause/Effect Relationship Perspective Among Proactive Action Students.

    Science.gov (United States)

    Peters, Richard O.

    Technology, defined as power to build or to destroy, affects both the natural and social environments. Technological societies are characterized by five elements: green revolution, industry, medicine, biology, and space technology. To demonstrate that individuals and groups perceive the effects of these aspects differently, a summary of nine pro…

  15. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.
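At the core of any such framework sits an orbital propagator. A minimal stand-in for the higher-fidelity propagation the framework models: point-mass two-body dynamics integrated with a leapfrog (kick-drift-kick) scheme. The constant and function names are illustrative, not taken from the LLNL code:

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def propagate_two_body(r, v, dt, steps):
    """Propagate a state (r in km, v in km/s) under point-mass Earth
    gravity with a kick-drift-kick leapfrog integrator, which has
    good long-term energy behaviour for orbital problems."""
    r, v = list(r), list(v)

    def accel(r):
        d = math.sqrt(r[0] ** 2 + r[1] ** 2 + r[2] ** 2)
        k = -MU_EARTH / d ** 3
        return [k * c for c in r]

    a = accel(r)
    for _ in range(steps):
        v = [vi + 0.5 * dt * ai for vi, ai in zip(v, a)]  # half kick
        r = [ri + dt * vi for ri, vi in zip(r, v)]        # drift
        a = accel(r)
        v = [vi + 0.5 * dt * ai for vi, ai in zip(v, a)]  # half kick
    return r, v

# Circular LEO: radius 7000 km, circular speed sqrt(mu/r)
r0 = [7000.0, 0.0, 0.0]
v0 = [0.0, math.sqrt(MU_EARTH / 7000.0), 0.0]
r1, v1 = propagate_two_body(r0, v0, dt=1.0, steps=5400)
```

A production SSA propagator would add perturbations (J2 and higher geopotential terms, drag, third bodies), but even this toy version is enough to feed radar cross section and brightness models with plausible state vectors for testing.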

  16. Future change of climate in South America in the late twenty-first century: intercomparison of scenarios from three regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Marengo, Jose A.; Valverde, Maria C.; Torres, Roger R.; Santos, Daniel C. [Centro de Ciencia do Sistema Terrestre, Instituto Nacional de Pesquisas Espaciais, CCST/INPE, Sao Paulo, SP (Brazil); Ambrizzi, Tercio; Rocha, Rosmeri P. da [University of Sao Paulo, IAG-DCA/USP, Department of Atmospheric Sciences, Sao Paulo, SP (Brazil); Alves, Lincoln M. [Centro de Previsao de Tempo e Estudos Climaticos, Instituto Nacional de Pesquisas Espaciais, CPTEC/INPE, Sao Paulo, SP (Brazil); Cuadra, Santiago V. [Universidade Federal de Vicosa, Vicosa, MG (Brazil); Ferraz, Simone E.T. [Universidade Federal de Santa Maria, Santa Maria, RS (Brazil)

    2010-11-15

    Regional climate change projections for the last half of the twenty-first century have been produced for South America, as part of the CREAS (Cenarios REgionalizados de Clima Futuro da America do Sul) regional project. Three regional climate models (RCMs: Eta CCS, RegCM3 and HadRM3P) were nested within the HadAM3P global model. The simulations cover a 30-year period representing present climate (1961-1990) and projections for the IPCC A2 high emission scenario for 2071-2100. The focus was on the changes in the mean circulation and surface variables, in particular surface air temperature and precipitation. There is a consistent pattern of changes in circulation, rainfall and temperature as depicted by the three models. The HadRM3P shows intensification and a more southward position of the subtropical Pacific high, while a pattern of intensification/weakening during summer/winter is projected by the Eta CCS/RegCM3. There is a tendency for a weakening of the subtropical westerly jet in the Eta CCS and HadRM3P, consistent with other studies. There are indications that regions such as Northeast Brazil and central-eastern and southern Amazonia may experience rainfall deficits in the future, while the northwest coast of Peru-Ecuador and northern Argentina may experience rainfall excesses in a warmer future, and these changes may vary with the seasons. The three models show warming in the A2 scenario that is stronger in the tropical region, especially in the 5°N-15°S band, both in summer and especially in winter, reaching up to 6-8°C above the present. In southern South America, the warming varies between 2 and 4°C in summer and between 3 and 5°C in winter across the three models. These changes are consistent with changes in low-level circulation from the models, and they are comparable with changes in rainfall and temperature extremes reported elsewhere. In summary, some aspects of projected future climate change are quite robust across this set of

  17. Future change of climate in South America in the late twenty-first century: intercomparison of scenarios from three regional climate models

    Science.gov (United States)

    Marengo, Jose A.; Ambrizzi, Tercio; Da Rocha, Rosmeri P.; Alves, Lincoln M.; Cuadra, Santiago V.; Valverde, Maria C.; Torres, Roger R.; Santos, Daniel C.; Ferraz, Simone E. T.

    2010-11-01

    Regional climate change projections for the last half of the twenty-first century have been produced for South America, as part of the CREAS (Cenarios REgionalizados de Clima Futuro da America do Sul) regional project. Three regional climate models (RCMs: Eta CCS, RegCM3 and HadRM3P) were nested within the HadAM3P global model. The simulations cover a 30-year period representing present climate (1961-1990) and projections for the IPCC A2 high emission scenario for 2071-2100. The focus was on the changes in the mean circulation and surface variables, in particular surface air temperature and precipitation. There is a consistent pattern of changes in circulation, rainfall and temperature as depicted by the three models. The HadRM3P shows intensification and a more southward position of the subtropical Pacific high, while a pattern of intensification/weakening during summer/winter is projected by the Eta CCS/RegCM3. There is a tendency for a weakening of the subtropical westerly jet in the Eta CCS and HadRM3P, consistent with other studies. There are indications that regions such as Northeast Brazil and central-eastern and southern Amazonia may experience rainfall deficits in the future, while the northwest coast of Peru-Ecuador and northern Argentina may experience rainfall excesses in a warmer future, and these changes may vary with the seasons. The three models show warming in the A2 scenario that is stronger in the tropical region, especially in the 5°N-15°S band, both in summer and especially in winter, reaching up to 6-8°C above the present. In southern South America, the warming varies between 2 and 4°C in summer and between 3 and 5°C in winter across the three models. These changes are consistent with changes in low-level circulation from the models, and they are comparable with changes in rainfall and temperature extremes reported elsewhere. In summary, some aspects of projected future climate change are quite robust across this set of

  18. Regional climate change experiments over southern South America. II: Climate change scenarios in the late twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, Mario N.; Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera (CIMA-CONICET/UBA) DCAO (FCEyN-UBA), Ciudad Universitaria, Pabellon II, Piso 2, Buenos Aires (Argentina); Cabre, Maria Fernanda [Centro de Investigaciones del Mar y la Atmosfera (CIMA-CONICET/UBA), Ciudad Universitaria, Pabellon II, Piso 2, Buenos Aires (Argentina)

    2009-06-15

    We present an analysis of climate change over southern South America as simulated by a regional climate model. The regional model MM5 was nested within time-slice global atmospheric model experiments conducted by the HadAM3H model. The simulations cover a 10-year period representing present-day climate (1981-1990) and two future scenarios for the SRESA2 and B2 emission scenarios for the period 2081-2090. There are a few quantitative differences between the two regional scenarios. The simulated changes are larger for the A2 than the B2 scenario, although with few qualitative differences. For the two regional scenarios, the warming in southern Brazil, Paraguay, Bolivia and northeastern Argentina is particularly large in spring. Over the western coast of South America both scenarios project a general decrease in precipitation. Both the A2 and B2 simulations show a general increase in precipitation in northern and central Argentina especially in summer and fall and a general decrease in precipitation in winter and spring. In fall the simulations agree on a general decrease in precipitation in southern Brazil. This reflects changes in the atmospheric circulation during winter and spring. Changes in mean sea level pressure show a cell of increasing pressure centered somewhere in the southern Atlantic Ocean and southern Pacific Ocean, mainly during summer and fall in the Atlantic and in spring in the Pacific. In relation to the pressure distribution in the control run, this indicates a southward extension of the summer mean Atlantic and Pacific subtropical highs. (orig.)

  19. A library for the twenty-first century: the Galter Health Sciences Library's renovation and expansion project.

    Science.gov (United States)

    Shedlock, J; Ross, F

    1997-04-01

    A renovation and expansion project at the Galter Health Sciences Library of Northwestern University strikes a balance between traditional and future libraries, library ambiance and high technology, old and new. When guided by a vision of future building use, renovation projects can succeed in meeting many institutional goals as a viable alternative to new library buildings. Issues addressed include planning considerations, architectural history, library design, building features, information technology considerations, and ideal library space design when new construction is not possible.

  20. Numerical simulation of space UV spectrographs

    Science.gov (United States)

    Yushkin, Maksim; Fatkhullin, Timur; Panchuk, Vladimir; Sachkov, Mikhail; Kanev, Evgeny

    2016-07-01

    Based on the ray-tracing method, we developed algorithms for constructing numerical models of spectroscopic instrumentation. The software is implemented in C++ using NVIDIA CUDA technology. The package consists of three separate modules: a ray-tracing module, a module for calculating energy efficiency, and a CCD image simulation module. The main objective of this work was to obtain images of the spectra for cross-dispersed spectrographs as well as for a segmented-aperture long-slit spectrograph. The software can potentially be used by the WSO-UV project. To test our algorithms and the software package, we performed simulations of the ground-based cross-dispersed Nasmyth Echelle Spectrometer (NES) installed on the Nasmyth focus platform of the Russian 6-meter BTA telescope. The comparison of model images of stellar spectra with observations from this instrument confirms that the software works well. A high degree of agreement between the theoretical and real spectra is shown.
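    The central relation behind ray tracing a cross-dispersed (echelle) spectrograph is the grating equation, m·λ = d·(sin α + sin β), evaluated at each disperser surface. A small sketch (plain Python; the groove density, order and angles are illustrative round numbers, not NES parameters):

```python
import math

def diffraction_angle(wavelength_nm, order, grooves_per_mm, incidence_deg):
    """Solve the grating equation m*lambda = d*(sin alpha + sin beta) for beta (degrees)."""
    d_nm = 1e6 / grooves_per_mm  # groove spacing in nm
    s = order * wavelength_nm / d_nm - math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        raise ValueError("this order is not diffracted at this geometry")
    return math.degrees(math.asin(s))

# Illustrative R2-style echelle: tan(blaze) = 2, i.e. blaze angle ~63.4 deg
beta = diffraction_angle(wavelength_nm=500.0, order=100, grooves_per_mm=37.5,
                         incidence_deg=63.4)
print(round(beta, 2))  # diffraction angle near the blaze for this order
```

    A full simulator traces each ray through collimator, echelle, cross-disperser and camera optics before accumulating it on the CCD grid; the module structure described in the abstract separates that tracing from the efficiency and image-formation steps.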

  1. Historical Approach to the Role of Women in the Legislation of Iran: A Case Study on the Twenty-First Parliament

    Directory of Open Access Journals (Sweden)

    Sarah Sheibani

    2017-01-01

    Full Text Available One hundred and ten years ago, men and women in Iran embraced constitutionalism to achieve justice. The National Council was the result of the Iranian people's struggle for justice, both women's and men's. From the beginning of legislation, however, men's policies classed women with minors, the insane and the bankrupt, and barred them from voting. Nevertheless, the Constitutional Revolution, as a turning point and a national revolution, played a key role in changing attitudes toward women and provided a structural context for their participation. In this paper, using descriptive-analytical as well as quantitative methods, we sought to answer the question of what the position of women was in the twenty-first Parliament. The results of this study suggest that once Iranian women were allowed to participate in politics, they were able to demonstrate their political ability, as seen in the twenty-first Parliament, in which women had twenty-two percent participation.

  2. Change and Continuity in Librarianship: Approaching the Twenty-First Century. Proceedings of the 40th Military Librarians Workshop, 20-22 November 1996, Annapolis, Maryland,

    Science.gov (United States)

    1996-11-01

    Garden Hotel on 20-22 November 1996. The theme was Change and Continuity in Librarianship: Approaching the Twenty-first Century. The program featured...text is printed out, there are enormous economic and ecological disadvantages to the all-digital library. Strike one. Omnipresent Electronics Whatever...a change of address. WINGS is also the prototype of a proposed kiosk system which would place kiosks with these services in local malls, libraries

  3. Book review: The superlative city: Dubai and the urban condition in the early twenty-first century edited by Ahmed Kanna

    OpenAIRE

    Housby, Elaine

    2014-01-01

    "The Superlative City: Dubai and the Urban Condition in the Early Twenty-First Century." Ahmed Kanna. Harvard University Press. August 2013. --- In the last few years, the Persian Gulf city of Dubai has exploded from the Arabian sands onto the world stage. Oil wealth, land rent, and so-called informal economic practices have blanketed the urbanscape with enormous enclaved developments attracting a global elite, while the economy runs on a huge army of migrant workers from the labour-exporting...

  4. Simulation Modeling of Space Missions Using the High Level Architecture

    Directory of Open Access Journals (Sweden)

    Luis Rabelo

    2013-01-01

    Full Text Available This paper discusses an environment being developed to model a mission of the Space Launch System (SLS) and the Multipurpose Crew Vehicle (MPCV) being launched from Kennedy Space Center (KSC) to the International Space Station (ISS). Several models representing different phases of the mission such as the ground operations processes, engineered systems, and range components such as failure tree, blast, gas dispersion, and debris modeling are explained. These models are built using different simulation paradigms such as continuous, system dynamics, discrete-event, and agent-based simulation modeling. The High Level Architecture (HLA) is the backbone of this distributed simulation. The different design decisions and the information fusion scheme of this unique environment are explained in detail for decision-making. This can also help in the development of exploration missions beyond the International Space Station.
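    The HLA backbone coordinates federates that advance simulation time conservatively: a federate is granted an advance only once no other federate can still send it an event in its past. The following is a toy schematic of that idea only (plain Python; it is not the HLA/RTI API, and all class and federate names are invented):

```python
class Federate:
    """One member simulation (e.g. ground ops, range model) in the federation."""
    def __init__(self, name, lookahead):
        self.name = name
        self.time = 0.0
        self.lookahead = lookahead  # promise: no events earlier than time + lookahead
        self.requested = 0.0

    def request_advance(self, t):
        self.requested = t

class ToyCoordinator:
    """Toy stand-in for the RTI's conservative time-advance grant logic."""
    def __init__(self, federates):
        self.federates = federates

    def grant(self):
        # A federate may advance to the minimum of its request and every
        # other federate's lower bound time stamp (time + lookahead).
        grants = []
        for fed in self.federates:
            lbts = min(f.time + f.lookahead
                       for f in self.federates if f is not fed)
            grants.append(min(fed.requested, lbts))
        for fed, t in zip(self.federates, grants):
            fed.time = t

ground_ops = Federate("ground-ops", lookahead=5.0)
range_model = Federate("range", lookahead=2.0)
rti = ToyCoordinator([ground_ops, range_model])
ground_ops.request_advance(10.0)
range_model.request_advance(10.0)
rti.grant()
print(ground_ops.time, range_model.time)  # 2.0 5.0: each capped by the other's LBTS
```

    The real standard (IEEE 1516) layers object publication, ownership and event ordering on top of this time-management idea, which is what lets continuous, discrete-event and agent-based federates interoperate in one mission simulation.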

  5. Cold & Black Environment Design in Large Space Simulator

    Science.gov (United States)

    Min, Liu; Botao, Liu; Zijuan, Wang; Weiwei, Shan; Wenjing, Ding

    A space simulator provides a spacecraft with a specified environment during a thermal test, of which a cold & black background is one of the important technical specifications. A shroud and nitrogen system used to simulate a cold & black environment, with an effective space of 8500 mm × 9000 mm, are studied in this article. In designing the shroud of the large space simulator, we must consider not only heat exchange and temperature uniformity, but also the feasibility of manufacture, transportation and installation. The cooling system adopts a single-phase closed-loop cycle. Based on the results of the test, it can be concluded that the test data agree with the computational simulation results. The average temperature is 90 K and the temperature uniformity of the shroud meets the technical requirement.
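    The reason a 90 K shroud is an adequate stand-in for the ~3 K sky follows directly from the Stefan-Boltzmann law: at 90 K the background returns only (90/300)^4 ≈ 1% of the flux a room-temperature test article emits. A quick estimate (plain Python; the emissivity and temperatures are illustrative, not from this article):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(t_hot, t_cold, emissivity=0.9):
    """Net flux (W/m^2) from a surface at t_hot into an enclosing black shroud at t_cold.
    Assumes the shroud fully surrounds the test article (view factor ~1)."""
    return emissivity * SIGMA * (t_hot**4 - t_cold**4)

q_shroud = net_radiative_flux(300.0, 90.0)  # heat sunk by the 90 K shroud
q_space = net_radiative_flux(300.0, 3.0)    # heat sunk by deep space
print(round(q_shroud, 1), round(q_space, 1), round(q_shroud / q_space, 4))
```

    The ratio comes out above 0.99, i.e. a nitrogen-cooled 90 K shroud absorbs over 99% of what deep space would, which is why 90 K suffices as a "black" background for thermal tests.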

  6. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  7. The need of formation anthropocosmos pedagogy in the twenty-first century (philosophical and educational, pedagogical and spiritual aspects

    Directory of Open Access Journals (Sweden)

    Natalia V. Polischuk

    2016-02-01

    Full Text Available The article presents a definition of the subject of «anthropocosmic pedagogy» in contemporary philosophical and pedagogical discourse. The author's interpretation of approaches to the explication of the key terms and concepts of anthropocosmic pedagogy is given, and their philosophical-educational, pedagogical and spiritual essence is disclosed. It is argued that, given the need to promote a high-quality transition of the intelligent matter of the Earth from its planetary state into a cosmic force, it is necessary to ensure the resettlement and reproduction of the intelligent matter of the Earth on the scale of the Solar system, with the prospect of reaching galactic and metagalactic spaces. For this, however, cosmic education must be implemented, which means forming a concept of the highly spiritual and moral personality of a future highly advanced space-exoplanet civilization on the basis of an anthropocosmic philosophy of education and pedagogy. The content components of the new anthropocosmic concepts of an anthropocosmic information and high-tech civilization are presented within professional, phenomenal-ideological and synergistic approaches, as well as their synthesis. On the basis of comparative analysis, the main characteristics of the newly introduced terms and concepts of the anthropocosmic philosophy of education and pedagogy are presented.

  8. Design for the simulation of space based information network

    Institute of Scientific and Technical Information of China (English)

    Zeng Bin; Li Zitang; Wang Wei

    2006-01-01

    Ongoing research is described that is focused on modelling the space-based information network and simulating its behaviour: the simulation of space-based communications and networking project. Its objective is to demonstrate the feasibility of producing a tool that can provide a performance evaluation of various constellation access techniques and routing policies. The architecture and design of the simulation system are explored. The algorithms for data routing and instrument scheduling in this project are described. In addition, the key methodologies for simulating inter-satellite link features in data transmissions are discussed. The performance of both the instrument scheduling algorithm and the routing schemes is evaluated and analyzed through extensive simulations under a typical scenario.
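    Routing policies like those evaluated here are commonly baselined against shortest-path routing over a snapshot of the constellation's link graph. A sketch (plain Python Dijkstra; the topology, node names and latencies are invented for illustration, not taken from this project):

```python
import heapq

def shortest_path(links, src, dst):
    """Dijkstra over a link graph; links[node] = [(neighbor, latency_ms), ...]."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, cost in links.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    path, node = [], dst
    while node != src:  # walk predecessors back to the source
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

# Hypothetical constellation snapshot: ground station -> satellites -> ground station
links = {
    "GS-A":  [("SAT-1", 20.0), ("SAT-2", 25.0)],
    "SAT-1": [("SAT-2", 8.0), ("SAT-3", 12.0)],
    "SAT-2": [("SAT-3", 6.0)],
    "SAT-3": [("GS-B", 18.0)],
}
route, latency = shortest_path(links, "GS-A", "GS-B")
print(route, latency)  # ['GS-A', 'SAT-2', 'SAT-3', 'GS-B'] 49.0
```

    In a real constellation simulator the link graph changes as satellites move and links break, so routes must be refreshed each timestep or handled with delay-tolerant schemes; that dynamism is what the inter-satellite-link methodologies above address.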

  9. Projected changes of summer monsoon extremes and hydroclimatic regimes over West Africa for the twenty-first century

    Science.gov (United States)

    Diallo, Ismaïla; Giorgi, Filippo; Deme, Abdoulaye; Tall, Moustapha; Mariotti, Laura; Gaye, Amadou T.

    2016-12-01

    We use two CORDEX-Africa simulations performed with the regional model RegCM4 to characterize the projected changes in extremes and hydroclimatic regimes associated with the West African Monsoon (WAM). RegCM4 was driven for the period 1970-2100 by the HadGEM2-ES and the MPI-ESM Global Climate Models (GCMs) under the RCP8.5 greenhouse gas concentration pathway. RegCM4 accurately simulates the WAM characteristics in terms of seasonal mean, seasonal cycle, interannual variability and extreme events of rainfall. Overall, both RegCM4 experiments are able to reproduce the large-scale atmospheric circulation for the reference period (i.e. present-day), and in fact show improved performance compared to the driving GCMs in terms of precipitation mean climatology and extreme events, although different shortcomings in the various models are still evident. Precipitation is projected to decrease (increase) over the western (eastern) Sahel, although with different spatial detail between RegCM4 and the corresponding driving GCMs. Changes in extreme precipitation events show patterns in line with those of the mean change. The models project different changes in the water budget over the Sahel region, where the MPI-driven run projects an increased deficit in the local moisture supply (E-P). The E-P change is primarily precipitation driven. The precipitation increases over the eastern and/or central Sahel are attributed to the increase of moisture convergence due to increased water vapor in the boundary-layer air column and surface evaporation. On the other hand, the projected dry conditions over the western Sahel are associated with the strengthening of moisture divergence at upper levels (850-300 hPa), combined with both a southward migration of the African Easterly Jet (AEJ) and a weakening of rising motion between the core of the AEJ and the Tropical Easterly Jet.

  10. Will the tropical land biosphere dominate the climate-carbon cycle feedback during the twenty-first century?

    Energy Technology Data Exchange (ETDEWEB)

    Raddatz, T.J.; Reick, C.H. [Max Planck Institute for Biogeochemistry, Jena (Germany); Max Planck Institute for Meteorology, Hamburg (Germany); Knorr, W. [Max Planck Institute for Biogeochemistry, Jena (Germany); QUEST, University of Bristol, Bristol (United Kingdom); Kattge, J. [Max Planck Institute for Biogeochemistry, Jena (Germany); Roeckner, E.; Schnur, R.; Schnitzler, K.G.; Wetzel, P.; Jungclaus, J. [Max Planck Institute for Meteorology, Hamburg (Germany)

    2007-11-15

    Global warming caused by anthropogenic CO₂ emissions is expected to reduce the capability of the ocean and the land biosphere to take up carbon. This will enlarge the fraction of the CO₂ emissions remaining in the atmosphere, which in turn will reinforce future climate change. Recent model studies agree in the existence of such a positive climate-carbon cycle feedback, but the estimates of its amplitude differ by an order of magnitude, which considerably increases the uncertainty in future climate projections. Therefore we discuss, in how far a particular process or component of the carbon cycle can be identified, that potentially contributes most to the positive feedback. The discussion is based on simulations with a carbon cycle model, which is embedded in the atmosphere/ocean general circulation model ECHAM5/MPI-OM. Two simulations covering the period 1860-2100 are conducted to determine the impact of global warming on the carbon cycle. Forced by historical and future carbon dioxide emissions (following the scenario A2 of the Intergovernmental Panel on Climate Change), they reveal a noticeable positive climate-carbon cycle feedback, which is mainly driven by the tropical land biosphere. The oceans contribute much less to the positive feedback and the temperate/boreal terrestrial biosphere induces a minor negative feedback. The contrasting behavior of the tropical and temperate/boreal land biosphere is mostly attributed to opposite trends in their net primary productivity (NPP) under global warming conditions. As these findings depend on the model employed they are compared with results derived from other climate-carbon cycle models, which participated in the Coupled Climate-Carbon Cycle Model Intercomparison Project (C4MIP). (orig.)

  11. THE DISPUTE BETWEEN POLITICAL THEOLOGY AND THE POLITICS OF THEOLOGY IN THE TWENTY-FIRST CENTURY ON THE MEANINGS OF THE POSTMODERN GLOBALIZING AND INDIVIDUALISTIC SOCIETY AND THE CHRISTIAN PERSONALIST GLOBALITY

    Directory of Open Access Journals (Sweden)

    Stelian MANOLACHE

    2016-05-01

    Full Text Available Upon the dawn of postmodernity, in the twenty-first century, we witness the emergence of a new way of thinking and of new forms of culture and life, under the ideology of globalism, whose dominance is given by the practicality and utility related to civilization, and under globality, which is the cultural aspect of globalization, pertaining to the field of culture. The two dimensions of globalization and globality, civilizational and cultural, will (re)question the principal relationship between Christianity and the new postmodern globalizing utopia, requiring us to (re)consider the sense and presence of Christianity within the world, and the appropriate sociological figure of the Church, within the new reality of global and globalized humanity in the postmodern public space. This paper deals with this ideology - globalism - and the cultural manifestation of globality, and with the Orthodox answer to the new challenge of individualism and postmodern globalizing (neo)collectivism.

  13. Religion and decolonial feminism: The protagonisms and the new religious assemblages of women in the twenty-first century

    Directory of Open Access Journals (Sweden)

    Anete Roese

    2015-10-01

    Full Text Available Religions, and research about them, were significantly affected by feminist practices and studies in the twentieth century. In the religious context of this third millennium, marked by the autonomy of women and their role in society, further studies are needed to understand the religious phenomenon that occurs in the silent protagonism of women. One has to ask how to research and think about religion from a feminist perspective at this time; what religion is for women, and how women experience religion and appropriate it in the third millennium. The new religious practices, the connection of women to religion or their ruptures with religions, their resistances and active subjectivity as alternatives to the traditional spaces circumscribed by religious patriarchy, as well as the issue of women's autonomy and responsibility in the construction of spiritual and religious alternatives in contemporary society, all deserve attention. This text aims to present signs of these protagonist movements of women, especially in the Christian context of present-day Brazil, stating hypotheses and presenting reflections on this reality. The text dialogues with decolonial feminism and has implications for the ways of conceptualizing and studying religion beyond its institutionalized forms.

  14. Desdemona and a ticket to space; training for space flight in a 3g motion simulator

    NARCIS (Netherlands)

    Wouters, M.

    2014-01-01

    On October 5, 2013, Marijn Wouters and two other contestants of a nation-wide competition ‘Nederland Innoveert’ underwent a space training exercise. One by one, the trainees were pushed to their limits in the Desdemona motion simulator, an experience that mimicked the Space Expedition Corporation (S

  15. A New Paradigm Is Needed for Medical Education in the Mid-Twenty-First Century and Beyond: Are We Ready?

    Directory of Open Access Journals (Sweden)

    Dan E. Benor

    2014-07-01

    Full Text Available The twentieth century witnessed profound changes in medical education. All these changes, however, took place within the existing framework, suggested by Flexner a century ago. The present paper suggests that we are approaching a singularity point, where we shall have to change the paradigm and be prepared for an entirely new genre of medical education. This suggestion is based upon analysis of existing and envisaged trends: first, in technology, such as availability of information and sophisticated simulations; second, in medical practice, such as far-reaching interventions in life and death that create an array of new moral dilemmas, as well as a change in patient mix in hospitals and a growing need of team work; third, in the societal attitude toward higher education. The structure of the future medical school is delineated in a rough sketch, and so are the roles of the future medical teacher. It is concluded that we are presently not prepared for the approaching changes, neither from practical nor from attitudinal points of view, and that it is now high time for both awareness of and preparation for these changes.

  16. A model study of factors influencing projected changes in regional sea level over the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Pardaens, Anne K.; Lowe, J.A. [Met Office, Hadley Centre, Exeter, Devon (United Kingdom); Gregory, J.M. [Met Office, Hadley Centre, Exeter, Devon (United Kingdom); University of Reading, Department of Meteorology, Walker Institute for Climate System Research, Earley Gate, PO Box 243, Reading (United Kingdom)

    2011-05-15

    In addition to projected increases in global mean sea level over the 21st century, model simulations suggest there will also be changes in the regional distribution of sea level relative to the global mean. There is a considerable spread in the projected patterns of these changes by current models, as shown by the recent Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment (AR4). This spread has not reduced from that given by the Third Assessment models. Comparison with projections by ensembles of models based on a single structure supports an earlier suggestion that models of similar formulation give more similar patterns of sea level change. Analysing an AR4 ensemble of model projections under a business-as-usual scenario shows that steric changes (associated with subsurface ocean density changes) largely dominate the sea level pattern changes. The relative importance of subsurface temperature or salinity changes in contributing to this differs from region to region and, to an extent, from model-to-model. In general, thermosteric changes give the spatial variations in the Southern Ocean, halosteric changes dominate in the Arctic and strong compensation between thermosteric and halosteric changes characterises the Atlantic. The magnitude of sea level and component changes in the Atlantic appear to be linked to the amount of Atlantic meridional overturning circulation (MOC) weakening. When the MOC weakening is substantial, the Atlantic thermosteric patterns of change arise from a dominant role of ocean advective heat flux changes. (orig.)
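    The dominant steric term can be illustrated at first order: a column of depth H warming uniformly by ΔT expands by roughly Δh ≈ α·ΔT·H, where α is the thermal expansion coefficient of seawater. A back-of-envelope sketch (plain Python; α is treated as a representative constant here, though in reality it varies with temperature, salinity and pressure, which is exactly why regional patterns differ between models):

```python
def thermosteric_rise(layer_depth_m, delta_t_k, alpha=2.0e-4):
    """First-order steric height change (m) from warming a layer uniformly.
    alpha: representative thermal expansion coefficient of seawater, 1/K."""
    return alpha * delta_t_k * layer_depth_m

# Warming the top 700 m by 1 K gives ~0.14 m of thermosteric sea-level rise
print(round(thermosteric_rise(700.0, 1.0), 3))
```

    Halosteric changes work the same way through the haline contraction coefficient, with the opposite sign for freshening, which is the basis of the thermosteric/halosteric compensation the abstract describes in the Atlantic.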

  17. Ocean (de)oxygenation from the Last Glacial Maximum to the twenty-first century: insights from Earth System models.

    Science.gov (United States)

    Bopp, L; Resplandy, L; Untersee, A; Le Mezo, P; Kageyama, M

    2017-09-13

    All Earth System models project a consistent decrease in the oxygen content of oceans for the coming decades because of ocean warming, reduced ventilation and increased stratification. But large uncertainties for these future projections of ocean deoxygenation remain for the subsurface tropical oceans where the major oxygen minimum zones are located. Here, we combine global warming projections, model-based estimates of natural short-term variability, as well as data and model estimates of the Last Glacial Maximum (LGM) ocean oxygenation to gain some insights into the major mechanisms of oxygenation changes across these different time scales. We show that the primary uncertainty on future ocean deoxygenation in the subsurface tropical oceans is in fact controlled by a robust compensation between decreasing oxygen saturation (O2sat) due to warming and decreasing apparent oxygen utilization (AOU) due to increased ventilation of the corresponding water masses. Modelled short-term natural variability in subsurface oxygen levels also reveals a compensation between O2sat and AOU, controlled by the latter. Finally, using a model simulation of the LGM, reproducing data-based reconstructions of past ocean (de)oxygenation, we show that the deoxygenation trend of the subsurface ocean during deglaciation was controlled by a combination of warming-induced decreasing O2sat and increasing AOU driven by a reduced ventilation of tropical subsurface waters. This article is part of the themed issue 'Ocean ventilation and deoxygenation in a warming world'. © 2017 The Author(s).
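    The compensation described here rests on the identity O2 = O2sat − AOU: in-situ oxygen is the saturation value minus the apparent oxygen utilization. A toy illustration (plain Python; the concentrations are invented round numbers, not data from the study):

```python
def dissolved_oxygen(o2sat, aou):
    """In-situ oxygen (umol/kg) from the identity O2 = O2sat - AOU."""
    return o2sat - aou

# Warming lowers the saturation term, but stronger ventilation lowers AOU
# by nearly as much, so the net subsurface O2 change stays small.
o2_present = dissolved_oxygen(o2sat=220.0, aou=120.0)
o2_future = dissolved_oxygen(o2sat=210.0, aou=112.0)
print(o2_present, o2_future, o2_future - o2_present)  # 100.0 98.0 -2.0
```

    Because the two terms move together, the sign and size of the net change depend on which term dominates, which is why projections for the tropical oxygen minimum zones remain uncertain even when each term is individually robust.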

  18. Agriculture in West Africa in the Twenty-first Century: climate change and impacts scenarios, and potential for adaptation

    Directory of Open Access Journals (Sweden)

    Benjamin Sultan

    2016-08-01

    Full Text Available West Africa is known to be particularly vulnerable to climate change due to high climate variability, high reliance on rain-fed agriculture and limited economic and institutional capacity to respond to climate variability and change. In this context, better knowledge of how climate will change in West Africa and how such changes will impact crop productivity is crucial to inform policies that may counteract the adverse effects. This review paper provides a comprehensive overview of climate change impacts on agriculture in West Africa based on the recent scientific literature. West Africa is now experiencing rapid climate change, characterized by widespread warming, a recovery of the monsoonal precipitation, and an increase in the occurrence of climate extremes. The observed climate tendencies are also projected to continue in the 21st century under moderate and high emission scenarios, although large uncertainties still affect simulations of the future West African climate, especially regarding the summer precipitation. However, despite diverging future projections of the monsoonal rainfall, which is essential for rain-fed agriculture, robust evidence of yield loss in West Africa emerges. This yield loss is mainly driven by increased mean temperature, while potentially wetter or drier conditions as well as elevated CO2 concentrations can modulate this effect. Potential for adaptation is illustrated for major crops in West Africa through a selection of studies based on process-based crop models to adjust cropping systems (changes in varieties, sowing dates and density, irrigation, fertilizer management) to future climate. Results of the cited studies are crop and region specific, and no clear conclusions can be drawn regarding the most effective adaptation options. Further efforts are needed to improve modelling of the monsoon system and to better quantify the uncertainty in its changes under a warmer climate, the response of the

  19. Beyond conventional energy use: A regionally based end-use approach for the twenty-first century

    Science.gov (United States)

    Feder, Deborah R.

    In the United States, the dominant energy discourse is supply-oriented and focused on the large-scale use of fossil fuel and nuclear electricity energy resources. Fossil fuels and nuclear electricity are valued for the convenience, quality of life, and services that they provide. Despite these qualities, the laws of thermodynamics tell us that fossil fuels and nuclear electricity are not required for all end-use needs. Concentrated high quality energy sources such as fossil fuels and electricity are degraded when used for tasks such as water and space heating. The degraded energy is released into the environment as waste heat and pollution and contributes to scarcity in many realms. This dissertation suggests an alternative discourse on energy that calls on three frameworks of thinking: the nexus of relations, end-use analysis, and regional geography. The nexus of relations is a device for showing how different relations in society construct uses of energy that lead most naturally to scarcity and environmental degradation. End-use analysis is a framework for matching energy sources and end-uses based on thermodynamic quality, and regional geography is useful for identifying localized renewable energy sources and end-use needs. By combining these three approaches, a new framework has been created that matches thermodynamically appropriate renewable resources to end-use needs. This approach offers a new perspective on resource use that emphasizes how energy demands can be met, while minimizing scarcity and environmental degradation. To illustrate this regionally based end-use framework, a case study was conducted at three sites within Centre County, Pennsylvania. At each study site, the flux density of solar, wind, and water resources was evaluated and matched with local end-use needs. 
This exercise resulted in several findings: one, fossil fuel/nuclear electricity savings are possible at each study site; two, geographically-specific renewable resources can be used to
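The end-use matching idea, serving each demand with the lowest thermodynamic-quality source that can still meet it, can be sketched in a few lines. The quality ranks and categories below are illustrative inventions, not the dissertation's data:

```python
# Illustrative sketch (my construction, not the dissertation's method) of
# end-use matching: serve each demand with the lowest-quality source that
# still meets it, reserving high-quality energy (electricity) for tasks
# that require it, instead of degrading it into low-grade heat.

# Rough thermodynamic-quality ranks (higher = more ordered/exergetic); assumed.
SOURCE_QUALITY = {"solar thermal": 1, "wind electricity": 3, "grid electricity": 3}
END_USE_QUALITY = {"space heating": 1, "water heating": 1, "lighting": 3}

def match_end_uses(sources, end_uses):
    """Map each end-use to the lowest-quality source that satisfies it."""
    plan = {}
    for use, need in sorted(end_uses.items(), key=lambda kv: kv[1]):
        candidates = [s for s, q in sources.items() if q >= need]
        plan[use] = min(candidates, key=lambda s: sources[s]) if candidates else None
    return plan

plan = match_end_uses(SOURCE_QUALITY, END_USE_QUALITY)
# Heating loads draw on low-grade solar thermal; lighting keeps electricity.
```

The design choice mirrors the thermodynamic argument in the abstract: burning fuel or spending electricity on space heating "works" but wastes exergy, so the matcher only escalates to a high-quality source when the end-use genuinely demands it.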

  20. Planetary and Space Simulation Facilities (PSI) at DLR

    Science.gov (United States)

    Panitz, Corinna; Rabbow, E.; Rettberg, P.; Kloss, M.; Reitz, G.; Horneck, G.

    2010-05-01

    The Planetary and Space Simulation facilities at DLR offer the possibility to expose biological and physical samples, individually or integrated into space hardware, to defined and controlled space conditions such as ultra-high vacuum, low temperature and extraterrestrial UV radiation. An X-ray facility is available for simulating the ionizing radiation component. All of the simulation facilities are required for the preparation of space experiments: - for testing newly developed space hardware - for investigating the effect of different space parameters on biological systems in preparation for the flight experiment - for performing the 'Experiment Verification Tests' (EVTs) for the specification of the test parameters - and the 'Experiment Sequence Tests' (ESTs), simulating sample assembly, exposure to selected space parameters, and sample disassembly. To test the compatibility of the different biological and chemical systems and their adaptation to the opportunities and constraints of space conditions, an extensive ground support program has been developed, among many others for the ESA facilities of the ongoing missions EXPOSE-R and EXPOSE-E on board the International Space Station (ISS). Several Experiment Verification Tests (EVTs) and an Experiment Sequence Test (EST) have been conducted in the carefully equipped and monitored planetary and space simulation facilities (PSI) of the Institute of Aerospace Medicine at DLR in Cologne, Germany. These ground-based pre-flight studies allowed the investigation of a much wider variety of samples and the selection of the most promising organisms for the flight experiment. EXPOSE-E was attached to the outer balcony of the European Columbus module of the ISS in February 2008 and stayed in space for 1.5 years; EXPOSE-R was attached to the Russian Zvezda module of the ISS in spring 2009 and its mission duration will be approximately 1.5 years. The missions will give new insights into the survivability of terrestrial

  1. Simulation of space charge effects in resistive plate chambers

    CERN Document Server

    Lippmann, Christian

    2003-01-01

    Multigap resistive plate chambers with 0.3-mm gas gaps operated in avalanche mode at atmospheric pressure have reached timing accuracies below 50 ps (standard deviation) with efficiencies above 99%. The avalanches in strong homogeneous electric fields of 100 kV/cm are strongly influenced by space charge effects, which are the main topic of this paper. We extend a previously discussed Monte Carlo simulation model of avalanches in resistive plate chambers by the dynamic calculation of the electric field in the avalanches. We complete the previously presented results on time resolution and efficiency data with simulated charge spectra. The simulated data show good agreement with measurements. The detailed simulation of the avalanche saturation due to the space charge fields explains the small observed charges, the shape of the spectra, and the linear increase of average charges with high voltage. (22 refs).
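The saturation mechanism described in this abstract can be caricatured in a few lines: a one-dimensional avalanche whose effective Townsend coefficient is throttled as the electron count approaches a space-charge limit. This is a toy ansatz for illustration, not Lippmann's Monte Carlo model; `alpha0` and `n_sat` are assumed values:

```python
import math

# Toy 1D avalanche sketch (not the paper's model): growth dn/dx = alpha_eff * n
# across one gas gap, with the effective Townsend coefficient reduced as the
# avalanche's own space-charge field builds up.
# Saturation ansatz (assumed): alpha_eff = alpha0 * (1 - n / n_sat).

alpha0 = 20.0     # unsaturated gain per gap, so exp(20) ~ 5e8 electrons (assumed)
n_sat = 1.0e7     # electron count at which space charge quenches growth (assumed)
steps = 100
dx = 1.0 / steps  # traverse the gap in 100 equal slices

n = 1.0           # seed: a single primary electron
for _ in range(steps):
    alpha_eff = alpha0 * max(0.0, 1.0 - n / n_sat)
    n *= math.exp(alpha_eff * dx)

# Growth stalls near n_sat instead of reaching the unsaturated exp(alpha0):
print(f"avalanche size after the gap: {n:.3g} electrons")
```

Even this crude feedback keeps the final avalanche orders of magnitude below the unsaturated exp(alpha0), in the spirit of the "small observed charges" the abstract attributes to space-charge saturation.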

  2. Types of social media (Web 2.0) used by Australian allied health professionals to deliver early twenty-first-century practice promotion and health care.

    Science.gov (United States)

    Usher, Wayne

    2011-01-01

    Types of social media (Web 2.0) usage associated with eight of Australia's major allied health professions (AHPs, n = 935) were examined. Australian AHPs are interacting with Web 2.0 technologies for personal use but are failing to implement such technologies throughout their health professions to deliver health care. Australian AHPs are willing to undertake online educational courses designed to upskill them in how Web 2.0 may be used for practice promotion and health care delivery in the early twenty-first century. Participants in this study indicated that educational courses offered online would be the preferred mode of delivery.

  3. Does the Common Agricultural Policy still make sense in the twenty-first century? CAP after 2013 from the perspective of Poland and Hungary

    Directory of Open Access Journals (Sweden)

    Elżbieta Daszkowska

    2009-01-01

    Full Text Available The EU CAP has developed immensely since the 1960s. However, its current determinants are completely different from those which formed the CAP foundations. This results mainly from the fact that the EU CAP must meet present-day challenges and threats. Moreover, further EU enlargements also significantly influenced the performance of this sector of the economy. It is important to determine whether the existence of the CAP in the twenty-first century still makes sense and to specify in more detail the CAP reform directions after 2013 from the perspective of Poland and Hungary.

  4. The development of a combined effects space simulation facility

    Energy Technology Data Exchange (ETDEWEB)

    Maldonado, Carlos A.; Lilly, Taylor C.; Ketsdever, Andrew D. [University of Colorado, Colorado Springs, Department of Mechanical and Aerospace Engineering, Colorado Springs, CO 80918 (United States)

    2012-11-27

    An overview of the development of a facility to study the combined effects of the space environment on spacecraft is presented. The characterization of a magnetic filter plasma source and a low-energy electron flood source for the simulation of the low Earth orbit plasma environment is discussed. Plasma diagnostics show that the magnetic filter plasma source provides streaming ion energies of approximately 5 eV and can supply the appropriate density for LEO simulation. Additionally, the low-energy flood gun is shown to provide the appropriate density for LEO simulation as a function of altitude and solar activity.

  5. The Planetary and Space Simulation Facilities at DLR Cologne

    Science.gov (United States)

    Rabbow, Elke; Parpart, André; Reitz, Günther

    2016-06-01

    Astrobiology strives to increase our knowledge on the origin, evolution and distribution of life, on Earth and beyond. In the past centuries, life has been found on Earth in environments with extreme conditions that were expected to be uninhabitable. Scientific investigations of the underlying metabolic mechanisms and strategies that lead to the high adaptability of these extremophile organisms increase our understanding of the evolution and distribution of life on Earth. Life as we know it depends on the availability of liquid water. Exposure of organisms to defined and complex extreme environmental conditions, in particular those that limit water availability, allows investigation of survival mechanisms as well as estimation of the possibility that selected organisms could be distributed to, and survive on, other celestial bodies. Space missions in low Earth orbit (LEO) provide access for experiments to complex environmental conditions not available on Earth, but studies on the molecular and cellular mechanisms of adaptation to these hostile conditions and on the limits of life cannot be performed exclusively in space experiments. Experimental space is limited and allows only the investigation of selected endpoints. An additional intensive ground-based program is required, with easy-to-access facilities capable of simulating space and planetary environments, in particular with a focus on temperature, pressure, atmospheric composition and short-wavelength solar ultraviolet radiation (UV). DLR Cologne operates a number of Planetary and Space Simulation facilities (PSI) where microorganisms from extreme terrestrial environments or known for their high adaptability are exposed for mechanistic studies. Space or planetary parameters are simulated individually or in combination in temperature-controlled vacuum facilities equipped with a variety of defined and calibrated irradiation sources. The PSI support basic research and were recurrently used for pre

  6. Book review of Capital in the Twenty-First Century, by Thomas Piketty. Cambridge, Massachusetts, London, England: The Belknap Press of Harvard Press, 2014, 605 pages

    Directory of Open Access Journals (Sweden)

    Paul Dobrescu

    2015-04-01

    Full Text Available “Every now and then, the field of economics produces an important book; this is one of them” (Cowen, 2014). These are the opening words of Tyler Cowen’s presentation of Thomas Piketty’s work, “Capital in the Twenty-First Century” (Piketty, 2014), in Foreign Affairs. This is a book that is visibly placed in all important bookstores around the world, widely debated, acclaimed, and sold (over 1 million copies have been sold so far). It has been favorably reviewed or quoted in all major journals. The assessment of “Capital in the Twenty-First Century” by Paul Krugman, Nobel Economics Prize Laureate, as a “magnificent, sweeping meditation on inequality” is highly relevant: “This is a book that will change both the way we think about society and the way we do economics” (Krugman, 2014). Finally, Piketty’s book is included in the list of the year’s best books by prestigious journals, such as The Economist, Financial Times, The Washington Post, Observer, The Independent, and Daily Telegraph; Financial Times and McKinsey have hailed it as the best book of 2014.

  7. A Conservation Ethic and the Collecting of Animals by Institutions of Natural Heritage in the Twenty-First Century: Case Study of the Australian Museum.

    Science.gov (United States)

    Ikin, Timothy

    2011-02-15

    Collecting of animals from their habitats for preservation by museums and related bodies is a core operation of such institutions. Conservation of biodiversity in the current era is a priority in the scientific agendas of museums of natural heritage in Australia and the world. Intuitively, to take animals from the wild, while engaged in scientific or other practices that are supposed to promote their ongoing survival, may appear to be incompatible. The Australian Museum presents an interesting ground to consider zoological collecting by museums in the twenty-first century. Anderson and Reeves in 1994 argued that a milieu existed that undervalued native species, and that the role of natural history museums, up to as late as the mid-twentieth century, was only to make a record of the faunal diversity of Australia, which would inevitably become extinct. Despite the latter, conservation of Australia's faunal diversity is a key aspect of research programmes in Australia's institutions of natural heritage in the current era. This paper analyses collecting of animals, a core task for institutions of natural heritage, and how this interacts with a professed "conservation ethic" in a twenty-first century Australian setting.

  8. High School Students' Perceptions of the Effects of International Science Olympiad on Their STEM Career Aspirations and Twenty-First Century Skill Development

    Science.gov (United States)

    Sahin, Alpaslan; Gulacar, Ozcan; Stuessy, Carol

    2015-12-01

    Social cognitive theory guided the design of a survey to investigate high school students' perceptions of factors affecting their career contemplations and beliefs regarding the influence of their participation in the international Science Olympiad on their subject interests and twenty-first century skills. In addition, gender differences in students' choice of competition category were studied. Mixed methods analysis of survey returns from 172 Olympiad participants from 31 countries showed that students' career aspirations were affected most by their teachers, personal interests, and parents, respectively. Students also indicated that they believed that their participation in the Olympiad reinforced their plan to choose a science, technology, engineering, and mathematics (STEM) major at college and assisted them in developing and improving their twenty-first century skills. Furthermore, female students' responses indicated that their project choices were less likely to be in the engineering category and more likely to be in the environment or energy categories. Findings are discussed in the light of increasing the awareness of the role and importance of Science Olympiads in STEM career choice and finding ways to attract more female students into engineering careers.

  9. How Has Elderly Migration Changed in the Twenty-First Century? What the Data Can-and Cannot-Tell Us.

    Science.gov (United States)

    Conway, Karen Smith; Rork, Jonathan C

    2016-08-01

    Interstate elderly migration has strong implications for state tax policies and health care systems, yet little is known about how it has changed in the twenty-first century. Its relative rarity requires a large data set with which to construct reliable measures, and the replacement of the U.S. Census long form (CLF) with the American Community Survey (ACS) has made such updates difficult. Two commonly used alternative migration data sources, the Current Population Survey (CPS) and the Statistics of Income (SOI) program of the Internal Revenue Service (IRS), suffer serious limitations in studying the migration of any subpopulation, including the elderly. Our study informs migration research in the post-2000 era by identifying methodological differences between data sources and devising strategies for reconciling the CLF and ACS. Our investigation focusing on the elderly suggests that the ACS can generate comparable migration data that reveal a continuation of previously identified geographic patterns as well as changes unique to the 2000s. However, its changed definition of residence and survey timing leave us unable to construct a comparable national migration rate, suggesting that one must use national trends in the smaller CPS to investigate whether elderly migration has increased or decreased in the twenty-first century.

  10. A Conservation Ethic and the Collecting of Animals by Institutions of Natural Heritage in the Twenty-First Century: Case Study of the Australian Museum

    Directory of Open Access Journals (Sweden)

    Timothy Ikin

    2011-02-01

    Full Text Available Collecting of animals from their habitats for preservation by museums and related bodies is a core operation of such institutions. Conservation of biodiversity in the current era is a priority in the scientific agendas of museums of natural heritage in Australia and the world. Intuitively, to take animals from the wild, while engaged in scientific or other practices that are supposed to promote their ongoing survival, may appear to be incompatible. The Australian Museum presents an interesting ground to consider zoological collecting by museums in the twenty-first century. Anderson and Reeves in 1994 argued that a milieu existed that undervalued native species, and that the role of natural history museums, up to as late as the mid-twentieth century, was only to make a record of the faunal diversity of Australia, which would inevitably become extinct. Despite the latter, conservation of Australia’s faunal diversity is a key aspect of research programmes in Australia’s institutions of natural heritage in the current era. This paper analyses collecting of animals, a core task for institutions of natural heritage, and how this interacts with a professed “conservation ethic” in a twenty-first century Australian setting.

  11. Wicked Female Characters in Roddy Doyle’s “The Pram”: Revisiting Celtic and Polish Myths in the Context of Twenty-First Century Ireland

    Directory of Open Access Journals (Sweden)

    Burcu Gülüm Tekin

    2015-07-01

    Full Text Available “The Pram” is the only horror story in Roddy Doyle’s collection The Deportees and Other Stories (2007). It is also unique in terms of its approach to Ireland’s multicultural scene in the twenty-first century. Doyle turns the other side of the coin and introduces a migrant caretaker (Alina), who loses her mind due to her employers’ (the O’Reilly family) ill-treatment. As a reaction to their scornful attitude, Alina becomes a murderer. Set in the context of twenty-first century Dublin, “The Pram” contains various references to Celtic and Polish mythological female figures (in particular, the Old Hag of Beara and Boginka), which strengthen the thrilling, mythical elements in the plot. This paper aims to examine the characters’ negative attitude towards migrants in Ireland in the light of the racist discourse present in the story. Also, I will focus on the story’s female characters and discuss the handicaps of being a female migrant in Ireland. The parallels between the mythical female figures and the protagonist Alina will be another point to be analyzed. The argument of this paper is that Doyle does not always portray the positive outcomes of a multicultural society. On the contrary, he conveys the perspective of the incoming migrant. “The Pram” stages the obstacles that a female outsider may experience in Ireland and her subsequent transformation as a result of the racism she encounters there.

  12. Extremophiles Survival to Simulated Space Conditions: An Astrobiology Model Study

    Science.gov (United States)

    Mastascusa, V.; Romano, I.; Di Donato, P.; Poli, A.; Della Corte, V.; Rotundi, A.; Bussoletti, E.; Quarto, M.; Pugliese, M.; Nicolaus, B.

    2014-09-01

    In this work we investigated the ability of four extremophilic bacteria from the Archaea and Bacteria domains to resist the space environment by exposing them to extreme conditions of temperature, UV radiation, and desiccation coupled to low pressure generated in a Mars conditions simulator. All the investigated extremophilic strains (namely Sulfolobus solfataricus, Haloterrigena hispanica, Thermotoga neapolitana and Geobacillus thermantarcticus) showed good resistance to the simulated temperature variations of space; on the other hand, irradiation with UV at 254 nm only slightly affected the growth of H. hispanica, G. thermantarcticus and S. solfataricus; finally, exposure to simulated Mars conditions showed that H. hispanica and G. thermantarcticus were resistant to desiccation and low pressure.

  13. Extremophiles survival to simulated space conditions: an astrobiology model study.

    Science.gov (United States)

    Mastascusa, V; Romano, I; Di Donato, P; Poli, A; Della Corte, V; Rotundi, A; Bussoletti, E; Quarto, M; Pugliese, M; Nicolaus, B

    2014-09-01

    In this work we investigated the ability of four extremophilic bacteria from the Archaea and Bacteria domains to resist the space environment by exposing them to extreme conditions of temperature, UV radiation, and desiccation coupled to low pressure generated in a Mars conditions simulator. All the investigated extremophilic strains (namely Sulfolobus solfataricus, Haloterrigena hispanica, Thermotoga neapolitana and Geobacillus thermantarcticus) showed good resistance to the simulated temperature variations of space; on the other hand, irradiation with UV at 254 nm only slightly affected the growth of H. hispanica, G. thermantarcticus and S. solfataricus; finally, exposure to simulated Mars conditions showed that H. hispanica and G. thermantarcticus were resistant to desiccation and low pressure.

  14. A Simulation Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    NASA's human space program has developed considerable experience with near Earth space operations. Although NASA has experience with deep space robotic missions, NASA has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will involve missions with durations measured in months and communications latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these types of mission durations and communication latencies on vehicle design, mission design and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted at the 21st NASA Extreme Environment Mission Operations (NEEMO) mission between July 18 and August 3, 2016. 
The NEEMO

  15. Twenty-first century Irvings

    Energy Technology Data Exchange (ETDEWEB)

    Sawler, H.

    2007-07-01

    The Irving family is the most powerful family in Atlantic Canada and one of the richest families in the world. The family is valued at $5.9 billion. This book discussed the family's growth and gradual domination over the forestry, energy, and transportation industries in Atlantic Canada. The book examined how the family has managed to remain dominant and powerful, and examined the ability of future Irving generations to maintain that power as the family expands and disperses. Details of the Irvings' particular style of entrepreneurship and their use of vertical integration were presented, and their relationships with government agencies were also discussed. The book examined the business practices and methods of different generations of Irvings, from the origins of the family business. The book demonstrated how the family has remained a progressive economic force for more than 150 years. An Irving family business history was also provided. Family attitudes towards the environment, philanthropy, and the media were also discussed. refs.

  16. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility for testing system parts and subsystems at JPL. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.
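A minimal example of the kind of quantitative magnetic assessment described above is a point-dipole estimate of the DC field a magnetized part contributes at the science magnetometer. This is the generic textbook dipole formula, not Aerospace's modeling tool; the remnant moment and boom length are hypothetical:

```python
import math

# Hedged sketch (not JPL's or Aerospace's software): on-axis point-dipole
# estimate of the field a magnetized part produces at the magnetometer,
# the kind of first-order number a magnetic-cleanliness assessment starts
# from. On-axis dipole field: B = mu0 * m / (2 * pi * r**3).

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def dipole_field_on_axis(moment_am2: float, r_m: float) -> float:
    """On-axis field (tesla) of a magnetic dipole of moment m at distance r."""
    return MU0 * moment_am2 / (2.0 * math.pi * r_m ** 3)

# Hypothetical part with 0.01 A*m^2 remnant moment, magnetometer on a 10 m boom:
b_nt = dipole_field_on_axis(0.01, 10.0) * 1e9  # convert T -> nT
print(f"{b_nt:.3g} nT at the sensor")
```

The 1/r^3 falloff is why boom length and per-part moment budgets are the levers of a cleanliness program: halving the distance to a part multiplies its contribution at the sensor by eight.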

  17. A Simulation Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Moore, Michael; Bielski, Paul; Crues, Edwin Z.

    2017-01-01

    This study was the first in a series of planned tests to use physics-based subsystem simulations to investigate the interactions between a spacecraft's crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation models the life support system of a deep space habitat. It contains models of an environmental control and life support system, an electrical power system, an active thermal control system, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; and 3) a real-time simulation test conductor interface used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted at the 21st NASA Extreme Environment Mission Operations (NEEMO) mission. The NEEMO crew and ground support team performed a number of relevant deep space mission scenarios that included both nominal activities and activities with system malfunctions. While this initial test sequence was focused on test infrastructure and procedures development, the data collected in the study already indicate that long communication delays have notable impacts on the operation of deep space systems. For future human missions beyond cis-lunar space, NASA will need to design systems and support tools to meet these challenges: to train the crew to handle critical malfunctions on their own, to predict malfunctions, and to assist with vehicle operations. Subsequent, more detailed and involved studies will be conducted to continue advancing NASA's understanding of space systems operations across long communication delays.
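The core difficulty these studies probe, transport delay between vehicle and ground, can be sketched with a minimal delayed-message queue. This is an assumed design for illustration, not NASA's simulation software:

```python
import heapq

# Minimal sketch (assumed design, not NASA's software) of the one ingredient
# that makes high-latency operations hard: every message between the vehicle
# and mission control is held for the one-way transport delay before delivery.

class DelayedLink:
    """Queue that releases messages only after a fixed transport delay."""
    def __init__(self, delay_s: float):
        self.delay_s = delay_s
        self._q = []  # min-heap of (deliver_time_s, message)

    def send(self, now_s: float, message: str) -> None:
        heapq.heappush(self._q, (now_s + self.delay_s, message))

    def receive(self, now_s: float) -> list:
        """Return all messages whose delivery time has arrived."""
        out = []
        while self._q and self._q[0][0] <= now_s:
            out.append(heapq.heappop(self._q)[1])
        return out

# 15-minute one-way delay, as in the study's mission control interface:
link = DelayedLink(delay_s=15 * 60)
link.send(0.0, "crew: CO2 scrubber fault")
assert link.receive(600.0) == []                             # 10 min: nothing yet
assert link.receive(900.0) == ["crew: CO2 scrubber fault"]   # 15 min: delivered
```

Even this toy makes the operational point concrete: a fault report plus a ground response costs at least a full round trip (30 minutes here), which is why the abstract concludes the crew must be equipped to handle critical malfunctions on their own.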

  18. Simulated Space Environmental Effects on Thin Film Solar Array Components

    Science.gov (United States)

    Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon

    2017-01-01

    The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests was conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design," or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials was exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials was exposed to 1 MeV electrons. A third set of samples was exposed to 50, 100, 500, and 700 keV protons, and a fourth set was exposed to >2,000 hours of near ultraviolet (NUV) radiation. A final set was rapidly thermally cycled between -55 and +125 °C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per watt is discussed.

  19. Space: The Fourth Military Dimension. The Strategic Defense Initiative and the Implications for Land Warfare in the Twenty-First Century

    Science.gov (United States)

    1986-10-01

    ... and as an incentive to the Soviet Union to meet us in serious arms control negotiations. We will begin that deployment ... worldwide, operates and maintains communications-electronics systems for space surveillance and missile warning and selected data-processing equipment for the ...

  20. Psychosocial value of space simulation for extended spaceflight

    Science.gov (United States)

    Kanas, N.

    1997-01-01

    There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  1. Using a global climate model to evaluate the influences of water vapor, snow cover and atmospheric aerosol on warming in the Tibetan Plateau during the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Rangwala, Imtiaz [Rutgers University, Department of Environmental Sciences, New Brunswick, NJ (United States); Miller, James R. [Rutgers University, Institute of Marine and Coastal Sciences, New Brunswick (United States); Russell, Gary L. [NASA Goddard Institute for Space Studies, New York (United States); Xu, Ming [Chinese Academy of Sciences, Institute of Geographic Sciences and Natural Resources Research, Beijing (China); Rutgers University, Department of Ecology, Evolution and Natural Resources, New Brunswick (United States)

    2010-05-15

    We examine trends in climate variables and their interrelationships over the Tibetan Plateau using global climate model simulations to elucidate the mechanisms for the pattern of warming observed over the plateau during the latter half of the twentieth century and to investigate the warming trend during the twenty-first century under the SRES A1B scenario. Our analysis suggests a 4 °C warming over the plateau between 1950 and 2100. The largest warming rates occur during winter and spring. For the 1961-2000 period, the simulated warming is similar to the observed trend over the plateau. Moreover, the largest warming occurs at the highest elevation sites between 1950 and 2100. We find that increases in (1) downward longwave radiation (DLR) influenced by increases in surface specific humidity (q), and (2) absorbed solar radiation (ASR) influenced by decreases in snow cover extent are, in part, the reason for a large warming trend over the plateau, particularly during winter and spring. Furthermore, elevation-based increases in DLR (influenced by q) and ASR (influenced by snow cover and atmospheric aerosols) appear to affect the elevation dependent warming trend simulated in the model. (orig.)

  2. Digital Simulation of Space Vector Modulation Based Induction Motor Drive

    Directory of Open Access Journals (Sweden)

    G.V. Siva Krishna Rao and T.S. Surendra

    2011-04-01

    Full Text Available This study deals with the simulation of a space vector modulated inverter fed induction motor drive. The drive system is modeled using MATLAB/Simulink and the results are presented. This drive has advantages such as reduced harmonics and heating. Fixed AC is converted into DC, and this DC is converted into variable-voltage, variable-frequency AC using the SVM inverter. The output of the SVM inverter is applied to the stator of the induction motor. The simulation results are compared with analytical results. The FFT analysis shows that the current spectrum has reduced harmonics compared to the conventional system.
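
As a rough illustration of the space vector modulation the abstract refers to, the sketch below computes, from textbook SVM formulas (not taken from the paper itself), the sector and the dwell times of the two adjacent active vectors for a reference voltage given in the alpha-beta plane:

```python
import math

def svm_dwell_times(v_alpha, v_beta, v_dc, t_s):
    """Return (sector, t1, t2, t0) for one switching period t_s.

    t1 and t2 are the dwell times of the two active vectors adjacent to
    the reference vector; t0 is the remaining zero-vector time.
    """
    v_ref = math.hypot(v_alpha, v_beta)
    theta = math.atan2(v_beta, v_alpha) % (2.0 * math.pi)
    sector = int(theta // (math.pi / 3.0)) + 1            # sectors 1..6
    theta_in = theta - (sector - 1) * (math.pi / 3.0)     # angle within sector
    k = math.sqrt(3.0) * t_s * v_ref / v_dc
    t1 = k * math.sin(math.pi / 3.0 - theta_in)           # adjacent vector 1
    t2 = k * math.sin(theta_in)                           # adjacent vector 2
    return sector, t1, t2, t_s - t1 - t2
```

For a reference vector aligned with the first active vector, t2 vanishes and the whole active time goes to t1, which is the expected limiting case.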

  3. The Evil of Wealth or the Evil of Government? A Review of Capital in the Twenty-First Century

    Institute of Scientific and Technical Information of China (English)

    唐娜

    2014-01-01

    When the French edition of his book appeared in early 2013, the noted French economist Thomas Piketty, then a 42-year-old professor at the Paris School of Economics, could hardly have imagined that only a year later the English edition of his new work, Capital in the Twenty-First Century, would cause an uproar in academic circles as soon as it was released in the United States, sparking debate and reflection across society about the distribution of wealth and social inequality under capitalism.

  4. Twenty-First Water Reactor Safety Information Meeting. Volume 3, Primary system integrity; Aging research, products and applications; Structural and seismic engineering; Seismology and geology: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1994-04-01

    This three-volume report contains 90 of the 102 papers that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database.

  5. Neutral Buoyancy Simulator - NB32 - Large Space Structure

    Science.gov (United States)

    1980-01-01

    The Hubble Space Telescope (HST) is a cooperative program of the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA) to operate a long-lived space-based observatory; it was the flagship mission of NASA's Great Observatories program. The HST program began as an astronomical dream in the 1940s. During the 1970s and 1980s, HST was designed and built, and it became operational in the 1990s. HST was deployed into low-Earth orbit on April 25, 1990 from the cargo bay of the Space Shuttle Discovery (STS-31). The design of the HST took into consideration its length of service and the necessity of repairs and equipment replacement by making the body modular. In doing so, subsequent shuttle missions could recover the HST, replace faulty or obsolete parts, and re-release it. MSFC's Neutral Buoyancy Simulator served as the training facility for shuttle astronauts for Hubble-related missions. Shown is astronaut Shannon Lucid having her life support system checked prior to entering the NBS to begin training on the space telescope axial scientific instrument changeout.

  7. Projected impact of climate change in the hydroclimatology of Senegal with a focus over the Lake of Guiers for the twenty-first century

    Science.gov (United States)

    Tall, Moustapha; Sylla, Mouhamadou Bamba; Diallo, Ismaïla; Pal, Jeremy S.; Faye, Aïssatou; Mbaye, Mamadou Lamine; Gaye, Amadou Thierno

    2016-04-01

    This study analyzes the impact of anthropogenic climate change on the hydroclimatology of Senegal, with a focus on the Lake of Guiers basin, for the middle (2041-2060) and late twenty-first century (2080-2099). To this end, a high-resolution multimodel ensemble based on regional climate model experiments considering two Representative Concentration Pathways (RCP4.5 and RCP8.5) is used. The results indicate that an elevated warming, leading to a substantial increase of atmospheric water demand, is projected over the whole of Senegal. In the Lake basin, these increases in potential evapotranspiration (PE) range between 10 and 25 % in the near future for RCP4.5, while for the far future and RCP8.5 they exceed 50 %. In addition, mean precipitation unveils contrasting changes, with wetter conditions (10 to 25 % more) by the middle of the century and drier conditions (more than 50 %) during the late twenty-first century. Such changes cause more/less evapotranspiration and soil moisture respectively during the two future periods. Furthermore, surface runoff shows a tendency to increase in most areas, with substantial reductions in a few locations, including the Lake basin. Finally, it is found that while semi-arid climates develop in the RCP4.5 scenario, generalized arid conditions prevail over the whole of Senegal for RCP8.5. It is thus evident that these future climate conditions substantially threaten freshwater availability for the country and irrigated cropping over the Lake basin. Therefore, strong governmental policies are needed to help design response options to cope with the challenges posed by the projected climate change for the country.

  8. Impacts and responses to sea-level rise: a global analysis of the SRES scenarios over the twenty-first century.

    Science.gov (United States)

    Nicholls, Robert J; Tol, Richard S J

    2006-04-15

    Taking the Special Report on Emission Scenarios (SRES) climate and socio-economic scenarios (A1FI, A2, B1 and B2 'future worlds'), the potential impacts of sea-level rise through the twenty-first century are explored using complementary impact and economic analysis methods at the global scale. These methods have never been explored together previously. In all scenarios, the exposure and hence the impact potential due to increased flooding by sea-level rise increases significantly compared to the base year (1990). While mitigation reduces impacts, due to the lagged response of sea-level rise to atmospheric temperature rise, impacts cannot be avoided during the twenty-first century by this response alone. Cost-benefit analyses suggest that widespread protection will be an economically rational response to land loss due to sea-level rise in the four SRES futures that are considered. The most vulnerable future worlds to sea-level rise appear to be the A2 and B2 scenarios, which primarily reflects differences in the socio-economic situation (coastal population, Gross Domestic Product (GDP) and GDP/capita), rather than the magnitude of sea-level rise. Small islands and deltaic settings stand out as being more vulnerable as shown in many earlier analyses. Collectively, these results suggest that human societies will have more choice in how they respond to sea-level rise than is often assumed. However, this conclusion needs to be tempered by recognition that we still do not understand these choices and significant impacts remain possible. Future worlds which experience larger rises in sea-level than considered here (above 35 cm), more extreme events, a reactive rather than proactive approach to adaptation, and where GDP growth is slower or more unequal than in the SRES futures remain a concern. There is considerable scope for further research to better understand these diverse issues.

  9. Projected impact of climate change in the hydroclimatology of Senegal with a focus over the Lake of Guiers for the twenty-first century

    Science.gov (United States)

    Tall, Moustapha; Sylla, Mouhamadou Bamba; Diallo, Ismaïla; Pal, Jeremy S.; Faye, Aïssatou; Mbaye, Mamadou Lamine; Gaye, Amadou Thierno

    2017-07-01

    This study analyzes the impact of anthropogenic climate change on the hydroclimatology of Senegal, with a focus on the Lake of Guiers basin, for the middle (2041-2060) and late twenty-first century (2080-2099). To this end, a high-resolution multimodel ensemble based on regional climate model experiments considering two Representative Concentration Pathways (RCP4.5 and RCP8.5) is used. The results indicate that an elevated warming, leading to a substantial increase of atmospheric water demand, is projected over the whole of Senegal. In the Lake basin, these increases in potential evapotranspiration (PE) range between 10 and 25 % in the near future for RCP4.5, while for the far future and RCP8.5 they exceed 50 %. In addition, mean precipitation unveils contrasting changes, with wetter conditions (10 to 25 % more) by the middle of the century and drier conditions (more than 50 %) during the late twenty-first century. Such changes cause more/less evapotranspiration and soil moisture respectively during the two future periods. Furthermore, surface runoff shows a tendency to increase in most areas, with substantial reductions in a few locations, including the Lake basin. Finally, it is found that while semi-arid climates develop in the RCP4.5 scenario, generalized arid conditions prevail over the whole of Senegal for RCP8.5. It is thus evident that these future climate conditions substantially threaten freshwater availability for the country and irrigated cropping over the Lake basin. Therefore, strong governmental policies are needed to help design response options to cope with the challenges posed by the projected climate change for the country.

  10. Simulated Space Environment Effects on a Candidate Solar Sail Material

    Science.gov (United States)

    Kang, Jin Ho; Bryant, Robert G.; Wilkie, W. Keats; Wadsworth, Heather M.; Craven, Paul D.; Nehls, Mary K.; Vaughn, Jason A.

    2017-01-01

    For long duration missions of solar sails, the sail material needs to survive harsh space environments, and the degradation of the sail material controls the operational lifetime. Therefore, understanding the effects of the space environment on the sail membrane is essential for mission success. In this study, we investigated the effects of simulated space environment exposures (ionizing radiation, thermal aging, and simulated potential damage) on the mechanical, thermal and optical properties of a commercial off the shelf (COTS) polyester solar sail membrane to assess the degradation mechanisms of a feasible solar sail. The solar sail membrane was exposed to high energy electrons (about 70 keV and 10 nA/cm2), and the physical properties were characterized. After a dose of about 8.3 Grad, the tensile modulus, tensile strength and failure strain of the sail membrane decreased by about 20-95%. The aluminum reflective layer was damaged and partially delaminated, but it did not show any significant change in solar absorptance or thermal emittance. The effect on mechanical properties of a pre-cracked sample, simulating potential impact damage of the sail membrane, as well as thermal aging effects on metallized PEN (polyethylene naphthalate) film, will be discussed.

  11. High Performance Parallel Methods for Space Weather Simulations

    Science.gov (United States)

    Hunter, Paul (Technical Monitor); Gombosi, Tamas I.

    2003-01-01

    This is the final report of our NASA AISRP grant entitled 'High Performance Parallel Methods for Space Weather Simulations'. The main thrust of the proposal was to achieve significant progress towards new high-performance methods which would greatly accelerate global MHD simulations and eventually make it possible to develop first-principles based space weather simulations which run much faster than real time. We are pleased to report that with the help of this award we made major progress in this direction and developed the first parallel implicit global MHD code with adaptive mesh refinement. The main limitation of all earlier global space physics MHD codes was the explicit time stepping algorithm. Explicit time steps are limited by the Courant-Friedrichs-Lewy (CFL) condition, which essentially ensures that no information travels more than a cell size during a time step. This condition represents a non-linear penalty for highly resolved calculations, since finer grid resolution (and consequently smaller computational cells) not only results in more computational cells, but also in smaller time steps.

  12. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques for conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As part of this study, LEA results were verified by: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; and (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
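
The root-sum-square combination described above can be illustrated with a toy example. The deviation values and error-source labels below are hypothetical, not actual SVDS cases; the point is the RSS and outer-product covariance arithmetic:

```python
import math

# Hypothetical 3-sigma deviations at MECO from three single-error-source
# runs (columns: position [m], velocity [m/s]); labels are illustrative.
deviations = [
    (1200.0, 3.0),   # e.g. thrust dispersion
    ( 500.0, 1.5),   # e.g. specific-impulse dispersion
    ( 300.0, 0.8),   # e.g. wind dispersion
]

# Root-sum-square combination of the individual 3-sigma contributions
rss = [math.sqrt(sum(d[j] ** 2 for d in deviations))
       for j in range(len(deviations[0]))]

# Covariance matrix of the combined deviations, assuming independent
# error sources: C = sum over sources of the outer product d d^T
n = len(deviations[0])
cov = [[sum(d[i] * d[j] for d in deviations) for j in range(n)]
       for i in range(n)]
```

The diagonal of the covariance matrix reproduces the squared RSS values, which is exactly the hand-check (A) in the abstract.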

  13. 25th Space Simulation Conference. Environmental Testing: The Earth-Space Connection

    Science.gov (United States)

    Packard, Edward

    2008-01-01

    Topics covered include: Methods of Helium Injection and Removal for Heat Transfer Augmentation; The ESA Large Space Simulator Mechanical Ground Support Equipment for Spacecraft Testing; Temperature Stability and Control Requirements for Thermal Vacuum/Thermal Balance Testing of the Aquarius Radiometer; The Liquid Nitrogen System for Chamber A: A Change from Original Forced Flow Design to a Natural Flow (Thermo Siphon) System; Return to Mercury: A Comparison of Solar Simulation and Flight Data for the MESSENGER Spacecraft; Floating Pressure Conversion and Equipment Upgrades of Two 3.5kw, 20k, Helium Refrigerators; Affect of Air Leakage into a Thermal-Vacuum Chamber on Helium Refrigeration Heat Load; Special ISO Class 6 Cleanroom for the Lunar Reconnaissance Orbiter (LRO) Project; A State-of-the-Art Contamination Effects Research and Test Facility Martian Dust Simulator; Cleanroom Design Practices and Their Influence on Particle Counts; Extra Terrestrial Environmental Chamber Design; Contamination Sources Effects Analysis (CSEA) - A Tool to Balance Cost/Schedule While Managing Facility Availability; SES and Acoustics at GSFC; HST Super Lightweight Interchangeable Carrier (SLIC) Static Test; Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance; Estimating Shock Spectra: Extensions beyond GEVS; Structural Dynamic Analysis of a Spacecraft Multi-DOF Shaker Table; Direct Field Acoustic Testing; Manufacture of Cryoshroud Surfaces for Space Simulation Chambers; The New LOTIS Test Facility; Thermal Vacuum Control Systems Options for Test Facilities; Extremely High Vacuum Chamber for Low Outgassing Processing at NASA Goddard; Precision Cleaning - Path to Premier; The New Anechoic Shielded Chambers Designed for Space and Commercial Applications at LIT; Extraction of Thermal Performance Values from Samples in the Lunar Dust Adhesion Bell Jar; Thermal (Silicon Diode) Data Acquisition System; Aquarius's Instrument Science Data System (ISDS) Automated

  14. Advanced Unsteady Turbulent Combustion Simulation Capability for Space Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a high performance, high fidelity simulation capability to enable accurate, fast and robust simulation of unsteady turbulent,...

  15. Compositional Space Parameterization Approach for Reservoir Flow Simulation

    Science.gov (United States)

    Voskov, D.

    2011-12-01

    Phase equilibrium calculations are the most challenging part of a compositional flow simulation. For every gridblock and at every time step, the number of phases and their compositions must be computed for the given overall composition, temperature, and pressure conditions. The conventional approach used in the petroleum industry is based on performing a phase-stability test and solving the fugacity constraints together with the coupled nonlinear flow equations when the gridblock has more than one phase. The multi-phase compositional space can be parameterized in terms of tie-simplexes. For example, a tie-triangle can be used such that its interior encloses the three-phase region, and its edges represent the boundaries with specific two-phase regions. The tie-simplex parameterization can be performed for pressure, temperature, and overall composition. The challenge is that all of these parameters can change considerably during the course of a simulation. It is possible to prove that the tie-simplexes change continuously with respect to pressure, temperature, and overall composition. The continuity of the tie-simplex parameterization allows for interpolation using discrete representations of the tie-simplex space. For variations of composition, a projection to the nearest tie-simplex is used, and if the tie-simplex is within a predefined tolerance, it can be used directly to identify the phase state of this composition. In general, our parameterization approach can be seen as a generalization of the negative flash idea for systems with two or more phases. The theory of dispersion-free compositional displacements, as well as computational experience with general-purpose compositional flow simulation, indicates that the displacement path in compositional space is determined by a limited number of tie-simplexes. Therefore, only a few tie-simplex tables are required to parameterize the entire displacement. The small number of tie-simplexes needed in the course of a simulation motivates
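
The "negative flash" idea the abstract generalizes can be shown for the two-phase case: solve the Rachford-Rice equation for the vapor fraction without clipping it to [0, 1], so the tie-line is extended outside the physical two-phase region. This is a minimal illustrative sketch (bisection between the equation's asymptotes), not the authors' tie-simplex code:

```python
def negative_flash(z, K, iters=60):
    """Two-phase negative flash: solve the Rachford-Rice equation for the
    vapor fraction V without restricting V to [0, 1]."""
    def rr(V):
        return sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                   for zi, Ki in zip(z, K))
    # rr(V) decreases monotonically between its two asymptotes
    lo = 1.0 / (1.0 - max(K)) + 1e-9
    hi = 1.0 / (1.0 - min(K)) - 1e-9
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if rr(mid) > 0.0:
            lo = mid            # root lies to the right
        else:
            hi = mid
    V = 0.5 * (lo + hi)
    x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]   # liquid
    y = [Ki * xi for Ki, xi in zip(K, x)]                       # vapor
    return V, x, y
```

For compositions inside the two-phase region V falls in [0, 1]; outside it, V leaves that interval while x and y still parameterize the extended tie-line, which is what makes the projection step in the abstract well defined.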

  16. Realistic simulation of the Space-borne Compton Polarimeter POLAR

    Science.gov (United States)

    Xiao, Hualin

    2016-07-01

    POLAR is a compact wide-field space-borne detector dedicated to precise measurements of the linear polarization of hard X-rays emitted by transient sources. Its energy range sensitivity is optimized for the detection of the prompt emission of gamma-ray bursts (GRBs). POLAR is developed by an international collaboration of China, Switzerland and Poland. It is planned to be launched into space in 2016 onboard the Chinese space laboratory TG2. The energy range of POLAR spans between 50 keV and 500 keV. POLAR detects gamma rays with an array of 1600 plastic scintillator bars read out by 25 multi-anode PMTs (MAPMTs). Polarization measurements use the Compton scattering process and are based on the detection of energy depositions in the scintillator bars. Reconstruction of the polarization degree and polarization angle of GRBs requires comparison of experimental modulation curves with realistic simulations of the full instrument response. In this paper we present a method to model and parameterize the detector response, including the efficiency of the light collection, contributions from crosstalk and non-uniformity of the MAPMTs, as well as the dependency on low energy detection thresholds and noise from the readout electronics. The performance of POLAR for the determination of polarization is predicted with such realistic simulations and carefully cross-checked with dedicated laboratory tests.
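
Extracting polarization parameters from a modulation curve of the form N(phi) ~ A(1 + mu cos 2(phi - phi0)) can be sketched with a first-harmonic Fourier estimate. This is our illustration of the underlying idea only, not POLAR's actual reconstruction chain:

```python
import math

def modulation_params(counts, phi_centers):
    """Estimate modulation amplitude mu and phase phi0 from an azimuthal
    scattering-angle histogram, assuming
    N(phi) ~ A * (1 + mu * cos(2 * (phi - phi0)))."""
    n = float(sum(counts))
    c = sum(w * math.cos(2.0 * p) for w, p in zip(counts, phi_centers)) / n
    s = sum(w * math.sin(2.0 * p) for w, p in zip(counts, phi_centers)) / n
    mu = 2.0 * math.hypot(c, s)        # factor 2 since <cos^2> = 1/2
    phi0 = 0.5 * math.atan2(s, c)
    return mu, phi0
```

In practice the measured curve must first be corrected by the simulated instrument response (non-uniformity, crosstalk, thresholds), which is exactly why the realistic simulations described above are needed.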

  17. Time-dependent radiation dose simulations during interplanetary space flights

    Science.gov (United States)

    Dobynde, Mikhail; Shprits, Yuri; Drozdov, Alexander; Hoffman, Jeffrey; Li, Ju

    2016-07-01

    Space radiation is one of the main concerns in planning long-term interplanetary human space missions. There are two main types of hazardous radiation: Solar Energetic Particles (SEP) and Galactic Cosmic Rays (GCR). Their intensities and evolution depend on solar activity. GCR intensity is most enhanced during solar minimum, while the most intense SEP events usually occur during solar maximum. SEPs are better attenuated by thick shields, while the GCR dose is lower behind thin shields. The time and thickness dependences of these two components encourage looking for a flight window in which the combined SEP and GCR radiation dose would be minimized. In this study we combine state-of-the-art space environment models with GEANT4 simulations to determine the optimal shielding, spacecraft geometry, and launch time with respect to the phase of the solar cycle. The radiation environment was described by a time-dependent GCR model and the SEP spectra measured during the period from 1990 to 2010. We included gamma rays, electrons, neutrons and 27 fully ionized elements from hydrogen to nickel. We calculated astronaut radiation doses during interplanetary flights using a Monte Carlo code that accounts for both primary and secondary radiation. We also performed sensitivity simulations for the assumed spacecraft size and thickness to find an optimal shielding. In conclusion, we present the dependence of the radiation dose on launch date from 1990 to 2010, for flight durations of up to 3 years.
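
The launch-window idea above can be caricatured with a toy model in which the GCR dose rate is anti-correlated with solar activity and the SEP contribution peaks near solar maximum. Every coefficient below is illustrative and invented for the sketch; none are results from the study:

```python
import math

def mission_dose(launch_phase_yr, duration_yr):
    """Toy total-dose model (arbitrary units) versus launch phase within an
    11-year solar cycle (phase 0 = solar minimum). All coefficients are
    assumptions for illustration only."""
    gcr_peak, sep_peak = 0.6, 0.3              # assumed annual dose rates
    dose, steps = 0.0, int(duration_yr * 12)
    for k in range(steps):                      # integrate month by month
        phase = (launch_phase_yr + k / 12.0) % 11.0
        activity = 0.5 * (1.0 - math.cos(2.0 * math.pi * phase / 11.0))
        gcr = gcr_peak * (1.0 - 0.4 * activity)  # solar modulation of GCR
        sep = sep_peak * activity
        dose += (gcr + sep) / 12.0
    return dose

# Scan candidate launch phases for a minimum-dose flight window
best_phase = min(range(11), key=lambda p: mission_dose(p, 3))
```

The study does the same scan with real GCR models, measured SEP spectra, and GEANT4 transport instead of these toy rates.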

  18. Simulated Space Vacuum Ultraviolet (VUV) Exposure Testing for Polymer Films

    Science.gov (United States)

    Dever, Joyce A.; Pietromica, Anthony J.; Stueber, Thomas J.; Sechkar, Edward A.; Messer, Russell K.

    2002-01-01

    Vacuum ultraviolet (VUV) radiation of wavelengths between 115 and 200 nm produced by the sun in the space environment can cause degradation to polymer films producing changes in optical, mechanical, and chemical properties. These effects are particularly important for thin polymer films being considered for ultra-lightweight space structures, because, for most polymers, VUV radiation is absorbed in a thin surface layer. NASA Glenn Research Center has developed facilities and methods for long-term ground testing of polymer films to evaluate space environmental VUV radiation effects. VUV exposure can also be used as part of sequential simulated space environmental exposures to determine combined damaging effects. This paper will describe the effects of VUV on polymer films and the necessity for ground testing. Testing practices used at Glenn Research Center for VUV exposure testing will be described including characterization of the VUV radiation source used, calibration procedures traceable to the National Institute of Standards and Technology (NIST), and testing techniques for VUV exposure of polymer surfaces.

  19. Simulation and experiment for large scale space structure

    Science.gov (United States)

    Sun, Hongbo; Zhou, Jian; Zha, Zuoliang

    2013-04-01

    Future space structures are relatively large, flimsy, and lightweight. As a result, they are more easily affected or distorted by the space environment than other space structures. This study examines the structural integrity of a large scale space structure. A new transient temperature field analysis method for a deployable reflector in the on-orbit environment is presented, which simulates the physical characteristics of a deployable antenna reflector with high precision. The different analyses show the different thermoelastic characteristics of different materials. The three-dimensional multi-physics coupled transient thermal distortion equations for the antenna are founded on the Galerkin method. For a reflector in geosynchronous orbit, the transient temperature field results from this method are compared with those from NASA. The analysis shows that the precision of this method is high. An experimental system is established to verify the control mechanism with IEBIS and thermal sensor techniques. Shape control experiments are performed by measuring and analyzing a deployable tube. Results reveal that the temperature of the deployable antenna reflector alternates greatly over the orbital period, by about ±120°, when solar flux, Earth-radiated flux and albedo flux are considered.

  20. Primary loop simulation of the SP-100 space nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Eduardo M.; Braz Filho, Francisco A.; Guimaraes, Lamartine N.F., E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/DCTA) Sao Jose dos Campos, SP (Brazil)

    2011-07-01

    Between 1983 and 1992, the SP-100 space nuclear reactor development project for electric power generation in the range of 100 to 1000 kWe was conducted in the USA. Several configurations were studied to satisfy different mission objectives and power systems. In this reactor, heat is generated in a compact core and removed by liquid lithium; the primary loop flow is controlled by thermoelectric electromagnetic (EMTE) pumps, and thermoelectric converters produce direct-current energy. To define the system operating point at nominal power, simulation of the thermal-hydraulic components of the space nuclear reactor is necessary. In this paper the BEMTE-3 computer code is used to evaluate the EMTE pump design performance for a thermal-hydraulic primary loop configuration, and to compare the system operating points of the SP-100 reactor at two thermal power levels, with satisfactory results. (author)

  1. Simulation Modeling and Performance Evaluation of Space Networks

    Science.gov (United States)

    Jennings, Esther H.; Segui, John

    2006-01-01

In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive extendable tool, the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000, and the tool is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate the orbital geometry information and contact opportunities. Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments. Stressed networking environments include those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, the ability to cope with intermittent connectivity, the ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance and demonstrate the use of the DTN protocol.
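The store-and-forward behavior described above can be illustrated with a toy contact-plan simulation. This is a sketch of the DTN concept only, not of MACHETE, QualNet, or the actual Bundle Protocol; the node names and contact windows are invented:

```python
class Node:
    """A DTN node: bundles are stored in a buffer until a contact opens."""
    def __init__(self, name):
        self.name = name
        self.buffer = []

def simulate(contact_plan, bundles, hops, horizon=100):
    """Forward bundles along `hops` whenever the contact plan allows.

    contact_plan: dict mapping (src, dst) -> set of time steps the link is up
    bundles:      bundle ids injected at the first hop at t = 0
    hops:         ordered relay chain, e.g. ["lander", "orbiter", "earth"]
    Returns {bundle id: delivery time at the final hop}.
    """
    nodes = {h: Node(h) for h in hops}
    nodes[hops[0]].buffer.extend(bundles)
    delivered = {}
    for t in range(horizon):
        for src, dst in zip(hops, hops[1:]):
            if t in contact_plan.get((src, dst), set()):
                while nodes[src].buffer:               # flush stored bundles
                    b = nodes[src].buffer.pop(0)
                    nodes[dst].buffer.append(b)
                    if dst == hops[-1]:
                        delivered.setdefault(b, t)
    return delivered

# Lander-orbiter contact at t = 5..9, orbiter-Earth contact only at t = 20..24.
plan = {("lander", "orbiter"): set(range(5, 10)),
        ("orbiter", "earth"): set(range(20, 25))}
arrivals = simulate(plan, ["b1", "b2"], ["lander", "orbiter", "earth"])
print(arrivals)   # → {'b1': 20, 'b2': 20}
```

Bundles injected at the lander wait at the orbiter until its Earth contact opens at t = 20, which is exactly the intermittent-connectivity behavior a BP benchmark exercises.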

  2. Program NAJOCSC and space charge effect simulation in C01

    Energy Technology Data Exchange (ETDEWEB)

    Tang, J.Y.; Chabert, A.; Baron, E

    1999-03-10

During the beam tests of the THI project at GANIL, it proved difficult to increase the beam power above 2 kW at CSS2 extraction. The space charge effect (abbreviated as S.C. effect) in cyclotrons is suspected to play some role in this phenomenon, especially the longitudinal S.C. effect and the coupling between longitudinal and radial motions. The injector cyclotron C01 is studied, and the role played by the S.C. effect in this cyclotron in the THI case is investigated by simulation. (K.A.) 12 refs.

  3. Human habitat positioning system for NASA's space flight environmental simulator

    Science.gov (United States)

    Caldwell, W. F.; Tucker, J.; Keas, P.

    1998-01-01

Artificial gravity by centrifugation offers an effective countermeasure to the physiologic deconditioning caused by chronic exposure to microgravity; however, the system requirements of rotational velocity, radius of rotation, and resultant centrifugal acceleration require thorough investigation to ascertain the ideal human-use centrifuge configuration. NASA's Space Flight Environmental Simulator (SFES), a 16-meter (52-foot) diameter centrifuge originally built for animal use, was recently modified to accommodate human occupancy. This paper describes the SFES Human Habitat Positioning System, the mechanism that allows the radius of rotation to be varied and aligns the centrifuge occupants with the artificial gravity vector.
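The trade between rotational velocity, radius, and centrifugal acceleration that the positioning system must accommodate follows directly from a = ω²r. A small sketch (the 8 m radius comes from the SFES's 16-meter diameter; the 1.8 m rider height is an assumed value):

```python
import math

def spin_rate_rpm(radius_m, accel_g=1.0, g0=9.81):
    """Rotation rate needed for a given centripetal acceleration at radius r.

    From a = omega^2 * r:  omega = sqrt(a / r), converted to rev/min.
    """
    omega = math.sqrt(accel_g * g0 / radius_m)   # rad/s
    return omega * 60.0 / (2.0 * math.pi)

def gravity_gradient(radius_m, height_m=1.8):
    """Ratio of acceleration at the feet (full radius) to that at the head
    (radius minus body height) for a rider standing along the radius."""
    return radius_m / (radius_m - height_m)

# 8 m radius (the SFES is 16 m in diameter)
print(round(spin_rate_rpm(8.0), 1))      # ~10.6 rpm for 1 g at the rim
print(round(gravity_gradient(8.0), 2))   # ~1.29 feet-to-head acceleration ratio
```

The steep head-to-foot gradient at short radius is one reason the radius of rotation needs to be adjustable.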

  4. Simulations of space charge neutralization in a magnetized electron cooler

    Energy Technology Data Exchange (ETDEWEB)

Gerity, James [Texas A&M]; McIntyre, Peter M. [Texas A&M]; Bruhwiler, David Leslie [RadiaSoft, Boulder]; Hall, Christopher [RadiaSoft, Boulder]; Moens, Vince Jan [Ecole Polytechnique, Lausanne]; Park, Chong Shik [Fermilab]; Stancari, Giulio [Fermilab]

    2017-02-02

    Magnetized electron cooling at relativistic energies and Ampere scale current is essential to achieve the proposed ion luminosities in a future electron-ion collider (EIC). Neutralization of the space charge in such a cooler can significantly increase the magnetized dynamic friction and, hence, the cooling rate. The Warp framework is being used to simulate magnetized electron beam dynamics during and after the build-up of neutralizing ions, via ionization of residual gas in the cooler. The design follows previous experiments at Fermilab as a verification case. We also discuss the relevance to EIC designs.

  5. Magnetic Null Points in Kinetic Simulations of Space Plasmas

    Science.gov (United States)

    Olshevsky, Vyacheslav; Deca, Jan; Divin, Andrey; Peng, Ivy Bo; Markidis, Stefano; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni

    2016-03-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely created in space plasmas: in all our simulations except lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere the number of spiral nulls prevails over the number of radial nulls by a factor of 3-9. We show that often magnetic nulls do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines reminiscent of Z-pinches efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data.
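The spiral/radial distinction used above can be made concrete: at a null, div B = 0 forces the eigenvalues of the Jacobian ∂B_i/∂x_j to sum to zero, and the null is radial when all three eigenvalues are real, spiral when two form a complex-conjugate pair. A minimal sketch with hand-constructed Jacobians (not fields extracted from iPic3D):

```python
import numpy as np

def classify_null(jacobian):
    """Classify a 3D magnetic null from the Jacobian dB_i/dx_j at the null.

    Because div B = 0 the eigenvalues sum to zero; three real eigenvalues
    give a radial (X-type) null, a complex-conjugate pair gives a spiral null.
    """
    eig = np.linalg.eigvals(jacobian)
    return "radial" if np.allclose(eig.imag, 0.0, atol=1e-12) else "spiral"

# Radial null: B = (x, y, -2z) -> real eigenvalues (1, 1, -2)
radial_J = np.diag([1.0, 1.0, -2.0])

# Spiral null: a strong rotational (current-carrying) component in the x-y
# plane gives eigenvalues 0.1 +/- i and -0.2 (trace still zero)
spiral_J = np.array([[0.1, -1.0, 0.0],
                     [1.0,  0.1, 0.0],
                     [0.0,  0.0, -0.2]])

print(classify_null(radial_J))   # radial
print(classify_null(spiral_J))   # spiral
```

The strong axial current that produces the complex pair is the same feature that makes spiral nulls the preferred dissipation sites in the simulations.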

  6. Physical layer simulator for undersea free-space laser communications

    Science.gov (United States)

    Dalgleish, Fraser R.; Shirron, Joseph J.; Rashkin, David; Giddings, Thomas E.; Vuorenkoski Dalgleish, Anni K.; Cardei, Ionut; Ouyang, Bing; Caimi, Frank M.; Cardei, Mihaela

    2014-05-01

    High bandwidth (10 to 100 Mbps), real-time data networking in the subsea environment using free-space lasers has a potentially high impact as an enabling technology for a variety of future subsea operations in the areas of distributed sensing, real-time wireless data transfer, control of unmanned undersea vehicles, and other submerged assets. However, the development and testing of laser networking equipment in the undersea environment are expensive and time consuming, and there is a clear need for a network simulation framework that will allow researchers to evaluate the performance of alternate optical and electronic configurations under realistic operational and environmental constraints. The overall objective of the work reported in this paper was to develop and validate such a simulation framework, which consists of (1) a time-dependent radiative transfer model to accurately predict the channel impulse characteristics for alternate system designs over a range of geometries and optical properties and (2) digital modulation and demodulation blocks which accurately simulate both laser source and receiver noise characteristics in order to generate time domain bit stream samples that can be digitally demodulated to predict the resulting bit error rate of the simulated link.
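A caricature of item (2), the modulation/demodulation blocks, shows how time-domain samples plus receiver noise turn into a bit-error-rate estimate. Gaussian noise and the SNR scaling below are illustrative stand-ins for the paper's laser-source and receiver noise models, which also include channel impulse effects:

```python
import numpy as np

rng = np.random.default_rng(0)

def ook_ber(snr_db, n_bits=200_000):
    """Monte Carlo bit-error rate for on-off keying with additive Gaussian
    noise. The noise model and SNR scaling are assumptions for illustration,
    not the paper's validated receiver model.
    """
    bits = rng.integers(0, 2, n_bits)
    amplitude = 1.0
    sigma = amplitude / (2.0 * 10.0 ** (snr_db / 20.0))  # assumed scaling
    received = bits * amplitude + rng.normal(0.0, sigma, n_bits)
    decided = (received > amplitude / 2.0).astype(int)   # mid-level threshold
    return float(np.mean(decided != bits))

for snr in (6, 10, 14):
    print(snr, ook_ber(snr))   # BER falls as the link budget improves
```

In the full simulator, the radiative-transfer channel impulse response would be convolved with the transmitted waveform before the noise is added, introducing intersymbol interference that this sketch omits.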

  7. Retos de la bioética en la medicina del siglo XXI Challenges of bioethics in twenty-first century medicine

    Directory of Open Access Journals (Sweden)

    Jorge Alberto Álvarez-Díaz

    2011-12-01

To propose possible challenges for bioethics in twenty-first-century medicine, it is necessary to consider that there were challenges in the past (at the origin of the new discipline called bioethics); that the challenges have been modified by scientific, biomedical, and humanistic advances; and that the challenges that may arise in the future will be, in different ways, a result of this historical evolution. The major challenges ahead are: the unsolved problems of justice, equity, and poverty; the challenges posed by the introduction of new technologies, with the paradigm of nanomedicine; and the challenges driven by advances in the neurosciences, with the paradigm of neuroethics.

  8. Thomas Piketty: Capital in the Twenty-First Century (Le Capital au XXIe siècle; English translation by Arthur Goldhammer).

    Directory of Open Access Journals (Sweden)

    Gylfi Magnússon

    2014-06-01

From the reviewer's assessment: The book is not intended to be the last word on its subject but rather a foundation for further discussion and research. In this it succeeds. Capital in the Twenty-First Century has already sparked extensive debate and will doubtless be discussed for years to come. It is practically required reading for anyone intending to write about macroeconomics and the role of the public sector, however much they agree or disagree with the author.

  9. Identification and future description of warming signatures over Pakistan with special emphasis on evolution of CO2 levels and temperature during the first decade of the twenty-first century.

    Science.gov (United States)

    Haider, Khadija; Khokhar, Muhammad Fahim; Chishtie, Farrukh; RazzaqKhan, Waseem; Hakeem, Khalid Rehman

    2017-03-01

Like other developing countries, Pakistan faces decadal temperature changes and other climatic abnormalities such as droughts and torrential rains. To assess the extent of temperature change over Pakistan, the country was divided into five climatic zones ranging from very cold to hot and dry climates. Similarly, seasons in Pakistan are defined on the basis of monsoon variability as winter, pre-monsoon, monsoon, and post-monsoon. This study primarily focuses on the comparison of surface temperature observations from the Pakistan Meteorological Department (PMD) network with PRECIS (Providing Regional Climates for Impacts Studies) model simulations. Results indicate that PRECIS underestimates the temperature in Northern Pakistan and during the winter season. However, there is fair agreement between PRECIS output and observed datasets in the lower plain and hot areas of the country. An absolute increase of 0.07 °C is observed in the mean temperature over Pakistan during the period 1951-2010. The increase is especially significant (0.7 °C) during the last 14 years (1997-2010). Moreover, SCIAMACHY observations were used to explore the evolution of atmospheric CO2 levels in comparison to temperature over Pakistan. CO2 levels have shown an increasing trend during the first decade of the twenty-first century.

  10. An FPGA computing demo core for space charge simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each pairwise calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
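The core's table-based evaluation of the inverse square-root cube, (r²)^(-3/2), can be sketched in software. The octave reduction and the 10-bit index mimic the idea of addressing a memory with the most significant non-zero bits; the real core works in 16-bit fixed point, which is not reproduced here:

```python
import math

TABLE_BITS = 10
N = 1 << TABLE_BITS

# Table of (r^2)^(-3/2) sampled at bin midpoints over one octave [1, 2);
# other octaves are handled by exponent scaling.
TABLE = [(1.0 + (i + 0.5) / N) ** -1.5 for i in range(N)]

def inv_r3(r2):
    """Approximate (r2)^(-3/2) by table look-up on the mantissa of r2."""
    e = math.frexp(r2)[1] - 1        # r2 = m * 2^e with m in [1, 2)
    m = r2 / 2.0 ** e
    idx = int((m - 1.0) * N)
    return TABLE[idx] * 2.0 ** (-1.5 * e)

# Compare against the exact value for a sample squared separation
r2 = 3.7
approx, exact = inv_r3(r2), r2 ** -1.5
print(abs(approx - exact) / exact)   # relative error well below 1 %
```

Multiplying this factor by the separation vector gives the Coulomb force on each pair without any divide or square-root unit, which is the resource saving the paper exploits.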

  11. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis)

    Science.gov (United States)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micro meteoroid impact, as well as the optional definition of particular spacecraft specific influences like tank sloshing, reaction wheel behaviour
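The innermost piece of such a propagator, torque-free rigid-body rotation in the body frame, can be sketched as follows. The inertia values are hypothetical placeholders for an ENVISAT-class body, and IOTA itself adds the external torques listed above:

```python
import numpy as np

# Principal moments of inertia (kg m^2) -- assumed illustrative values
I = np.array([17000.0, 124000.0, 129000.0])

def omega_dot(w):
    """Euler's equations for torque-free rotation in the body frame."""
    return np.array([
        (I[1] - I[2]) * w[1] * w[2] / I[0],
        (I[2] - I[0]) * w[2] * w[0] / I[1],
        (I[0] - I[1]) * w[0] * w[1] / I[2],
    ])

def rk4_step(w, dt):
    """One fourth-order Runge-Kutta step of the angular velocity."""
    k1 = omega_dot(w)
    k2 = omega_dot(w + 0.5 * dt * k1)
    k3 = omega_dot(w + 0.5 * dt * k2)
    k4 = omega_dot(w + dt * k3)
    return w + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

w = np.array([0.02, 0.001, 0.0005])   # rad/s, tumbling mostly about one axis
h0 = np.linalg.norm(I * w)            # angular momentum magnitude
for _ in range(10000):
    w = rk4_step(w, 0.1)
h1 = np.linalg.norm(I * w)
print(abs(h1 - h0) / h0)              # conserved to numerical precision
```

Conservation of angular momentum magnitude is a useful sanity check before the drag, radiation-pressure, and eddy-current torques are switched on.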

  12. Virtual Reality Simulation of the International Space Welding Experiment

    Science.gov (United States)

    Phillips, James A.

    1996-01-01

Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted display and tested software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation.

  13. Magnetic null points in kinetic simulations of space plasmas

    CERN Document Server

    Olshevsky, Vyacheslav; Divin, Andrey; Peng, Ivy Bo; Markidis, Stefano; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni

    2015-01-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic Particle-in-Cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind; and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely created in space plasmas: in all our simulations except lunar magnetic anomaly and quadrupolar mini-magnetosphere the number of spiral nulls prevails over the number of radial nulls by a factor of 3-9. We show that often magnetic nulls do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and lunar magnetic ano...

  14. Distributed interactive communication in simulated space-dwelling groups.

    Science.gov (United States)

    Brady, Joseph V; Hienz, Robert D; Hursh, Steven R; Ragusa, Leonard C; Rouse, Charles O; Gasior, Eric D

    2004-03-01

    This report describes the development and preliminary application of an experimental test bed for modeling human behavior in the context of a computer generated environment to analyze the effects of variations in communication modalities, incentives and stressful conditions. In addition to detailing the methodological development of a simulated task environment that provides for electronic monitoring and recording of individual and group behavior, the initial substantive findings from an experimental analysis of distributed interactive communication in simulated space dwelling groups are described. Crews of three members each (male and female) participated in simulated "planetary missions" based upon a synthetic scenario task that required identification, collection, and analysis of geologic specimens with a range of grade values. The results of these preliminary studies showed clearly that cooperative and productive interactions were maintained between individually isolated and distributed individuals communicating and problem-solving effectively in a computer-generated "planetary" environment over extended time intervals without benefit of one another's physical presence. Studies on communication channel constraints confirmed the functional interchangeability between available modalities with the highest degree of interchangeability occurring between Audio and Text modes of communication. The effects of task-related incentives were determined by the conditions under which they were available with Positive Incentives effectively attenuating decrements in performance under stressful time pressure.

  15. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
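A minimal sketch of the kind of schema described, station objects plus user annotations attached to them, can be put together with SQLite. The table and column names are illustrative inventions, not those of the actual Intelligent Virtual Station database:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE station_object (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    module  TEXT                      -- e.g. which ISS module it lives in
);
CREATE TABLE annotation (
    id        INTEGER PRIMARY KEY,
    object_id INTEGER NOT NULL REFERENCES station_object(id),
    author    TEXT,
    body      TEXT NOT NULL           -- user-supplied note or procedure step
);
""")

db.execute("INSERT INTO station_object (name, module) VALUES (?, ?)",
           ("smoke detector 2", "US Lab"))
db.execute("INSERT INTO annotation (object_id, author, body) VALUES (1, ?, ?)",
           ("trainer", "Verify inhibits before removal."))

# Retrieve an object together with the information users associated with it
row = db.execute("""
    SELECT o.name, a.body FROM station_object o
    JOIN annotation a ON a.object_id = o.id
""").fetchone()
print(row)   # ('smoke detector 2', 'Verify inhibits before removal.')
```

The one-to-many annotation table is what lets users attach new information to an object without touching the underlying design data.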

  16. Future projections of synoptic weather types over the Arabian Peninsula during the twenty-first century using an ensemble of CMIP5 models

    Science.gov (United States)

    El Kenawy, Ahmed M.; McCabe, Matthew F.

    2016-07-01

An assessment of future change in synoptic conditions over the Arabian Peninsula throughout the twenty-first century was performed using 20 climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. We employed the mean sea level pressure (SLP) data from model output together with NCEP/NCAR reanalysis data and compared the relevant circulation types produced by the Lamb classification scheme for the base period 1975-2000. Overall, model results illustrated good agreement with the reanalysis, albeit with a tendency to underestimate cyclonic (C) and southeasterly (SE) patterns and to overestimate anticyclones and directional flows. We also investigated future projections for each circulation type during the rainy season (December-May) using three Representative Concentration Pathways (RCPs), comprising RCP2.6, RCP4.5, and RCP8.5. Overall, two scenarios (RCP4.5 and RCP8.5) revealed a statistically significant increase in weather types favoring above-normal rainfall in the region (e.g., C and E-types). In contrast, weather types associated with lower amounts of rainfall (e.g., anticyclones) are projected to decrease in winter but increase in spring. For all scenarios, there was consistent agreement on the sign of change (i.e., positive/negative) for the most frequent patterns (e.g., C, SE, E, and A-types), whereas the sign was uncertain for less recurrent types (e.g., N, NW, SE, and W). The projected changes in weather type frequencies in the region can be viewed not only as indicators of change in rainfall response but may also be used to inform impact studies pertinent to water resource planning and management, extreme weather analysis, and agricultural production.
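The decision rules of the objective (Jenkinson-Collison) form of Lamb typing can be sketched once the flow and vorticity indices are in hand. The real scheme derives these indices from SLP at 16 grid points with latitude-dependent coefficients and includes a light-flow/unclassified category; both are omitted in this simplified illustration:

```python
import math

def classify(w, s, z):
    """Simplified decision rules of the objective Lamb scheme.

    w, s: westerly and southerly geostrophic flow components
    z:    total shear vorticity (same units as the flow)
    The derivation of w, s, z from the SLP grid is omitted for brevity.
    """
    f = math.hypot(w, s)                                        # flow strength
    direction = (math.degrees(math.atan2(-w, -s)) + 360) % 360  # wind "from"
    sectors = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    sector = sectors[int((direction + 22.5) // 45) % 8]
    if abs(z) < f:
        return sector                        # purely directional type
    if abs(z) > 2 * f:
        return "C" if z > 0 else "A"         # purely cyclonic/anticyclonic
    return ("C" if z > 0 else "A") + sector  # hybrid type

print(classify(w=5.0, s=0.5, z=0.5))   # westerly flow -> W
print(classify(w=1.0, s=0.0, z=6.0))   # vorticity-dominated -> C
print(classify(w=2.0, s=2.0, z=-5.0))  # hybrid anticyclonic-southwesterly -> ASW
```

Counting how often each label occurs per season in the reanalysis versus each CMIP5 model is what produces the frequency comparisons reported above.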

  17. Future projections of synoptic weather types over the Arabian Peninsula during the twenty-first century using an ensemble of CMIP5 models

    KAUST Repository

    El Kenawy, Ahmed M.

    2016-07-28

An assessment of future change in synoptic conditions over the Arabian Peninsula throughout the twenty-first century was performed using 20 climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. We employed the mean sea level pressure (SLP) data from model output together with NCEP/NCAR reanalysis data and compared the relevant circulation types produced by the Lamb classification scheme for the base period 1975–2000. Overall, model results illustrated good agreement with the reanalysis, albeit with a tendency to underestimate cyclonic (C) and southeasterly (SE) patterns and to overestimate anticyclones and directional flows. We also investigated future projections for each circulation type during the rainy season (December–May) using three Representative Concentration Pathways (RCPs), comprising RCP2.6, RCP4.5, and RCP8.5. Overall, two scenarios (RCP4.5 and RCP8.5) revealed a statistically significant increase in weather types favoring above-normal rainfall in the region (e.g., C and E-types). In contrast, weather types associated with lower amounts of rainfall (e.g., anticyclones) are projected to decrease in winter but increase in spring. For all scenarios, there was consistent agreement on the sign of change (i.e., positive/negative) for the most frequent patterns (e.g., C, SE, E, and A-types), whereas the sign was uncertain for less recurrent types (e.g., N, NW, SE, and W). The projected changes in weather type frequencies in the region can be viewed not only as indicators of change in rainfall response but may also be used to inform impact studies pertinent to water resource planning and management, extreme weather analysis, and agricultural production.

  18. Trends in survival of chronic lymphocytic leukemia patients in Germany and the USA in the first decade of the twenty-first century

    Directory of Open Access Journals (Sweden)

    Dianne Pulte

    2016-03-01

Background Recent population-based studies in the United States of America (USA) and other countries have shown improvements in survival for patients with chronic lymphocytic leukemia (CLL) diagnosed in the early twenty-first century. Here, we examine the survival of patients diagnosed with CLL in Germany in 1997–2011. Methods Data were extracted from 12 cancer registries in Germany and compared to data from the USA. Period analysis was used to estimate 5- and 10-year relative survival (RS). Results Five- and 10-year RS estimates in 2009–2011 were 80.2 and 59.5 %, respectively, in Germany and 82.4 and 64.7 %, respectively, in the USA. Overall, 5-year RS increased significantly in Germany, narrowing the difference from the USA, where survival slightly decreased between 2003–2005 and 2009–2011. However, age-specific analyses showed persistently higher survival in the USA for all ages except 15–44. In general, survival decreased with age, but the age-related disparity was small for patients younger than 75. In both countries, 5-year RS was >80 % for patients less than 75 years of age but <70 % for those aged 75+. Conclusions Overall, 5-year survival for patients with CLL is good, but 10-year survival is significantly lower, and survival was much lower for those aged 75+. Major differences in survival between countries were not observed. Further research into ways to increase survival for older CLL patients is needed to reduce the persistently large age-related survival disparity.

  19. Specific Antigens by Federal Entity in Patients at the Transplant Unit of Specialities Hospital, National Medical Center Twenty-First Century, Mexico.

    Science.gov (United States)

    Hernández Rivera, J C H; Ibarra Villanueva, A; Espinoza Pérez, R; Cancino López, J D; Silva Rueda, I R; Rodríguez Gómez, R; García Covarrubias, L; Reyes Díaz, E; Pérez López, M J; Salazar Mendoza, M

    2016-03-01

The study of kidney transplantation involves understanding the immunologic basis, such as histocompatibility, and the genetic basis of a population. In Mexico, the study of the genetic basis has led to a genetic map by federal entity. We performed an HLA study of 1,276 kidney transplant patients (recipients and donors) in the Specialities Hospital of the National Medical Center Twenty-First Century, determining HLA class I (A, B, and Cw) and class II (DRβ1 and DQβ1) antigens with the use of SSOP-PCR. A descriptive analysis was conducted with measures of central tendency (mean, SD). Of the 1,276 patients studied, we obtained 2,552 results for each class from the two haplotypes; for HLA-Cw we processed 796 patients, for a total of 1,592 antigens for this class. We found antigens specific to each federal entity: the Federal District had the highest number of specific antigens (10), followed by Morelos (7), Querétaro and Mexico State (3 each), and Tamaulipas, Aguascalientes, Michoacán, Guerrero, Puebla, and Oaxaca (1 each). The genetic map allows us to determine the proportions of antigens in every state in the center and south of Mexico, owing to the diversity and area of influence of the National Medical Center Twenty-First Century as well as the large number of patients. Furthermore, proportionally distinct genetic roots are still preserved in every entity. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. MAGNETIC NULL POINTS IN KINETIC SIMULATIONS OF SPACE PLASMAS

    Energy Technology Data Exchange (ETDEWEB)

    Olshevsky, Vyacheslav; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni [Centre for Mathematical Plasma Astrophysics (CmPA), KU Leuven (Belgium); Deca, Jan [Laboratory for Atmospheric and Space Physics (LASP), University of Colorado Boulder, Boulder, CO (United States); Divin, Andrey [St. Petersburg State University, St. Petersburg (Russian Federation); Peng, Ivy Bo; Markidis, Stefano, E-mail: sya@mao.kiev.ua [High Performance Computing and Visualization (HPCViz), KTH Royal Institute of Technology, Stockholm (Sweden)

    2016-03-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely created in space plasmas: in all our simulations except lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere the number of spiral nulls prevails over the number of radial nulls by a factor of 3–9. We show that often magnetic nulls do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines reminiscent of Z-pinches efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data.

  1. TID Simulation of Advanced CMOS Devices for Space Applications

    Science.gov (United States)

    Sajid, Muhammad

    2016-07-01

This paper focuses on Total Ionizing Dose (TID) effects caused by the accumulation of charge in silicon dioxide, at the substrate/silicon dioxide interface, and in the Shallow Trench Isolation (STI) of scaled CMOS bulk devices, as well as in the Buried Oxide (BOX) layer of devices based on Silicon-On-Insulator (SOI) technology, for operation in the space radiation environment. The radiation-induced leakage current and the corresponding electron concentration in the leakage-current path are presented for 180 nm, 130 nm, and 65 nm NMOS and PMOS transistors based on CMOS bulk as well as SOI process technologies on board LEO and GEO satellites. On the basis of the simulation results, a TID robustness analysis for advanced deep sub-micron technologies was carried out up to 500 krad. The correlation between technology scaling and the magnitude of the leakage current at a given total dose was established using the Visual TCAD Genius program.

  2. Space Environment Simulation for Material Processing by Acoustic Levitation

    Institute of Scientific and Technical Information of China (English)

    解文军; 魏炳波

    2001-01-01

Single-axis acoustic levitation of four polymer samples has been realized in air under ground-based laboratory conditions for the purpose of space environment simulation of containerless processing. The levitation capabilities are investigated by numerical calculations based on a boundary element model of our levitator, following Gor'kov and Barmatz's method. The calculated results, such as the resonant distance between the reflector and the vibrating source and the positions of the levitated samples, agree well with experimental observation, and the effect of gravity on the time-averaged potential of the levitation force is also revealed. As an application, the containerless melting and solidification of a liquid crystal, 4-pentylphenyl-4'-methylbenzoate, is successfully accomplished, in which undercooling of up to 16 K is obtained; the rotation and oscillation of the sample during solidification may result in fragmentation of the usual radiating surface growth morphology.
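The time-averaged potential mentioned above is commonly written in Gor'kov's form. A sketch for a small rigid sphere in a 1D standing wave p(x,t) = pa·cos(kx)·cos(ωt) follows; the material and drive numbers are rough, illustrative values, not those of the experiment:

```python
import numpy as np

rho0, c0 = 1.2, 343.0          # air density (kg/m^3) and sound speed (m/s)
rho_p, c_p = 1100.0, 2000.0    # particle density and sound speed (assumed)
R = 1e-3                       # particle radius (m)
f = 20e3                       # drive frequency (Hz)
pa = 2000.0                    # pressure amplitude (Pa)

k = 2 * np.pi * f / c0
f1 = 1 - rho0 * c0**2 / (rho_p * c_p**2)      # monopole coefficient
f2 = 2 * (rho_p - rho0) / (2 * rho_p + rho0)  # dipole coefficient

x = np.linspace(0, np.pi / k, 2001)           # one half wavelength
p2 = 0.5 * pa**2 * np.cos(k * x) ** 2                     # <p^2>
v2 = 0.5 * (pa / (rho0 * c0)) ** 2 * np.sin(k * x) ** 2   # <v^2>

# Gor'kov potential: U = 2*pi*R^3*rho0*(<p^2>*f1/(3*rho0^2*c0^2) - <v^2>*f2/2)
U = 2 * np.pi * R**3 * rho0 * (p2 * f1 / (3 * rho0**2 * c0**2) - v2 * f2 / 2)

x_min = x[np.argmin(U)]
print(x_min * k / np.pi)   # ~0.5: the trap sits at the pressure node
```

Adding a gravitational term m·g·x to U shifts the equilibrium slightly below the pressure node, which is the kind of gravity effect on the time-averaged potential that the calculations reveal.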

  3. Simulating strongly correlated multiparticle systems in a truncated Hilbert space

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, Thomas; Hallwood, David W.; Gulliksen, Jake; Brand, Joachim [New Zealand Institute for Advanced Study and Centre for Theoretical Chemistry and Physics, Massey University, Private Bag 102904, North Shore, Auckland 0745 (New Zealand); Meyer, Hans-Dieter [Theoretische Chemie, Physikalisch-Chemisches Institut, Universitaet Heidelberg, Im Neuenheimer Feld 229, D-69120 Heidelberg (Germany)

    2011-08-15

    Representing a strongly interacting multiparticle wave function in a finite product basis leads to errors. A simple rescaling of the contact interaction can preserve the low-lying energy spectrum and long-wavelength structure of wave functions in one-dimensional systems and thus correct for the basis-set truncation error. The analytic form of the rescaling is found for a two-particle system, where the rescaling is exact. A detailed comparison between finite Hilbert space calculations and exact results for up to five particles shows that rescaling can significantly improve the accuracy of numerical calculations in various external potentials. In addition to ground-state energies, the low-lying excitation spectrum, density profile, and correlation functions are studied. The results give a promising outlook for numerical simulations of trapped ultracold atoms.
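The truncation error the abstract addresses is easy to reproduce: diagonalizing a contact interaction in a truncated single-particle basis converges slowly (and variationally from above), which is exactly what the proposed rescaling compensates. The sketch below is a generic two-particle illustration in a unit box with quadrature matrix elements, not the authors' calculation.

```python
import numpy as np

# Two distinguishable particles in a 1D box of unit length with a contact
# interaction g*delta(x1 - x2), expanded in a truncated product basis of box
# eigenfunctions (hbar = m = 1). Illustrative, not the paper's system.
def ground_energy(nmax, g=2.0, npts=400):
    x = np.linspace(0.0, 1.0, npts)
    dx = x[1] - x[0]
    phi = np.array([np.sqrt(2.0) * np.sin(n * np.pi * x) for n in range(1, nmax + 1)])
    e = np.array([0.5 * (n * np.pi) ** 2 for n in range(1, nmax + 1)])
    pairs = [(i, j) for i in range(nmax) for j in range(nmax)]
    H = np.zeros((len(pairs), len(pairs)))
    for a, (m1, m2) in enumerate(pairs):
        for b, (n1, n2) in enumerate(pairs):
            # <m1 m2| g delta(x1-x2) |n1 n2> = g * integral of phi_m1 phi_m2 phi_n1 phi_n2
            H[a, b] = g * np.sum(phi[m1] * phi[m2] * phi[n1] * phi[n2]) * dx
        H[a, a] += e[pairs[a][0]] + e[pairs[a][1]]
    return np.linalg.eigvalsh(H)[0]

# Enlarging the basis lowers the variational ground-state energy, but slowly;
# this slow convergence is the truncation error that rescaling g compensates.
print([round(ground_energy(n), 4) for n in (2, 4, 6)])
```

Because the bases are nested, the ground-state energy decreases monotonically with `nmax` (Cauchy interlacing), and for g = 0 it reduces exactly to the non-interacting value pi^2.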

  4. Analysis of the Thermo-Elastic Response of Space Reflectors to Simulated Space Environment

    Science.gov (United States)

    Allegri, G.; Ivagnes, M. M.; Marchetti, M.; Poscente, F.

    2002-01-01

    The evaluation of space environment effects on materials and structures is a key issue in the proper design of long-duration missions: since a large proportion of satellites operating in the Earth orbital environment are employed for telecommunications, the development of space antennas and reflectors with high dimensional stability under space environment interactions represents a major challenge for designers. The structural layout of state-of-the-art space antennas and reflectors is very complex, since several different sensitive elements and materials are employed: particular care must be taken in evaluating the actual geometrical configuration of reflectors operating in the space environment, since very small distortions of the designed layout can severely degrade the quality of the signal, both received and transmitted, especially for antennas operating at high frequencies. The effects of thermal loads due to direct sunlight exposure and to Earth and Moon albedo can easily be taken into account with standard methods of structural analysis; on the other hand, thermal cycling and exposure to the vacuum environment produce a long-term damage accumulation which affects the whole structure. Typical effects of such exposure are the outgassing of polymeric materials and the contamination of exposed surfaces, which can significantly affect the thermo-mechanical properties of the materials themselves and, therefore, the global structural response. The main aim of the present paper is to evaluate the synergistic effects of thermal cycling and exposure to a high-vacuum environment on an innovative antenna developed by Alenia Spazio S.p.A.: to this purpose, both experimental and numerical research activities have been carried out. A complete prototype of the antenna was exposed to the space environment simulated by the SAS facility: the latter consists of a high-vacuum chamber, equipped with

  5. Imaging Simulations for DESTINY, the Dark Energy Space Telescope

    Science.gov (United States)

    Lauer, T. R.; DESTINY Science Team

    2004-12-01

    We describe a mission concept for a 1.8-meter near-infrared (NIR) grism-mode space telescope optimized to return richly sampled Hubble diagrams of Type Ia and Type II supernovae (SNe) at redshifts above 0.5, probing the expansion of the Universe as a function of time and characterizing the nature of dark energy. The central concept for our proposed Dark Energy Space Telescope (DESTINY) is an all-grism NIR survey camera. SNe will be discovered by repeated imaging of an area located at the north ecliptic pole. Grism spectra with resolving power λ/Δλ = R ≈ 100 will provide broad-band spectrophotometry, redshifts, and SNe classification, as well as valuable time-resolved diagnostic data for understanding the SN explosion physics. Our approach features only a single mode of operation, a single detector technology, and a single instrument. Although grism spectroscopy is slow compared to SN detection in any single broad-band filter for photometry, or to conventional slit spectra for spectral diagnostics, the multiplex advantage of observing a large field of view over a full octave in wavelength simultaneously makes this approach highly competitive. In this poster we present exposure simulations to demonstrate the efficiency of the DESTINY approach.

  6. Mutagenesis of Bacillus subtilis spores exposed to simulated space environment

    Science.gov (United States)

    Munakata, N.; Natsume, T.; Takahashi, K.; Hieda, K.; Panitz, C.; Horneck, G.

    Bacterial spores can endure a variety of extreme terrestrial environments. However, some conditions encountered during spaceflight could be detrimental to DNA in the spore, limiting the possibility of transpermia. We investigate the genetic consequences of exposure to space environments in a series of preflight simulation projects for EXPOSE. Using Bacillus subtilis spores of the repair-proficient HA101 and repair-deficient TKJ6312 strains, mutations conferring resistance to rifampicin were detected, isolated and sequenced. Most of the mutations were located in the N-terminal region of the rpoB gene encoding the RNA polymerase beta-subunit. Among several potentially mutagenic factors, high vacuum, UV radiation, heat, and accelerated heavy ions induced mutations with varying efficiencies. A majority of the mutations induced by vacuum exposure carried a tandem double-base change (CA to TT) in a unique sequence context, TCAGC. The results indicate that vacuum and high temperature may act synergistically in the induction of mutations.

  7. Simulating atmospheric free-space optical propagation: rainfall attenuation

    Science.gov (United States)

    Achour, Maha

    2002-04-01

    With recent advances in and interest in Free-Space Optics (FSO) for commercial deployments, more attention has been placed on FSO weather effects and the availability of global weather databases. The Meteorological Visual Range (visibility) is considered one of the main weather parameters needed to estimate FSO attenuation due to haze, fog and low clouds. A proper understanding of the visibility measurements conducted over the years is essential. Unfortunately, such information is missing from most databases, leaving FSO players no choice but to use the standard visibility equation based on 2% contrast and other assumptions about the source luminance and its background. Another challenge is that visibility is measured at the visual wavelength of 550 nm; extrapolating the measured attenuations to longer infrared wavelengths is not trivial and requires extensive experimentation. Scattering of electromagnetic waves by spherical droplets of different sizes is considered to simulate FSO scattering effects. This paper serves as an introduction to a series of publications on simulating FSO atmospheric propagation; this first part focuses on attenuation due to rainfall. Additional weather parameters, such as rainfall rate, temperature and relative humidity, are used to build the rain model, and comparison with previously published experimental measurements validates the model. The scattering cross section due to rain is derived from the density of different raindrop sizes, and the raindrop fall velocity is derived from the overall rainfall rate. Absorption due to the presence of water vapor is computed from the temperature and relative humidity measurements.
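As a concrete example of rain-attenuation modeling, a widely quoted empirical power law relates specific attenuation to rainfall rate. The coefficients below are Carbonneau's commonly cited values for FSO links; they are used here as an assumption and are not the scattering-cross-section model the paper derives.

```python
def rain_attenuation_db_per_km(rain_rate_mm_per_h, k=1.076, alpha=0.67):
    """Specific attenuation of a free-space optical link due to rain, using the
    empirical power law A = k * R**alpha (Carbonneau's coefficients; a generic
    sketch, not the paper's raindrop-size-distribution model)."""
    return k * rain_rate_mm_per_h ** alpha

for r in (2.5, 12.5, 25.0):    # light, medium, heavy rain [mm/h]
    print(f"{r:5.1f} mm/h -> {rain_attenuation_db_per_km(r):.2f} dB/km")
```

A heavy-rain rate of 25 mm/h yields roughly 9-10 dB/km with these coefficients, the same order of magnitude reported in FSO link-budget studies.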

  8. Sea-level rise and its possible impacts given a 'beyond 4°C world' in the twenty-first century.

    Science.gov (United States)

    Nicholls, Robert J; Marinova, Natasha; Lowe, Jason A; Brown, Sally; Vellinga, Pier; de Gusmão, Diogo; Hinkel, Jochen; Tol, Richard S J

    2011-01-13

    The range of future climate-induced sea-level rise remains highly uncertain, with continued concern that large increases in the twenty-first century cannot be ruled out. The biggest source of uncertainty is the response of the large ice sheets of Greenland and west Antarctica. Based on our analysis, a pragmatic estimate of sea-level rise by 2100, for a temperature rise of 4°C or more over the same time frame, is between 0.5 m and 2 m; the probability of rises at the high end is judged to be very low, but unquantifiable. However, if realized, an indicative analysis shows that the impact potential is severe, with a real risk of the forced displacement of up to 187 million people over the century (up to 2.4% of the global population). This is potentially avoidable by a widespread upgrade of protection, albeit rather costly, with up to 0.02 per cent of global gross domestic product needed, and a much higher share in certain nations. The likelihood of protection being successfully implemented varies between regions, and is lowest in small islands, Africa and parts of Asia; hence these regions are the most likely to see coastal abandonment. To respond to these challenges, a multi-track approach is required, which would also be appropriate if a temperature rise of less than 4°C were expected. Firstly, we should monitor sea level to detect any significant accelerations in the rate of rise in a timely manner. Secondly, we need to improve our understanding of the climate-induced processes that could contribute to rapid sea-level rise, especially the role of the two major ice sheets, to produce better models that quantify the likely future rise more precisely. Finally, responses need to be carefully considered via a combination of climate mitigation to reduce the rise and adaptation for the residual rise in sea level. In particular, long-term strategic adaptation plans for the full range of possible sea-level rise (and other change) need to be widely developed.

  9. Lights, camera, action research: The effects of didactic digital movie making on students' twenty-first century learning skills and science content in the middle school classroom

    Science.gov (United States)

    Ochsner, Karl

    Students are moving away from content consumption to content production. Short movies are uploaded onto video social networking sites and shared around the world. Unfortunately they usually contain little to no educational value, lack a narrative, and are rarely created in the science classroom. According to the new Arizona Technology standards and ISTE NET*S, along with the framework from the Partnership for 21st Century Learning Standards, our society demands that students not only learn curriculum, but think critically, problem solve effectively, and become adept at communicating and collaborating. Didactic digital movie making in the science classroom may be one way that these twenty-first century learning skills can be implemented. An action research study using a mixed-methods approach to collect data was used to investigate whether didactic movie making can help eighth grade students learn physical science content while incorporating the 21st century learning skills of collaboration, communication, problem solving and critical thinking through their group production. Over a five week period, students researched lessons, wrote scripts, acted, video recorded and edited a didactic movie that contained a narrative plot to teach a science strand from the Arizona State Standards in physical science. A pretest/posttest science content test and KWL chart were given before and after the innovation to measure content learned by the students. Students then took a 21st Century Learning Skills Student Survey to measure how much they perceived that communication, collaboration, problem solving and critical thinking were taking place during the production. An open-ended survey and a focus group of four students were used for qualitative analysis. Three science teachers used a project evaluation rubric to measure science content and production values from the movies. Triangulating the science content test, KWL chart, open-ended questions and the project evaluation rubric, it

  10. Globalização social: desafio do século XXI GLOBALIZATION SOCIAL: CHALLENGE OF THE TWENTY-FIRST CENTURY

    Directory of Open Access Journals (Sweden)

    Antônio Carlos dos Santos

    2010-08-01

    Full Text Available Much of the criticism of globalization stems from the direction it is taking. Although globalization is an ongoing, dynamic process, it has advanced in an unbalanced way, generating political, economic and social instability in several regions of the planet. This theoretical paper argues that the absence of social globalization is one of the factors that has unbalanced the dynamics of the globalization process. On the economic side, globalization proceeds rapidly and has already reached the most distant corners of the Earth, while on the social side it is absent in some regions and, in others, advances slowly and with little interest. The benefits of economic globalization are worth little without social globalization. That is the challenge of the twenty-first century.

  11. Doing It In The SWMF Way: From Separate Space Physics Simulation Programs To The Framework For Space Weather Simulation.

    Science.gov (United States)

    Volberg, O.; Toth, G.; Sokolov, I.; Ridley, A. J.; Gombosi, T. I.; de Zeeuw, D. C.; Hansen, K. C.; Chesney, D. R.; Stout, Q. F.; Powell, K. G.; Kane, K. J.; Oehmke, R. C.

    2003-12-01

    The NASA-funded Space Weather Modeling Framework (SWMF) is being developed to provide "plug-and-play" Sun-to-Earth simulation capabilities serving the space physics modeling community. In its fully developed form, the SWMF will comprise a series of interoperating models of physics domains, ranging from the surface of the Sun to the upper atmosphere of the Earth. In its current form the SWMF links together five models: Global Magnetosphere, Inner Heliosphere, Ionosphere Electrodynamics, Upper Atmosphere, and Inner Magnetosphere. The framework allows the model for any domain to be swapped. The SWMF is a structured collection of software building blocks that can be used or customized to develop Sun-Earth system modeling components and to assemble them into an application. The SWMF consists of utilities and data structures for creating model components and coupling them. It contains a Control Module, which controls the initialization and execution of the components and is responsible for component registration, the processor layout for each component, and the coupling schedules. A component is created from the user-supplied physics code by adding a wrapper, which provides the control functions, and a coupling interface to perform the data exchange with other components. Both the wrapper and the coupling interface are constructed from building blocks provided by the framework itself. The current SWMF implementation is based on component technology and uses many important concepts of object-oriented programming emulated in Fortran 90. Currently it runs on Linux Beowulf clusters, SGI Origin 2000, and Compaq ES45 machines.
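The wrapper-and-control-module architecture described above can be sketched generically. The sketch below is in Python (the SWMF itself is Fortran 90), and all class and method names are illustrative, not the real SWMF interfaces; only the component names follow the abstract.

```python
# Generic sketch of a component framework: each physics code sits behind a
# uniform wrapper (init/run/couple), is registered with a control module, and
# is coupled to other components on a schedule.
class ComponentWrapper:
    def __init__(self, name, physics_step):
        self.name, self.physics_step, self.state = name, physics_step, {}

    def init(self):                    # control function supplied by the wrapper
        self.state = {"t": 0.0}

    def run(self, dt):                 # advance the wrapped physics code
        self.physics_step(self.state, dt)

    def couple_to(self, other):        # data exchange with another component
        other.state[f"{self.name}:t"] = self.state["t"]

class ControlModule:
    """Registers components, lays out the coupling schedule, drives execution."""
    def __init__(self):
        self.components, self.couplings = {}, []

    def register(self, comp):
        self.components[comp.name] = comp

    def schedule_coupling(self, src, dst):
        self.couplings.append((src, dst))

    def execute(self, dt, nsteps):
        for c in self.components.values():
            c.init()
        for _ in range(nsteps):
            for c in self.components.values():
                c.run(dt)
            for src, dst in self.couplings:
                self.components[src].couple_to(self.components[dst])

ctrl = ControlModule()
ctrl.register(ComponentWrapper("GM", lambda s, dt: s.update(t=s["t"] + dt)))
ctrl.register(ComponentWrapper("IE", lambda s, dt: s.update(t=s["t"] + dt)))
ctrl.schedule_coupling("GM", "IE")   # Global Magnetosphere -> Ionosphere Electrodynamics
ctrl.execute(dt=1.0, nsteps=3)
print(ctrl.components["IE"].state)   # contains the coupled value under "GM:t"
```

The design point this illustrates is the one the abstract makes: the wrapper gives every physics code the same control surface, so the control module can register, schedule and couple components without knowing their internals.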

  12. Climate-related uncertainties in projections of the twenty-first century terrestrial carbon budget: off-line model experiments using IPCC greenhouse-gas scenarios and AOGCM climate projections

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Akihiko [Japan Agency for Marine-Earth Science and Technology, Frontier Research Center for Global Change, Yokohama (Japan)

    2005-04-01

    A terrestrial ecosystem model (Sim-CYCLE) was driven by multiple climate projections to investigate uncertainties in predicting the interactions between global environmental change and the terrestrial carbon cycle. Sim-CYCLE has a spatial resolution of 0.5°, and mechanistically evaluates photosynthetic and respiratory CO{sub 2} exchange. Six scenarios for atmospheric-CO{sub 2} concentrations in the twenty-first century, proposed by the Intergovernmental Panel on Climate Change, were considered. For each scenario, climate projections by a coupled atmosphere-ocean general circulation model (AOGCM) were used to assess the uncertainty due to socio-economic predictions. Under a single CO{sub 2} scenario, climate projections with seven AOGCMs were used to investigate the uncertainty stemming from uncertainty in the climate simulations. Increases in global photosynthesis and carbon storage differed considerably among scenarios, ranging from 23 to 37% and from 24 to 81 Pg C, respectively. Among the AOGCM projections, increases ranged from 26 to 33% and from 48 to 289 Pg C, respectively. There were regional heterogeneities in both climatic change and carbon budget response, and different carbon-cycle components often responded differently to a given environmental change. Photosynthetic CO{sub 2} fixation was more sensitive to atmospheric CO{sub 2}, whereas soil carbon storage was more sensitive to temperature. Consequently, uncertainties in the CO{sub 2} scenarios and climatic projections may create additional uncertainties in projecting atmospheric-CO{sub 2} concentrations and climates through the interactive feedbacks between the atmosphere and the terrestrial ecosystem. (orig.)

  13. Decision Support Tool and Simulation Testbed for Airborne Spacing and Merging in Super Dense Operations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation in this effort is the development of a decision support tool and simulation testbed for Airborne Spacing and Merging (ASM). We focus on concepts...

  14. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  15. Evaluation of the effects of solar radiation on glass. [space environment simulation

    Science.gov (United States)

    Firestone, R. F.; Harada, Y.

    1979-01-01

    The degradation of glass used on space structures due to electromagnetic and particulate radiation in a space environment was evaluated. The space environment was defined and a simulated space exposure apparatus was constructed. Four optical materials were exposed to simulated solar and particulate radiation in a space environment. Sapphire and fused silica experienced little change in transmittance, while optical crown glass and ultra low expansion glass darkened appreciably. Specimen selection and preparation, exposure conditions, and the effect of simulated exposure are discussed. A selective bibliography of the effect of radiation on glass is included.

  16. Bridging the climate-induced water gap in the twenty-first century: adaptation support based on water supply, demand, adaptation and financing.

    Science.gov (United States)

    Straatsma, Menno; Droogers, Peter; Brandsma, Jaïrus; Buytaert, Wouter; Karssenberg, Derek; Van Beek, Rens; Wada, Yoshihide; Sutanudjaja, Edwin; Vitolo, Claudia; Schmitz, Oliver; Meijer, Karen; Van Aalst, Maaike; Bierkens, Marc

    2014-05-01

    Water scarcity affects large parts of the world. Over the course of the twenty-first century, water demand is likely to increase due to population growth, associated food production, and increased economic activity, while water supply is projected to decrease in many regions due to climate change. Despite recent studies that analyze the effect of climate change on water scarcity, e.g. using climate projections under the representative concentration pathways (RCPs) of the fifth assessment report of the IPCC (AR5), decision support for closing the water gap between now and 2100 does not exist at a meaningful scale and with global coverage. In this study, we aimed (i) to assess the joint impact of climatic and socio-economic change on water scarcity, (ii) to integrate impact and potential adaptation in one workflow, (iii) to prioritize adaptation options to counteract water scarcity based on their financial, regional socio-economic and environmental implications, and (iv) to deliver all this information in an integrated, user-friendly, web-based service. To combine global coverage with local relevance, we aggregated all results over the 1604 water provinces (food-producing units) delineated in this study, which are five times smaller than previously used food-producing units. Water supply was computed using the PCR-GLOBWB hydrological and water resources model, parameterized at 5 arcminutes for the whole globe, excluding Antarctica and Greenland. We ran PCR-GLOBWB with daily forcing derived from five different GCMs from CMIP5 (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M), bias-corrected against observation-based WATCH data for 1960-1999. Each model was run for all four RCPs (RCP 2.6, 4.5, 6.0, and 8.5), producing an ensemble of 20 future projections. The blue-water supply was aggregated per month and per water province. Industrial, domestic and irrigation water demands were computed for a limited number of
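The ensemble bookkeeping in this study design (five GCMs crossed with four RCPs) can be made concrete with a few lines; the supply numbers below are invented placeholders, not study results.

```python
from itertools import product

# Five bias-corrected GCMs x four RCPs give the 20 future projections the
# abstract describes; per water province one can then summarize the ensemble
# range of projected blue-water supply.
gcms = ["GFDL-ESM2M", "HadGEM2-ES", "IPSL-CM5A-LR", "MIROC-ESM-CHEM", "NorESM1-M"]
rcps = ["RCP2.6", "RCP4.5", "RCP6.0", "RCP8.5"]

ensemble = list(product(gcms, rcps))
print(len(ensemble), "projections")      # 5 x 4 = 20

# Hypothetical monthly blue-water supply per run for one water province [km^3]:
supply = {run: 1.0 + 0.01 * i for i, run in enumerate(ensemble)}
lo, hi = min(supply.values()), max(supply.values())
print(f"ensemble range for this province: {lo:.2f} - {hi:.2f} km^3/month")
```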

  17. Twenty-First Century Educational Theory and the Challenges of Modern Education: Appealing to the Heritage of the General Teaching Theory of the Secondary Educational Curriculum and the Learning Process

    Science.gov (United States)

    Klarin, Mikhail V.

    2016-01-01

    The article presents an analysis of educational theory in light of the challenges confronting education in the twenty-first century. The author examines how our ideas about the methods for managing the transmission of culture, the subject of education, and the consequences of these changes for the theory of education have changed. The author…

  18. Regional variation of carbonaceous aerosols from space and simulations

    Science.gov (United States)

    Mukai, Sonoyo; Sano, Itaru; Nakata, Makiko; Kokhanovsky, Alexander

    2017-04-01

    effect on carbonaceous aerosols. The selected data observed by ADEOS-2/GLI and POLDER in 2003 are then treated using the Vector-form Method of Successive Order of Scattering (VMSOS) for radiative transfer simulations in a semi-infinite atmosphere [2]. Finally, the retrieved optical properties of the carbonaceous aerosols are compared with the numerical model simulations of SPRINTARS. Despite the limited number of case studies, it is shown that NUV-channel data are effective for retrieving carbonaceous aerosol properties. We therefore have to address this issue not only for the detection of biomass-burning plumes but also for the retrieval itself. In that case, synthetic analysis based on multi-channel and/or polarization measurements becomes practical, and the proposed procedure and results are available for a feasibility study of coming space missions. [1] Sano, I., Y. Okada, M. Mukai and S. Mukai, "Retrieval algorithm based on combined use of POLDER and GLI data for biomass aerosols," J. RSSJ, vol. 29, no. 1, pp. 54-59, doi:10.11440/rssj.29.54, 2009. [2] Mukai, S., M. Nakata, M. Yasumoto, I. Sano and A. Kokhanovsky, "A study of aerosol pollution episode due to agriculture biomass burning in the east-central China using satellite data," Front. Environ. Sci., vol. 3:57, doi: 10.3389/fenvs.2015.00057, 2015.

  19. Simulating Emerging Space Industries with Agent-Based Modeling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Vision for Space Exploration (VSE) calls for encouraging commercial participation as a top-level objective. Given current and future commercial activities, how...

  20. Efficient conformational space exploration in ab initio protein folding simulation.

    Science.gov (United States)

    Ullah, Ahammed; Ahmed, Nasif; Pappu, Subrata Dey; Shatabda, Swakkhar; Ullah, A Z M Dayem; Rahman, M Sohel

    2015-08-01

    Ab initio protein folding simulation largely depends on knowledge-based energy functions that are derived from known protein structures using statistical methods. These knowledge-based energy functions provide us with a good approximation of real protein energetics. However, these energy functions are not very informative for search algorithms and fail to distinguish the types of amino acid interactions that contribute largely to the energy function from those that do not. As a result, search algorithms frequently get trapped into the local minima. On the other hand, the hydrophobic-polar (HP) model considers hydrophobic interactions only. The simplified nature of HP energy function makes it limited only to a low-resolution model. In this paper, we present a strategy to derive a non-uniform scaled version of the real 20×20 pairwise energy function. The non-uniform scaling helps tackle the difficulty faced by a real energy function, whereas the integration of 20×20 pairwise information overcomes the limitations faced by the HP energy function. Here, we have applied a derived energy function with a genetic algorithm on discrete lattices. On a standard set of benchmark protein sequences, our approach significantly outperforms the state-of-the-art methods for similar models. Our approach has been able to explore regions of the conformational space which all the previous methods have failed to explore. Effectiveness of the derived energy function is presented by showing qualitative differences and similarities of the sampled structures to the native structures. Number of objective function evaluation in a single run of the algorithm is used as a comparison metric to demonstrate efficiency.
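The kind of pairwise contact energy the paper scales can be illustrated on a 2D lattice. The sketch below evaluates a conformation against an arbitrary pairwise matrix (here a 2x2 H/P submatrix with invented values); the HP model is the special case where only the H-H entry is nonzero. It is a generic illustration, not the authors' derived 20x20 function or their genetic algorithm.

```python
# Evaluate a lattice conformation with a pairwise contact energy function.
def contact_energy(sequence, coords, pair_energy):
    """sequence: residue letters; coords: one lattice (x, y) per residue;
    energy = sum of pair_energy over non-bonded residues on adjacent sites."""
    assert len(sequence) == len(coords) and len(set(coords)) == len(coords)
    total = 0.0
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):           # skip chain neighbours
            (xi, yi), (xj, yj) = coords[i], coords[j]
            if abs(xi - xj) + abs(yi - yj) == 1:      # lattice contact
                total += pair_energy[(sequence[i], sequence[j])]
    return total

# HP special case: only the H-H entry contributes.
hp = {("H", "H"): -1.0, ("H", "P"): 0.0, ("P", "H"): 0.0, ("P", "P"): 0.0}
# A 4-residue U-shaped fold bringing the two H termini into contact:
conf = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(contact_energy("HPPH", conf, hp))   # one H-H contact -> -1.0
```

Replacing `hp` with a full 20x20 matrix, and scaling its entries non-uniformly, is the kind of energy-function reshaping the paper describes; the evaluation loop stays the same.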

  1. Desert Cyanobacteria under simulated space and Martian conditions

    Science.gov (United States)

    Billi, D.; Ghelardini, P.; Onofri, S.; Cockell, C. S.; Rabbow, E.; Horneck, G.

    2008-09-01

    The environment in space and on planets such as Mars can be lethal to living organisms, and high levels of tolerance to desiccation, cold and radiation are needed for survival: rock-inhabiting cyanobacteria belonging to the genus Chroococcidiopsis can fulfil these requirements [1]. These cyanobacteria constantly appear in the most extreme and dry habitats on Earth, including the McMurdo Dry Valleys (Antarctica) and the Atacama Desert (Chile), which are considered the closest terrestrial analogs of two Martian environmental extremes: cold and aridity. In their natural environment, these cyanobacteria occupy the last refuges for life inside porous rocks or at stone-soil interfaces, where they survive in a dry, dormant state for prolonged periods. How desert strains of Chroococcidiopsis can dry without dying is only partially understood, even though experimental evidence supports the existence of an interplay between mechanisms that avoid (or limit) DNA damage and mechanisms that repair it: i) desert strains of Chroococcidiopsis mend genome fragmentation induced by ionizing radiation [2]; ii) desiccation survivors protect their genome from complete fragmentation; iii) in the dry state they survive an unattenuated Martian UV flux better than Bacillus subtilis spores [3], and even though they die following atmospheric entry after having orbited the Earth for 16 days [4], they survive simulated shock pressures of up to 10 GPa [5]. Recently, additional experiments were carried out at the German Aerospace Center (DLR) in Cologne (Germany) to identify suitable biomarkers for investigating the survival of Chroococcidiopsis cells present in lichen-dominated communities, in view of their direct, long-term space exposure on the International Space Station (ISS) in the framework of the LIchens and Fungi Experiments (LIFE, EXPOSE-EuTEF, ESA). Multilayers of dried cells of strains CCMEE 134 (Beacon Valley, Antarctica) and CCMEE 123 (coastal desert, Chile), shielded by

  2. Development of automation and robotics for space via computer graphic simulation methods

    Science.gov (United States)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to support automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including a simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
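The kinematics and inverse-kinematics layer such a simulator generates can be illustrated for the simplest case, a planar two-link arm. This is a generic textbook sketch, not the cited system; the link lengths and target point are arbitrary.

```python
import math

# Forward and closed-form inverse kinematics of a planar two-link arm
# with link lengths l1 and l2.
def forward(l1, l2, t1, t2):
    """End-effector position for joint angles t1 (shoulder) and t2 (elbow)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def inverse(l1, l2, x, y):
    """One (elbow) solution of the inverse kinematics for a reachable (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))          # clamp for numerical safety
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

t1, t2 = inverse(1.0, 0.8, 1.2, 0.5)
print(forward(1.0, 0.8, t1, t2))   # recovers (approximately) the target (1.2, 0.5)
```

Round-tripping a target through `inverse` and `forward` is the standard sanity check a kinematics simulator performs before layering dynamics and control on top.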

  3. Investigating Patterns for the Process-Oriented Modelling and Simulation of Space in Complex Systems

    OpenAIRE

    Sampson, Adam T.; Welch, Peter H.; Warren, Douglas N.; Andrews, Paul S.; Bjørndalen, John Markus; Stepney, Susan; Timmis, Jon

    2008-01-01

    Complex systems modelling and simulation is becoming increasingly important to numerous disciplines. The CoSMoS project aims to produce a unified infrastructure for modelling and simulating all sorts of complex systems, making use of design patterns and the process-oriented programming model. We provide a description of CoSMoS and present a case study into the modelling of space in complex systems. We describe how two models - absolute geometric space and relational network space - can be cap...

  4. Global Cropland Area Database (GCAD) derived from Remote Sensing in Support of Food Security in the Twenty-first Century: Current Achievements and Future Possibilities

    Science.gov (United States)

    Teluguntla, Pardhasaradhi G.; Thenkabail, Prasad S.; Xiong, Jun N.; Gumma, Murali Krishna; Giri, Chandra; Milesi, Cristina; Ozdogan, Mutlu; Congalton, Russ; Tilton, James; Sankey, Temuulen Tsagaan; Massey, Richard; Phalke, Aparna; Yadav, Kamini

    2015-01-01

    to biofuels (Bindraban et al., 2009), limited water resources for irrigation expansion (Turral et al., 2009), limits on agricultural intensification, loss of croplands to urbanization (Khan and Hanjra, 2008), increasing meat consumption (and associated demands on land and water) (Vinnari and Tapio, 2009), environmental infeasibility of cropland expansion (Gordon et al., 2009), and a changing climate have all put pressure on our continued ability to sustain global food security in the twenty-first century. So how does the world continue to meet its food and nutrition needs? Solutions may come from biotechnology and precision farming; however, developments in these fields are not currently moving at rates that will ensure global food security over the next few decades. Further, careful consideration of the possible harmful effects of biotechnology is needed. We should not find ourselves looking back 30-50 years from now the way we now look back at the many mistakes made during the green revolution, when the focus was solely on getting more yield per unit area. Little thought was given to the serious damage done to our natural environments, water resources, and human health by factors such as the uncontrolled use of herbicides, pesticides and nutrients, drastic groundwater mining, and the salinization of fertile soils through over-irrigation. Currently there is talk of a "second green revolution" or even an "evergreen revolution", but clear ideas on what these terms actually mean are still debated and evolving. One of the biggest issues not given adequate focus is the use of large quantities of water for food production. Indeed, an overwhelming proportion (60-90%) of all human water use in India goes to producing food (Falkenmark & Rockström, 2006). But such intensive water use for food production is no longer tenable due to increasing pressure from water-use alternatives such as increasing urbanization

  5. Technology Assessment: Democracy’s Crucible, the Future of Science and Technology, and Implications for Our Defense in the Twenty-first Century

    Science.gov (United States)

    2010-01-01

    nanotech, and hyperspace, for example. In each of these exciting domains, the green flag of welcome progress continues to fly proudly, yet there...criminal purposes are dimly understood and easily dismissed as near science fiction; however, it is much less clear in the cybertech world, the nanotech... foods worked their way into the American diet almost clandestinely and were gradually accepted; not so in Europe. Little serious thought these days is

  6. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    Science.gov (United States)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  7. A Path Space Extension for Robust Light Transport Simulation

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Pantaleoni, Jacopo; Jensen, Henrik Wann

    2012-01-01

    We present a new sampling space for light transport paths that makes it possible to describe Monte Carlo path integration and photon density estimation in the same framework. A key contribution of our paper is the introduction of vertex perturbations, which extends the space of paths with loosely...... coupled connections. The new framework enables the computation of path probabilities in the same space under the same measure, which allows us to use multiple importance sampling to combine Monte Carlo path integration and photon density estimation. The resulting algorithm, unified path sampling, can...
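
    Unified path sampling builds on multiple importance sampling once both estimators are expressed in the same space under the same measure. A minimal, generic sketch of the balance heuristic in Python follows (this is not the paper's algorithm; the function names and the one-sample-per-strategy scheme are illustrative):

```python
import math
import random

def balance_weight(p_this, p_other):
    """Balance-heuristic weight for a sample drawn from one strategy."""
    return p_this / (p_this + p_other)

def mis_estimate(f, sample_a, pdf_a, sample_b, pdf_b, n=20000):
    """One-sample-per-strategy MIS estimate of the integral of f.

    Each strategy's contribution f(x)/pdf(x) is down-weighted by the
    balance heuristic, so regions covered well by the other strategy
    do not blow up the variance."""
    total = 0.0
    for _ in range(n):
        xa = sample_a()
        total += balance_weight(pdf_a(xa), pdf_b(xa)) * f(xa) / pdf_a(xa)
        xb = sample_b()
        pb = pdf_b(xb)
        if pb > 0.0:  # guard against a zero-density draw
            total += balance_weight(pb, pdf_a(xb)) * f(xb) / pb
    return total / n
```

    For example, integrating f(x) = x² on [0, 1] with a uniform strategy (pdf 1) and a linear strategy (pdf 2x, sampled via the square root of a uniform draw) yields an estimate close to the exact value 1/3.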

  8. NUMERICAL SIMULATION OF CELLULAR/DENDRITIC PRIMARY SPACING

    Institute of Scientific and Technical Information of China (English)

    W.Q.Zhang; L.Xiao

    2004-01-01

    A numerical model has been established to calculate the primary spacing of cellular or dendritic structure with fluid flow considered. The computed results show that the primary spacing depends on the growing velocity, the temperature gradient on the interface, and fluid flow. There is a critical growing velocity for the cell-dendrite transition, which has a relationship with the temperature gradient: Rcr = (3-4)×10⁻⁹ GT. Fluid flow leads to an increase of the primary spacing for dendritic growth but a decrease for cellular growth, resulting in an instability on the interface.
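
    The quoted transition criterion can be turned into a simple classifier. A sketch in Python, assuming SI units (GT in K/m, velocity in m/s) and the standard solidification-theory convention that cells form below the critical velocity band and dendrites above it; neither assumption is stated in the abstract:

```python
def critical_velocity_range(G_T):
    """Critical growth-velocity band for the cell-dendrite transition,
    Rcr = (3-4)e-9 * G_T, per the relation quoted in the abstract.
    Units (G_T in K/m, Rcr in m/s) are an assumption here."""
    return 3e-9 * G_T, 4e-9 * G_T

def morphology(growth_velocity, G_T):
    """Classify the interface morphology relative to the critical band,
    assuming cellular below and dendritic above it."""
    lo, hi = critical_velocity_range(G_T)
    if growth_velocity < lo:
        return "cellular"
    if growth_velocity > hi:
        return "dendritic"
    return "transition"
```

    For a temperature gradient of 10⁴ K/m the band is 3-4×10⁻⁵ m/s, so a growth velocity of 10⁻⁵ m/s classifies as cellular and 5×10⁻⁵ m/s as dendritic.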

  9. Simulating Nonlinear Dynamics of Deployable Space Structures Project

    Data.gov (United States)

    National Aeronautics and Space Administration — To support NASA's vital interest in developing much larger solar array structures over the next 20 years, MotionPort LLC's Phase I SBIR project will strengthen...

  10. Issues in visual support to real-time space system simulation solved in the Systems Engineering Simulator

    Science.gov (United States)

    Yuen, Vincent K.

    1989-01-01

    The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation, a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible (windows and CCTVs) with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.

  11. Simulations and Tests of Prototype Antenna System for Low Frequency Radio Experiment (LORE) Space Payload for Space Weather Observations

    Science.gov (United States)

    Pethe, Kaiwalya; Galande, Shridhar; Jamadar, Sachin; Mahajan, S. P.; Patil, R. A.; Joshi, B. C.; Manoharan, P. K.; Roy, Jayashree; Kate, G.

    2016-03-01

    Low frequency Radio Experiment (LORE) is a proposed space payload for space weather observations, operating between a few kHz and 30 MHz. This paper presents the preliminary design and practical implementation of the LORE antenna system, which consists of three mutually orthogonal monopoles. Detailed computational electromagnetic simulations, carried out to study the performance of the antenna system, are presented, followed by laboratory tests of the antennas as well as radiation tests on a long-range test range designed for this purpose. These tests form the first phase of the design and implementation of the full LORE prototype later in the year.

  12. High-performing simulations of the space radiation environment for the International Space Station and Apollo Missions

    Science.gov (United States)

    Lund, Matthew Lawrence

    The space radiation environment is a significant challenge for future manned and unmanned space travel. Future missions will rely increasingly on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Dosimetry analysis for deep space missions is therefore poorly supported by currently available data, and there is a need to develop dose-predicting models for extended deep space missions. GEANT4, a Monte Carlo toolkit written in C++, provides a powerful framework for simulating radiation transport in arbitrary media, including spacecraft. The newest version of GEANT4 supports multithreading and MPI, enabling faster distributed processing of simulations on high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry, cutting simulation time to hours instead of weeks without post-simulation processing. Additionally, this thesis introduces a new set of detectors beyond the historically used International Commission on Radiation Units and Measurements (ICRU) spheres for calculating dose distribution, including a thermoluminescent detector (TLD), a tissue-equivalent proportional counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalent based on International Commission on Radiological Protection (ICRP) standards.
The models developed in this thesis predict dose deposition in the International Space Station and during the Apollo missions, showing good agreement with experimental measurements.

  13. A Symplectic Multi-Particle Tracking Model for Self-Consistent Space-Charge Simulation

    CERN Document Server

    Qiang, Ji

    2016-01-01

    Symplectic tracking is important in accelerator beam dynamics simulation. So far, to the best of our knowledge, there is no self-consistent symplectic space-charge tracking model available in the accelerator community. In this paper, we present a two-dimensional and a three-dimensional symplectic multi-particle spectral model for space-charge tracking simulation. This model includes both the effect from external fields and the effect of self-consistent space-charge fields using a split-operator method. Such a model preserves the phase space structure and shows much less numerical emittance growth than the particle-in-cell model in the illustrative examples.
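
    The split-operator idea alternates the exactly symplectic map of the external fields with a momentum kick from the (momentarily frozen) space-charge field. A toy 1D leapfrog sketch in Python, with a linear focusing force and an illustrative linear space-charge term standing in for the self-consistent spectral field solve of the paper (all constants and names here are hypothetical):

```python
def split_operator_step(x, p, dt, k_ext=1.0, k_sc=0.0):
    """One second-order split-operator step: half kick, drift, half kick.

    k_ext models a linear external focusing force; k_sc is a toy linear
    space-charge defocusing term (the actual model solves the space-charge
    fields self-consistently). Each sub-map is symplectic, so the
    composition is symplectic and preserves phase-space structure."""
    def force(q):
        return -k_ext * q + k_sc * q
    p += 0.5 * dt * force(x)  # half kick
    x += dt * p               # drift under the kinetic term
    p += 0.5 * dt * force(x)  # half kick
    return x, p

def track(x, p, dt, n_steps, **kw):
    """Track a single particle for n_steps split-operator steps."""
    for _ in range(n_steps):
        x, p = split_operator_step(x, p, dt, **kw)
    return x, p
```

    With k_sc = 0 this reduces to a leapfrog integrator for a harmonic oscillator, whose energy error stays bounded over long tracking runs, which is the practical payoff of symplecticity that the abstract contrasts with the numerical emittance growth of non-symplectic particle-in-cell tracking.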

  14. Impact of land-use and land-cover changes on CRCM5 climate projections over North America for the twenty-first century

    Science.gov (United States)

    Alexandru, Adelina; Sushama, Laxmi

    2016-08-01

    The aim of this study is to assess the impact of land-use and land-cover change (LULCC) on regional climate projections for North America. To this end, two transient climate change simulations, with and without LULCC but with identical atmospheric forcing, are performed with the 5th generation of the Canadian Regional Climate Model (CRCM5) driven by the CanESM2 model for the (2006-2100) RCP4.5 scenario. For the simulation with LULCC, land-cover datasets are taken from the Global Change Assessment Model representing the RCP4.5 scenario for the period 2006-2100. LULCC in the RCP4.5 scenario points to a significant reduction in cultivated land (e.g. the Canadian Prairies and the Mississippi basin) due to intense afforestation. Results suggest that the biogeophysical effects of LULCC on climate, assessed through differences between the all-forcing (atmospheric and LULCC) run and the atmospheric-forcing run (with constant land cover), are substantial for relevant surface variables. It is shown that the afforestation of cropland leads to warmer regional climates, especially in winter (warming above 1.5 °C), compared with climates resulting from atmospheric forcings alone. The investigation of the processes leading to this response shows a high sensitivity of the results to changes in albedo in response to LULCC. Additional roughness, evaporative cooling and soil water availability also appear to play an important role in regional climate, especially in summer in certain afforested areas (e.g., the southeastern US).

  15. Magnetic null points in kinetic simulations of space plasmas

    OpenAIRE

    Olshevsky, Vyacheslav; Deca, Jan; Divin, Andrey; Peng, Ivy Bo; Markidis, Stefano; Innocenti, Maria Elena; Cazzola, Emanuele; Lapenta, Giovanni

    2015-01-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic Particle-in-Cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind; and a relaxing turbulent configuration with multiple null points. Spiral n...

  16. Monte Carlo Simulation of Argon in Nano-Space

    Institute of Scientific and Technical Information of China (English)

    CHEN Min; YANG Chun; GUO Zeng-Yuan

    2000-01-01

    Monte Carlo simulations are performed to investigate the thermodynamic properties of argon confined in nano-scale cubes constructed of graphite walls. A remarkable depression of the system pressures is observed. The simulations reveal that the length-scale of the cube, the magnitude of the interaction between the fluid and the graphite wall, and the density of the fluid all have appreciable effects on the thermodynamic property shifts of the fluid.

  17. Quasi-static Deployment Simulation for Deployable Space Truss Structures

    Institute of Scientific and Technical Information of China (English)

    陈务军; 付功义; 何艳丽; 董石麟

    2004-01-01

    A new method was proposed for quasi-static deployment analysis of deployable space truss structures. The structure is assumed to be a rigid assembly, whose constraints are classified into three categories: rigid member constraints, joint-attached kinematic constraints and boundary constraints. Their geometric constraint equations and derivative matrices are formulated. The basis of the null space and the M-P inverse of the geometric constraint matrix are employed to determine the solution for quasi-static deployment analysis. The influence introduced by higher-order terms of the constraints is evaluated subsequently. The numerical tests show that the new method is efficient.

  18. Postnatal development under conditions of simulated weightlessness and space flight

    Science.gov (United States)

    Walton, K.

    1998-01-01

    The adaptability of the developing nervous system to environmental influences and the mechanisms underlying this plasticity has recently become a subject of interest in space neuroscience. Ground studies on neonatal rats using the tail suspension model of weightlessness have shown that the force of gravity clearly influences the events underlying the postnatal development of motor function. These effects depend on the age of the animal, duration of the perturbation and the motor function studied. A nine-day flight study has shown that a dam and neonates can develop under conditions of space flight. The motor function of the flight animals after landing was consistent with that seen in the tail suspension studies, being marked by limb joint extension. However, there were expected differences due to: (1) the unloading of the vestibular system in flight, which did not occur in the ground-based experiments; (2) differences between flight and suspension durations; and (3) the inability to evaluate motor function during the flight. The next step is to conduct experiments in space with the flexibility and rigor that is now limited to ground studies: an opportunity offered by the International Space Station. Copyright 1998 Published by Elsevier Science B.V.

  19. Geant4 electromagnetic physics updates for space radiation effects simulation

    Science.gov (United States)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John; Karamitos, Mathiew

    The Geant4 toolkit is used in many applications including space science studies. The new Geant4 version 10.0 released in December 2013 includes a major revision of the toolkit and offers multi-threaded mode for event level parallelism. At the same time, Geant4 electromagnetic and hadronic physics sub-libraries have been significantly updated. In order to validate the new and updated models Geant4 verification tests and benchmarks were extended. Part of these developments was sponsored by the European Space Agency in the context of research aimed at modelling radiation biological end effects. In this work, we present an overview of results of several benchmarks for electromagnetic physics models relevant to space science. For electromagnetic physics, recently Compton scattering, photoelectric effect, and Rayleigh scattering models have been improved and extended down to lower energies. Models of ionization and fluctuations have also been improved; special micro-dosimetry models for Silicon and liquid water were introduced; the main multiple scattering model was consolidated; and the atomic de-excitation module has been made available to all models. As a result, Geant4 predictions for space radiation effects obtained with different Physics Lists are in better agreement with the benchmark data than previous Geant4 versions. Here we present results of electromagnetic tests and models comparison in the energy interval 10 eV - 10 MeV.

  20. Reliability and maintenance simulation of the Hubble Space Telescope

    Science.gov (United States)

    Pizzano, F.

    1986-01-01

    An analytical approach is presented which was developed and implemented at MSFC specifically for the Space Telescope Program to provide comparisons of critical item failures, system downstates, on-orbit servicing versus return for ground maintenance, overall system downtime, and to obtain a measure of expected uptime for science functions.

  1. Private ground infrastructures for space exploration missions simulations

    Science.gov (United States)

    Souchier, Alain

    2010-06-01

    The Mars Society, a private non-profit organisation devoted to promoting the exploration of the red planet, decided to implement simulated Mars habitats in two locations on Earth: in northern Canada on the rim of a meteoritic crater (2000), and in a Utah desert in the US, the location of a past Jurassic sea (2001). These habitats have been built with large similarities to the habitats actually planned for the first Mars exploration missions. Participation is open to everybody, either proposing experiments or wishing only to take part as a crew member. Participants come from different organizations: the Mars Society, universities, and experimenters working with NASA or ESA. The general philosophy of the work conducted is not to do innovative scientific work in the field but to learn how scientific work is affected or modified by the simulation conditions. Outside activities are conducted with simulated spacesuits limiting the experimenter's abilities. Technology and procedure experiments are also conducted, as well as experiments on crew psychology and behaviour.

  2. Winter weather regimes over the Mediterranean region: their role for the regional climate and projected changes in the twenty-first century

    Science.gov (United States)

    Rojas, M.; Li, L. Z.; Kanakidou, M.; Hatzianastassiou, N.; Seze, G.; Le Treut, H.

    2013-08-01

    The wintertime weather variability over the Mediterranean is studied in relation to the prevailing weather regimes (WRs) over the region. Using daily geopotential heights at 700 hPa from the ECMWF ERA40 Reanalysis Project and cluster analysis, four WRs are identified, in increasing order of frequency of occurrence: cyclonic (22.0 %), zonal (24.8 %), meridional (25.2 %) and anticyclonic (28.0 %). The surface climate, cloud distribution and radiation patterns associated with these winter WRs are deduced from satellite (ISCCP) and other observational (E-OBS, ERA40) datasets. The LMDz atmosphere-ocean regional climate model successfully simulates the same four Mediterranean weather regimes and reproduces the associated surface and atmospheric conditions for the present climate (1961-1990). Both the observational and the LMDz-based computations show that the four Mediterranean weather regimes control the region's weather and climate conditions during winter, exhibiting significant differences between them in temperature, precipitation, cloudiness and radiation distributions within the region. Projections (2021-2050) of winter Mediterranean weather and climate are obtained using the LMDz model and analysed in relation to the simulated changes in the four WRs. According to the SRES A1B emission scenario, a significant warming (between 2 and 4 °C) is projected for the region, along with a precipitation decrease of 10-20 % in southern Europe, the Mediterranean Sea and North Africa, against a 10 % precipitation increase in northern European areas. The projected changes in temperature and precipitation in the Mediterranean are explained by the model-predicted changes in the frequency of occurrence as well as in the intra-seasonal variability of the regional weather regimes. The anticyclonic configuration is projected to become more recurrent, contributing to the decreased precipitation over most of the basin, while the cyclonic and zonal ones become more

  3. CO₂ and non-CO₂ radiative forcings in climate projections for twenty-first century mitigation scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Strassmann, Kuno M. [University of Bern, Climate and Environmental Physics, Bern (Switzerland); Plattner, G.K. [ETH Zuerich, Institute of Biogeochemistry and Pollutant Dynamics, Zurich (Switzerland); Joos, F. [University of Bern, Climate and Environmental Physics, Bern (Switzerland); University of Bern, Oeschger Centre for Climate Change Research, Bern (Switzerland)

    2009-11-15

    Climate is simulated for reference and mitigation emissions scenarios from Integrated Assessment Models using the Bern2.5CC carbon cycle-climate model. Mitigation options encompass all major radiative forcing agents. Temperature change is attributed to forcings using an impulse-response substitute of Bern2.5CC. The contribution of CO₂ to global warming increases over the century in all scenarios. Non-CO₂ mitigation measures add to the abatement of global warming. The share of mitigation carried by CO₂, however, increases when radiative forcing targets are lowered, and increases after 2000 in all mitigation scenarios. Thus, non-CO₂ mitigation is limited and net CO₂ emissions must eventually subside. Mitigation rapidly reduces the sulfate aerosol loading and associated cooling, partly masking greenhouse gas mitigation over the coming decades. A profound effect of mitigation on CO₂ concentration, radiative forcing, temperatures and the rate of climate change emerges in the second half of the century. (orig.)

  4. A SLAM II simulation model for analyzing space station mission processing requirements

    Science.gov (United States)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM 2 simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives and an 8087 coprocessor chip. Using a time phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle) and ground facility databases, estimates for ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.

  5. Language Simulations: The Blending Space for Writing and Critical Thinking

    Science.gov (United States)

    Kovalik, Doina L.; Kovalik, Ludovic M.

    2007-01-01

    This article describes a language simulation involving six distinct phases: an in-class quick response, a card game, individual research, a classroom debate, a debriefing session, and an argumentative essay. An analysis of student artifacts--quick-response writings and final essays, respectively, both addressing the definition of liberty in a…

  6. Flexible Space-Filling Designs for Complex System Simulations

    Science.gov (United States)

    2013-06-01

    Systems Engineering. Approved by: Peter Denning, Chair, Department of Computer Science... Kirby, 2001; and Baker, Mavris, & Schrage, 2002). These meta-models approximate the underlying dependencies of the simulation output responses to the...Journal of Graphical and Statistics, 12, 512–530. Kirby, M. R. (2001). A methodology for technology identification, evaluation, and selection in

  7. Flight Simulator: Use of SpaceGraph Display in an Instructor/Operator Station. Final Report.

    Science.gov (United States)

    Sher, Lawrence D.

    This report describes SpaceGraph, a new computer-driven display technology capable of showing space-filling images, i.e., true three-dimensional displays, and discusses the advantages of this technology over flat displays for use with the instructor/operator station (IOS) of a flight simulator. Ideas resulting from 17 brainstorming sessions with…

  8. The role of simulation prior to manufacturing in space

    Science.gov (United States)

    Shaw, M. C.

    1983-01-01

    Prior to manufacturing in space, it is useful to conduct analog experiments where possible so that problems that are apt to be encountered may be identified and planning toward their solution considered. An example is presented involving containerless casting in a near zero gravitational field using paraffin wax as the material cast surrounded by a heated fluid immiscible with the wax that renders it neutrally buoyant.

  9. Experimental Studies of NAK in a Simulated Space Environment

    Science.gov (United States)

    Gibson, M. A.; Sanzi, J.; Ljubanovic, D.

    Space fission power systems are being developed by the National Aeronautics and Space Administration (NASA) and the Department of Energy (DOE), with a short-term goal of testing a full-scale, non-nuclear Technology Demonstration Unit (TDU) at NASA's Glenn Research Center. Due to the geometric constraints, mass restrictions, and fairly high temperatures associated with space reactors, liquid metals are typically used as the primary coolant. A eutectic mixture of sodium (22 percent) and potassium (78 percent), or NaK, has been chosen as the coolant for the TDU, with a total system capacity of approximately 55 L. NaK, like all alkali metals, is very reactive and warrants certain safety considerations. To adequately examine the risks to personnel, the facility, and the test hardware during a potential NaK leak in the large-scale TDU test, a small-scale experiment was performed in which NaK was released in a thermal vacuum chamber under controlled conditions. The study focused on detecting NaK leaks in the vacuum environment as well as the molecular flow of the NaK vapor. This paper reflects the work completed during the NaK experiment and provides results and discussion of the findings.

  10. Distributed communication and psychosocial performance in simulated space dwelling groups

    Science.gov (United States)

    Hienz, R. D.; Brady, J. V.; Hursh, S. R.; Ragusa, L. C.; Rouse, C. O.; Gasior, E. D.

    2005-05-01

    The present report describes the development and application of a distributed interactive multi-person simulation in a computer-generated planetary environment, as an experimental test bed for modeling the human performance effects of variations in the available communication modes and in the stress and incentive conditions underlying the completion of mission goals. The results demonstrated a high degree of interchangeability between communication modes (audio, text) when one mode was not available. Additionally, adding time-pressure stress to task completion reduced performance effectiveness, and these reductions were ameliorated by introducing positive incentives contingent upon improved performance. The results confirmed that cooperative and productive psychosocial interactions can be maintained between individually isolated and dispersed members of simulated spaceflight crews communicating and problem-solving effectively over extended time intervals without the benefit of one another's physical presence.

  11. Transiting Exoplanet Simulations with the James Webb Space Telescope

    CERN Document Server

    Batalha, Natasha; Lunine, Jonathan; Clampin, Mark; Lindler, Don

    2015-01-01

    In this white paper, we assess the potential for JWST to characterize the atmospheres of super-Earth exoplanets by simulating a range of transit spectra for different masses and temperatures. Our results are based on a JWST simulator tuned to the expected performance of the workhorse spectroscopic instrument NIRSpec, and on the latest exoplanet transit models by Howe & Burrows (2012). This study is especially timely since the observing modes for the science instruments on JWST are finalized (Clampin 2010) and because NASA has selected the TESS mission as an upcoming Explorer. TESS is expected to identify more than 1000 transiting exoplanet candidates, including a sample of about 100 nearby (<50 pc) super-Earths (Ricker et al. 2010).

  12. Plants science that supports the hope in the twenty-first century. We control the plant freely or not; 21 seiki no kibo wo sasaeru shokubutsu kagaku. Wareware wa shokubutsu wo jizai ni ayatsurerunoka?

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Shigeo

    1999-10-01

    When we envision a state in which people in the twenty-first century live comfortably, a basic requirement is that they enjoy secure livelihoods and rich food practices in a natural environment that affords peace of mind, while the conveniences of civilization are maintained by clean energy. One of the ultimate goals of scientific research for guaranteeing such a comfortable life should be the establishment of technology that efficiently enhances the functions of higher plants. The prosperity of future mankind also depends on the utilization of vegetation in every respect, such as food security, maintenance of the natural environment, and the solution of energy problems. (NEDO)

  13. Performing the comic side of bodily abjection: A study of twenty-first century female stand-up comedy in a multi-cultural and multi-racial Britain

    OpenAIRE

    Blunden, Pamela

    2011-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This thesis is a socio-cultural study of the development of female stand-up comedy in the first decade of the twenty-first century within a multi-racial and multi-cultural Britain. It also engages with the theory and practice of performance and asks the question: ‘In what ways can it be said that female stand-up comics perform the comic side of bodily abjection?’ This question is applied to t...

  14. Behavior of ionic conducting IPN actuators in simulated space conditions

    Science.gov (United States)

    Fannir, Adelyne; Plesse, Cédric; Nguyen, Giao T. M.; Laurent, Elisabeth; Cadiergues, Laurent; Vidal, Frédéric

    2016-04-01

    The presentation focuses on the performance of flexible all-polymer electroactive actuators under space-hazardous environmental factors in laboratory conditions. These bending actuators are based on high-molecular-weight nitrile butadiene rubber (NBR), a poly(ethylene oxide) (PEO) derivative and poly(3,4-ethylenedioxythiophene) (PEDOT). The electroactive PEDOT is embedded within the PEO/NBR membrane, which is subsequently swollen with an ionic liquid as electrolyte. Actuators have been submitted to thermal cycling tests between -25 and 60°C under vacuum (2.4×10⁻⁸ mbar) and to ionizing gamma radiation at a level of 210 rad/h for 100 h. Actuators have been characterized before and after ageing under these space environmental conditions. In particular, the viscoelastic properties and mechanical resistance of the materials have been determined by dynamic mechanical analysis and tensile tests. The evolution of actuation properties such as strain and output force has been characterized as well. Long-term vacuum exposure, freezing temperatures and gamma radiation do not significantly affect the thermomechanical properties of the conducting IPN actuators. Only a slight decrease in actuation performance has been observed.

  15. A General Simulator Using State Estimation for a Space Tug Navigation System. [computerized simulation, orbital position estimation and flight mechanics

    Science.gov (United States)

    Boland, J. S., III

    1975-01-01

    A general simulation program (GSP) is presented, involving nonlinear state estimation for space vehicle flight navigation systems. A complete explanation of the iterative guidance mode guidance law, and derivations of the dynamics, coordinate frames, and state estimation routines, are given so as to fully clarify the assumptions and approximations involved, so that simulation results can be placed in their proper perspective. A complete set of computer acronyms and their definitions, as well as explanations of the subroutines used in the GSP simulator, are included. To facilitate input/output, a complete set of compatible numbers, with units, is included to aid in data development. Format specifications, output data phrase meanings and purposes, and computer card data input are clearly spelled out. A large number of simulation and analytical studies were used to determine the validity of the simulator itself as well as of various data runs.

  16. Adjoint-based Gradient Estimation Using the Space-time Solutions of Unknown Conservation Law Simulations

    CERN Document Server

    Chen, Han

    2016-01-01

    Many control applications can be formulated as optimization constrained by conservation laws. Such optimization can be efficiently solved by gradient-based methods, where the gradient is obtained through the adjoint method. Traditionally, the adjoint method could not be implemented in "gray-box" conservation law simulations. In gray-box simulations, the analytical and numerical form of the conservation law is unknown, but the space-time solution of relevant flow quantities is available. Without the adjoint gradient, optimization can be challenging for problems with many control variables. However, much information about the gray-box simulation is contained in its space-time solution, which motivates us to estimate the adjoint gradient by leveraging the space-time solution. This article considers a type of gray-box simulation where the flux function is partially unknown. A method is introduced to estimate the adjoint gradient at a cost independent of the number of control variables. The method firs...
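    The cost argument can be seen on a linear toy problem: for a discrete "conservation law" A u = f(c) with objective J(c) = gᵀu(c), a single adjoint solve Aᵀλ = g yields the full gradient dJ/dc = (∂f/∂c)ᵀλ, however many controls there are. The matrices below are random stand-ins, not a real flux model.

```python
import numpy as np

# Toy steady problem A u = f(c) with many controls c. The adjoint method
# gives dJ/dc for ALL controls with one extra linear solve, at a cost
# independent of the number of controls. Matrices are illustrative.
rng = np.random.default_rng(0)
n = 50
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # discretized operator
B = rng.standard_normal((n, n))                      # f(c) = B c
g = rng.standard_normal(n)                           # objective weights
c = rng.standard_normal(n)                           # 50 control variables

u = np.linalg.solve(A, B @ c)        # one forward solve
lam = np.linalg.solve(A.T, g)        # one adjoint solve
grad = B.T @ lam                     # dJ/dc for all 50 controls at once

# Finite-difference check of a single component
eps, i = 1e-6, 7
c_pert = c.copy(); c_pert[i] += eps
fd = (g @ np.linalg.solve(A, B @ c_pert) - g @ u) / eps
```

    A finite-difference gradient would instead need one forward solve per control variable, which is exactly what the adjoint approach avoids.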

  17. Electrical behaviour of a silicone elastomer under simulated space environment

    Science.gov (United States)

    Roggero, A.; Dantras, E.; Paulmier, T.; Tonon, C.; Balcon, N.; Rejsek-Riba, V.; Dagras, S.; Payan, D.

    2015-04-01

    The electrical behavior of a space-used silicone elastomer was characterized using surface potential decay and dynamic dielectric spectroscopy techniques. In both cases, the dielectric manifestation of the glass transition (dipole orientation) and a charge transport phenomenon were observed. An unexpected linear increase of the surface potential with temperature was observed around Tg in thermally stimulated potential decay experiments, due to molecular mobility limiting dipolar orientation on the one hand, and 3D thermal expansion reducing the material's capacitance on the other. At higher temperatures, the charge transport process, believed to be thermally activated electron hopping with an activation energy of about 0.4 eV, was studied with and without the silica and iron oxide fillers present in the commercial material. These fillers were found to play a preponderant role in the low-frequency electrical conductivity of this silicone elastomer, probably through a Maxwell-Wagner-Sillars relaxation phenomenon.
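    The quoted 0.4 eV activation energy implies a strong temperature dependence of the hopping transport. A minimal Arrhenius sketch (with an arbitrary prefactor, so only ratios are meaningful) shows roughly an order-of-magnitude change over 50 K:

```python
import math

# Arrhenius estimate of the thermally activated hopping conductivity using
# the 0.4 eV activation energy quoted above. The prefactor is arbitrary
# (set to 1), so only ratios between temperatures are meaningful here.
K_B = 8.617e-5                 # Boltzmann constant in eV/K
E_A = 0.4                      # activation energy from the abstract, eV

def hopping_factor(T):
    """exp(-Ea / kT): the activated part of sigma(T) = sigma0 * exp(-Ea/kT)."""
    return math.exp(-E_A / (K_B * T))

ratio = hopping_factor(350.0) / hopping_factor(300.0)
# roughly a ninefold increase in the activated factor from 300 K to 350 K
```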

  18. 3D Simulations of Space Charge Effects in Particle Beams

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A

    2002-10-01

    For the first time, it is possible to calculate the complicated three-dimensional proton accelerator structures at the Paul Scherrer Institut (PSI). Under consideration are external and self effects, arising from guiding and space-charge forces. This thesis has as its theme the design, implementation and validation of a tracking program for charged particles in accelerator structures. The work forms part of the discipline of Computational Science and Engineering (CSE), more specifically computational accelerator modelling. The physical model is based on the collisionless Vlasov-Maxwell theory, justified by the low density (~10^9 protons/cm^3) of the beam and of the residual gas. The probability of large-angle scattering between the protons and the residual gas is then sufficiently low, as can be estimated by considering the mean free path and the total distance a particle travels in the accelerator structure. (author)
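    The collisionless justification can be checked with a one-line mean-free-path estimate, λ = 1/(nσ). The density below is the value quoted in the abstract; the cross section and path length are assumed, illustrative numbers.

```python
# Order-of-magnitude check of the collisionless (Vlasov) assumption via the
# mean free path, lambda = 1/(n * sigma). The density is from the abstract;
# the cross section and path length are assumed illustrative values.
n_density = 1e9                        # protons per cm^3 (from the abstract)
sigma = 1e-16                          # assumed scattering cross section, cm^2
mfp_cm = 1.0 / (n_density * sigma)     # mean free path: 1e7 cm = 100 km

path_cm = 1e5                          # assumed ~1 km of total path in the machine
collision_prob = path_cm / mfp_cm      # ~0.01: large-angle scattering is rare
```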

  20. Developing a space network interface simulator: The NTS approach

    Science.gov (United States)

    Hendrzak, Gary E.

    1993-01-01

    This paper describes the approach used to redevelop the Network Control Center (NCC) Test System (NTS), a hardware and software facility designed to make testing of the NCC Data System (NCCDS) software efficient, effective, and as rigorous as possible prior to operational use. The NTS transmits and receives network message traffic in real-time. Data transfer rates and message content are strictly controlled and are identical to that of the operational systems. NTS minimizes the need for costly and time-consuming testing with the actual external entities (e.g., the Hubble Space Telescope (HST) Payload Operations Control Center (POCC) and the White Sands Ground Terminal). Discussed are activities associated with the development of the NTS, lessons learned throughout the project's lifecycle, and resulting productivity and quality increases.

  1. Molecular dynamics simulation of interparticle spacing and many-body effect in gold supracrystals.

    Science.gov (United States)

    Liu, X P; Ni, Y; He, L H

    2016-04-01

    Interparticle spacing in supracrystals is a crucial parameter for photoelectric applications, as it dominates the transport rates between neighboring nanoparticles (NPs). Based on large-scale molecular dynamics simulations, we calculate the interparticle spacing in alkylthiol-stabilized gold supracrystals as a function of the NP size, ligand length and external pressure. The repulsive many-body interactions in the supracrystals are also quantified by comparing the interparticle spacing with that between two individual NPs at equilibrium. Our results are consistent with available experiments, and are expected to aid the precise control of interparticle spacing in supracrystal devices.
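    As a much-simplified stand-in for the ligand-mediated interaction, the equilibrium spacing of a single pair of particles can be found by minimizing a pair potential; here a Lennard-Jones form, whose minimum sits at r = 2^(1/6)σ. The atomistic alkylthiol models in the actual simulations are far richer.

```python
import numpy as np

# Equilibrium spacing of two particles under a Lennard-Jones pair potential,
# a crude illustrative stand-in for the ligand-mediated NP interaction.
def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential in reduced units."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

r = np.linspace(0.9, 2.0, 100_001)
r_eq = r[np.argmin(lj(r))]      # analytic minimum: 2**(1/6) ~= 1.122
```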

  2. Phase space structures in gyrokinetic simulations of fusion plasma turbulence

    Science.gov (United States)

    Ghendrih, Philippe; Norscini, Claudia; Cartier-Michaud, Thomas; Dif-Pradalier, Guilhem; Abiteboul, Jérémie; Dong, Yue; Garbet, Xavier; Gürcan, Ozgür; Hennequin, Pascale; Grandgirard, Virginie; Latu, Guillaume; Morel, Pierre; Sarazin, Yanick; Storelli, Alexandre; Vermare, Laure

    2014-10-01

    Gyrokinetic simulations of fusion plasmas give extensive information in 5D on turbulence and transport. This paper highlights a few of the challenging physics questions addressed by global, flux-driven simulations using experimental inputs from Tore Supra shot TS45511. The electrostatic gyrokinetic code GYSELA is used for these simulations. The 3D structure of avalanches indicates that these structures propagate radially at localised toroidal angles and then expand along the field line at sound speed to form the filaments. Analysing the poloidal mode structure of the potential fluctuations (at a given toroidal location), one finds that the low modes m = 0 and m = 1 exhibit a global structure; the magnitude of the m = 0 mode is much larger than that of the m = 1 mode. The shear layers of the corrugation structures are thus found to be dominated by the m = 0 contribution, comparable to that of the zonal flows. This global mode seems to localise the m = 2 mode but has little effect on the localisation of the higher mode numbers. However, when analysing the pulsation of the latter modes, one finds that all modes exhibit a similar phase velocity, comparable to the local zonal flow velocity. The consequent dispersion-like relation between the mode pulsations and the mode numbers provides a means to measure the zonal flow. Temperature fluctuations and the turbulent heat flux are localised between the corrugation structures. Temperature fluctuations are found to exhibit two scales: small fluctuations that are localised by the corrugation shear layers and appear to bounce back and forth radially, and large fluctuations, also readily observed on the flux, which are associated with the disruption of the corrugations. The radial ballistic velocity of both avalanche events is of the order of 0.5ρ∗c0, where ρ∗ = ρ0/a, a being the tokamak minor radius and ρ0 the characteristic Larmor radius, ρ0 = c0/Ω0. c0 is the reference ion thermal velocity and Ω0 = qiB0/mi the reference cyclotron frequency.

  3. A simulation model for probabilistic analysis of Space Shuttle abort modes

    Science.gov (United States)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
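    A minimal version of such a Monte Carlo event-tree sampler might look as follows. The element failure probabilities and the abort-success probability here are invented for illustration and, like the paper's own runs, are in no way official estimates.

```python
import random

# Hedged sketch of a Monte Carlo event-tree ascent simulation. The element
# failure probabilities and the abort-success probability are invented
# illustrative numbers, NOT official NASA estimates.
P_FAIL = {"external_tank": 0.001, "main_engines": 0.005, "solid_boosters": 0.002}
P_ABORT_SUCCESS = 0.8      # assumed chance an abort mode completes successfully

def one_ascent(rng):
    """Walk the event tree once: each propulsion element may fail in turn."""
    for element, p_fail in P_FAIL.items():
        if rng.random() < p_fail:            # element failure triggers an abort
            return "abort_ok" if rng.random() < P_ABORT_SUCCESS else "loss"
    return "nominal"

rng = random.Random(42)                      # fixed seed for reproducibility
trials = 100_000
counts = {"nominal": 0, "abort_ok": 0, "loss": 0}
for _ in range(trials):
    counts[one_ascent(rng)] += 1
# counts["nominal"]/trials ~ (1-0.001)*(1-0.005)*(1-0.002) ~ 0.992
```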

  4. Twenty-First-Century Aerial Mining

    Science.gov (United States)

    2015-04-01

    blockade (fig. 3).19 It has two parallel inbound and outbound shipping channels, each 1,200 feet wide with a dredged depth averaging 40 feet. East...sufficient excess capability to accept a grinding war of attrition in the island interior. The duration and cost of an operation might well have been...height in the Second World War as part of Operation Starvation against Japan. The value of this low-cost, persistent weapons system has been

  5. A twenty first century approach to inspection

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, R.K. [R. Brooks Associates, Inc., Williamson, NY (United States)

    1998-12-31

    Over the past ten years, visual inspection tooling has changed dramatically. Driven by both the consumer and industrial vision markets, CCD technology is advancing in pace with computer technology. Just ten years ago, cameras were mostly tube type, very expensive, black and white and large in diameter. There were few practical inspection uses. Times have changed. High resolution images are now achievable through a 1 millimeter probe. Very high resolution cameras are available in 6 millimeter diameter. Advancements in robotic, remote positioning now allows one to access previously inaccessible locations. Visual recognition of deposits is now a reality. A recent trip to Mars is testimony to this. Visual inspection is becoming one of the most worthwhile, cost effective technologies for inspections in the world. Current visual technology, and the advancements to come, will save companies billions of dollars.

  6. Resettlement in the twenty-first century

    Directory of Open Access Journals (Sweden)

    Anthony Oliver-Smith

    2014-02-01

    Full Text Available Deficiencies in planning, preparation and implementation of involuntary resettlement and relocation projects have produced far more failures than successes. Indeed, it is questionable whether resettlement as currently practised could be categorised as a form of protection.

  7. Departmentalization and Twenty-First Century Skills

    Science.gov (United States)

    Watts, Toy Coles

    2012-01-01

    The purpose of this study was to investigate the relationship between school organizational style and student outcomes. The research questions that guided this study were, "Is there a difference in mathematical performance of fourth graders who receive departmentalized instruction as compared to fourth grade students who receive…

  8. Twenty-First Century Pathologists' Advocacy.

    Science.gov (United States)

    Allen, Timothy Craig

    2017-07-01

    Pathologists' advocacy plays a central role in the establishment of continuously improving patient care quality and patient safety, and in the maintenance and progress of pathology as a profession. Pathology advocacy's primary goal is the betterment of patient safety and quality medical care; however, payment is a necessary and appropriate component of both, and has a central role in advocacy. Now is the time to become involved in pathology advocacy; the Medicare Access and Children's Health Insurance Program (CHIP) Reauthorization Act of 2015 (MACRA) and the Protecting Access to Medicare Act of 2014 (PAMA) are 2 of the most consequential pieces of legislation impacting the pathology and laboratory industry in the last 20 years. Another current issue of far-reaching impact for pathologists is balance billing, and yet many pathologists have little or no understanding of it. Pathologists at all stages of their careers, and in every professional setting, need to participate. Academic pathologists have a special obligation: if they do not become directly involved in advocacy, they should at least maintain a broad and current understanding of these issues, and of pathologists' need and responsibility to engage actively in advocacy efforts to address them, in order to teach residents the place and value of advocacy as an inseparable and indispensable component of their professional responsibilities.

  9. Using Jupyter Notebooks for Interactive Space Science Simulations

    Science.gov (United States)

    Schmidt, Albrecht

    2016-04-01

    Jupyter Notebooks can be used as an effective means to communicate scientific ideas through Web-based visualisations and, at the same time, give a user more than a pre-defined set of options to manipulate the visualisations. To some degree, even computations can be done without much knowledge of the underlying data structures and infrastructure, to discover novel aspects of the data or tailor views to users' needs. Here, we show how to combine Jupyter Notebooks with other open-source tools to provide rich and interactive views on space data, especially the visualisation of spacecraft operations. Topics covered are orbit visualisation, spacecraft orientation, instrument timelines, as well as performance analysis of mission segments. Technically, the re-use and integration of existing components will also be shown, both at the code level and at the visualisation level, so that the effort put into the development of new components can be reduced. Another important aspect is bridging the gap between operational data and the scientific exploitation of the payload data, for which a way forward will also be shown. A lesson learned from the implementation and use of a prototype is the synergy between the team who provisions the notebooks and the consumers, who share access to the same code base, if not resources; this often simplifies communication and deployment.

  10. Improving Charging-Breeding Simulations with Space-Charge Effects

    Science.gov (United States)

    Bilek, Ryan; Kwiatkowski, Ania; Steinbrügge, René

    2016-09-01

    Rare-isotope-beam facilities use highly charged ions (HCI) in heavy-ion accelerators and to improve the measurement precision and resolving power of certain experiments. An Electron Beam Ion Trap (EBIT) is able to create HCI through successive electron impact, charge breeding trapped ions into higher charge states. CBSIM was created to calculate successive charge breeding with an EBIT. It was augmented by transferring it into an object-oriented programming language, including additional elements, improving ion-ion collision factors, and exploring the overlap of the electron beam with the ions. The calculation is enhanced with the effects of residual background gas by computing the space charge due to charge breeding. The program assimilates background species, ionizes and charge breeds them alongside the element being studied, and allows them to interact with the desired species through charge exchange, giving a fairer overview of realistic charge breeding. Calculations of charge breeding will be shown for realistic experimental conditions. We reexamined the implementation of ionization energies, cross sections, and ion-ion interactions in charge breeding.
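    In its simplest form, the successive electron-impact breeding that codes like CBSIM model reduces to a chain of rate equations, dN_q/dt = R_{q-1}N_{q-1} − R_qN_q with R_q = (j/e)σ_q. A forward-Euler sketch with made-up cross sections and electron flux:

```python
import numpy as np

# Forward-Euler sketch of successive electron-impact charge breeding,
# dN_q/dt = R_{q-1} N_{q-1} - R_q N_q with R_q = (j/e) * sigma_q.
# Cross sections and electron flux are illustrative placeholders,
# not physical values for any particular ion species.
n_states = 6                                   # charge states 0 .. 5
sigma = 1e-16 / np.arange(1, n_states) ** 2    # assumed cross sections, cm^2
j_over_e = 1e21                                # assumed electron flux, cm^-2 s^-1

N = np.zeros(n_states)
N[0] = 1.0                                     # start fully neutral (normalized)
dt, steps = 1e-7, 2000                         # 0.2 ms of breeding
for _ in range(steps):
    flux = j_over_e * sigma * N[:-1] * dt      # ionization q -> q+1 per step
    N[:-1] -= flux
    N[1:] += flux
# population has shifted out of the neutral state into higher charge states
```

    The full code additionally tracks recombination, charge exchange with background species, and the space-charge overlap effects described in the abstract.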

  11. Recent advances in numerical simulation of space-plasma-physics problems

    Science.gov (United States)

    Birmingham, T. J.

    1983-01-01

    Computer simulations have become an increasingly popular, important and insightful tool for studying space plasmas. This review describes MHD and particle simulations, both of which treat the plasma and the electromagnetic field in which it moves in a self consistent fashion but on drastically different spatial and temporal scales. The complementary roles of simulation, observations and theory are stressed. Several examples of simulations being carried out in the area of magnetospheric plasma physics are described to illustrate the power, potential and limitations of the approach.

  12. DC link current simulation of voltage source inverter with random space vector pulse width modulation

    Directory of Open Access Journals (Sweden)

    Chen Guoqiang

    2016-01-01

    Full Text Available Given the complexity of direct analysis, a simulation model is built and presented to analyze and demonstrate the characteristics of the direct current (DC) link current of a three-phase two-level inverter under a random space vector pulse width modulation (SVPWM) strategy. The development procedure and key subsystems of the simulation model are described in detail, and several experiments are performed with it. The results verify the efficiency and convenience of the simulation model and show that random SVPWM schemes, especially the random switching frequency scheme, can efficiently suppress the harmonic peaks of the DC link current.
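    The peak-suppression effect can be illustrated without any inverter model: randomizing the period of a square-wave carrier spreads its spectral energy, lowering the tallest harmonic. The duty cycle, period range and sample count in this numpy sketch are arbitrary illustrative choices.

```python
import numpy as np

# Why a random switching frequency flattens harmonic peaks: compare the
# spectrum of a fixed-period square wave (a crude stand-in for a PWM
# carrier) with one whose period is randomized cycle by cycle.
rng = np.random.default_rng(1)
n = 20_000                                  # samples to analyze

def square_wave(periods):
    """Concatenate 50%-duty square cycles with the given sample periods."""
    out = []
    for p in periods:
        out.extend([1.0] * (p // 2) + [-1.0] * (p - p // 2))
    return np.array(out[:n])

fixed = square_wave([20] * (n // 20 + 1))                  # constant period
random_ = square_wave(list(rng.integers(14, 27, size=n)))  # random periods

def peak(x):
    """Tallest harmonic magnitude, DC bin excluded."""
    return np.abs(np.fft.rfft(x))[1:].max()
# peak(random_) is far below peak(fixed): same energy, spread spectrum
```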

  13. Dispersion analysis and linear error analysis capabilities of the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    Previous error analyses conducted by the Guidance and Dynamics Branch of NASA have used the Guidance Analysis Program (GAP) as the trajectory simulation tool. Plans are made to conduct all future error analyses using the Space Vehicle Dynamics Simulation (SVDS) program. A study was conducted to compare the inertial measurement unit (IMU) error simulations of the two programs. Results of the GAP/SVDS comparison are presented and problem areas encountered while attempting to simulate IMU errors, vehicle performance uncertainties and environmental uncertainties using SVDS are defined. An evaluation of the SVDS linear error analysis capability is also included.

  14. Non-linear mechanical simulations in space structures engineering; Uchu kozobutsu kogaku ni okeru hisenkei kikai rikigaku simulation

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, K. [Kinki Univ Osaka (Japan)

    1997-08-05

    Space structure engineering lies at the intersection of several engineering fields, and numerical analysis and simulation play a large role in it. Treating space structure engineering as an application of non-linear mechanical simulation, this paper explains the constraints of simulation technology and gives analysis examples. The following important points are confirmed by simulating the motion of a membrane under an external force acting on the inner and outer surfaces at a particular point of a square satellite model which rotates with a membrane under tension. When the external force is removed, each material point and the membrane centre move with vibration; however, the centre of gravity of the satellite moves very rapidly along the direction normal to the membrane centre. Furthermore, the external disturbance (which develops at material points) propagates to all material points and the membrane centre as a twisted wave. The maximum relative displacement between the inner and outer surfaces decreases as the initial angular velocity increases, but not at points on the upper membrane. The fluctuation in the position and motion of the satellite resulting from the external disturbance becomes small as the initial rotation angular velocity increases. 10 refs., 6 figs., 2 tabs.

  15. Numerical simulations of aerodynamic contribution of flows about a space-plane-type configuration

    Science.gov (United States)

    Matsushima, Kisa; Takanashi, Susume; Fujii, Kozo; Obayashi, Shigeru

    1987-01-01

    The slightly supersonic viscous flow about the space plane under development at the National Aerospace Laboratory (NAL) in Japan was simulated numerically using the LU-ADI algorithm. Wind-tunnel testing of the same plane was conducted in parallel with the computations. The main purpose of the simulation is to capture phenomena which greatly influence the aerodynamic force and efficiency but are difficult to capture by experiment, including a more accurate representation of vortical flows at high angles of attack. The space-plane geometry simulated is a simplified model of the real space plane: a combination of a flat, slender body and a double-delta wing. The comparison between experimental and numerical results will be done in the near future. The numerical results can be said to show qualitatively reliable phenomena.

  16. CERN Proton Synchrotron booster space charge simulations with a realistic model for alignment and field errors*

    Science.gov (United States)

    Forte, V.; Benedetto, E.; McAteer, M.

    2016-12-01

    The CERN Proton Synchrotron booster (PSB) is one of the machines of the LHC injector chain which will be upgraded within the LHC Injectors Upgrade (LIU) project. The injection energy of the PSB will be increased to 160 MeV in order to mitigate direct space charge effects, considered to be the main performance limitation, aiming to double the brightness for the LHC beams. In order to better predict the gain to be expected, space charge simulations are being carried out. As a first step, benchmarking between simulations and measurements is needed. Efforts to establish a realistic modeling of field and alignment errors aim at extending the basic model of the machine toward a more realistic one. Simulations of beam dynamics with strong space charge and realistic errors are presented and analyzed in this paper.
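    The motivation for the 160 MeV injection energy follows from the 1/(βγ²) scaling of the incoherent direct space-charge tune shift: raising the kinetic energy from the previous 50 MeV to 160 MeV roughly halves the tune shift at fixed intensity, which is consistent with the goal of doubling the brightness. The short calculation below uses only this standard scaling, with no PSB-specific beam parameters.

```python
import math

# 1/(beta*gamma^2) scaling of the incoherent direct space-charge tune
# shift: compare old (50 MeV) and new (160 MeV) PSB injection energies.
# Standard scaling only; no machine-specific parameters assumed.
M_P = 938.272                     # proton rest energy, MeV

def beta_gamma2(kinetic_mev):
    """Relativistic beta * gamma^2 for a proton of given kinetic energy."""
    gamma = 1.0 + kinetic_mev / M_P
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    return beta * gamma ** 2

gain = beta_gamma2(160.0) / beta_gamma2(50.0)   # ~2: tune shift roughly halves
```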

  17. Feasibility Analysis on Simulation of PLCS Malfunction Event using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ung Soo; Lee, Cheol Shin; Sohn, Jong Joo [KEPCO-E and C, Daejeon (Korea, Republic of)

    2011-10-15

    A computer code named the Safety and Performance Analysis Code (SPACE) is being developed to replace several existing computer codes used in designing nuclear power plants (NPPs) in Korea. The SPACE code is a system code and should be able to simulate the various plant events needed for the safety analysis of pressurized water reactors (PWRs), such as loss-of-coolant accidents (LOCA), steam line breaks (SLB), feedwater line breaks (FLB), steam generator tube ruptures (SGTR), and several anticipated operational occurrences (AOOs). Therefore, the respective simulations of the above events with the SPACE code should be verified and validated before the code is utilized in safety analysis. In this work, a feasibility analysis is performed for the simulation of a pressurizer level control system (PLCS) malfunction event for Shin-Kori units 3 and 4 (SKN 3 and 4)

  18. Benchmark of Space Charge Simulations and Comparison with Experimental Results for High Intensity, Low Energy Accelerators

    CERN Document Server

    Cousineau, Sarah M

    2005-01-01

    Space charge effects are a major contributor to beam halo and emittance growth leading to beam loss in high intensity, low energy accelerators. As future accelerators strive towards unprecedented levels of beam intensity and beam loss control, a more comprehensive understanding of space charge effects is required. A wealth of simulation tools have been developed for modeling beams in linacs and rings, and with the growing availability of high-speed computing systems, computationally expensive problems that were inconceivable a decade ago are now being handled with relative ease. This has opened the field for realistic simulations of space charge effects, including detailed benchmarks with experimental data. A great deal of effort is being focused in this direction, and several recent benchmark studies have produced remarkably successful results. This paper reviews the achievements in space charge benchmarking in the last few years, and discusses the challenges that remain.

  19. Development of a simulation environment to test space missions COTS technologies

    Science.gov (United States)

    Saraf, S.; Knoll, A.; Melanson, P.; Tafazoli, M.

    2002-07-01

    The Canadian Space Agency's (CSA) Software and Ground Segment Section (SGS) has the mandate to develop innovative emerging software and on-board satellite and ground segment computer technologies. To that end, there is an ongoing development of a simulation environment to test COTS (Commercial-Off-The-Shelf) technologies. There are severe cost constraints in all aspects of many space missions due to the limited return on investment and scarce commercialization opportunities that come with many science missions. There is an opportunity to explore the innovative implementation of COTS technologies to reduce the mission cost and maximize the performance available from COTS components. However, using COTS technologies in the space environment has its constraints, and therefore designing a spacecraft mission has to involve some new techniques that allow implementation of these components and minimize the risk of failure. The goal of our project is to develop a simulation environment, itself using COTS components, and then to allow the seamless integration of various components to test spacecraft mission concepts. For example, one of the aspects of using COTS processors in space is to protect them from the radiation environment. The current state of the simulation tests an innovative software EDAC (Error Detection and Correction) package and a redundant processor configuration to investigate protection against the effects of radiation and other failures on a generic mission. It also includes the capability to test formation-flying concepts that have the potential to revolutionize cost reduction efforts for space missions and to enable new space applications. This paper describes the simulation environment in detail and illustrates some of the technologies being tested for possible future space missions. The paper concludes with a look at the future development of the simulation environment and possible benefits of its use, as well as the lessons learned to date.
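    As a hedged illustration of what a software EDAC scheme does (the CSA package itself is not described in the abstract), the textbook Hamming(7,4) code below corrects any single flipped bit per word, the classic radiation-induced single-event upset in COTS memory:

```python
# Textbook Hamming(7,4) single-error-correcting code, a stand-in for the
# kind of software EDAC package mentioned above (the actual CSA
# implementation is not described here).

def hamming74_encode(d):
    """Encode 4 data bits [d0, d1, d2, d3] into a 7-bit codeword."""
    d0, d1, d2, d3 = d
    p1 = d0 ^ d1 ^ d3            # parity over codeword positions 1,3,5,7
    p2 = d0 ^ d2 ^ d3            # parity over positions 2,3,6,7
    p3 = d1 ^ d2 ^ d3            # parity over positions 4,5,6,7
    return [p1, p2, d0, p3, d1, d2, d3]

def hamming74_correct(c):
    """Recompute parities; the syndrome is the 1-based index of the bad bit."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3        # 0 means the word is clean
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1               # flip the corrupted bit back
    return c

word = hamming74_encode([1, 0, 1, 1])
upset = word.copy()
upset[4] ^= 1                              # simulate a single-event upset
assert hamming74_correct(upset) == word    # the flip is repaired
```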

  20. Simulated Partners and Collaborative Exercise (SPACE) to boost motivation for astronauts: study protocol

    OpenAIRE

    Feltz, Deborah L.; Ploutz-Snyder, Lori; Winn, Brian; Kerr, Norbert L.; Pivarnik, James M; Ede, Alison; Hill, Christopher; Samendinger, Stephen; Jeffery, William

    2016-01-01

    Background Astronauts may have difficulty adhering to exercise regimens at vigorous intensity levels during long space missions. Vigorous exercise is important for aerobic and musculoskeletal health during space missions and afterwards. A key impediment to maintaining vigorous exercise is motivation. Finding ways to motivate astronauts to exercise at the levels necessary to mitigate reductions in musculoskeletal health and aerobic capacity has not been explored. The focus of Simulated Partners a...

  1. Space Charge Simulations in the Fermilab Recycler for PIP-II

    Energy Technology Data Exchange (ETDEWEB)

    Ainsworth, Robert [Fermilab; Adamson, Philip [Fermilab; Kourbanis, Ioanis [Fermilab; Stern, Eric [Fermilab

    2016-06-01

    Proton Improvement Plan-II (PIP-II) is Fermilab's plan for providing powerful, high-intensity proton beams to the laboratory's experiments. Upgrades are foreseen for the Recycler, which will have to cope with bunches containing fifty percent more beam. Of particular concern are the large space-charge tune shifts caused by the intensity increase. Simulations performed using Synergia are detailed, focusing on the space-charge footprint.

  2. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    Science.gov (United States)

    Fletcher, C. D.

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes.

  3. 26th Space Simulation Conference Proceedings. Environmental Testing: The Path Forward

    Science.gov (United States)

    Packard, Edward A.

    2010-01-01

    Topics covered include: A Multifunctional Space Environment Simulation Facility for Accelerated Spacecraft Materials Testing; Exposure of Spacecraft Surface Coatings in a Simulated GEO Radiation Environment; Gravity-Offloading System for Large-Displacement Ground Testing of Spacecraft Mechanisms; Microscopic Shutters Controlled by cRIO in Sounding Rocket; Application of a Physics-Based Stabilization Criterion to Flight System Thermal Testing; Upgrade of a Thermal Vacuum Chamber for 20 Kelvin Operations; A New Approach to Improve the Uniformity of Solar Simulator; A Perfect Space Simulation Storm; A Planetary Environmental Simulator/Test Facility; Collimation Mirror Segment Refurbishment inside ESA's Large Space; Space Simulation of the CBERS 3 and 4 Satellite Thermal Model in the New Brazilian 6x8m Thermal Vacuum Chamber; The Certification of Environmental Chambers for Testing Flight Hardware; Space Systems Environmental Test Facility Database (SSETFD), Website Development Status; Wallops Flight Facility: Current and Future Test Capabilities for Suborbital and Orbital Projects; Force Limited Vibration Testing of JWST NIRSpec Instrument Using Strain Gages; Investigation of Acoustic Field Uniformity in Direct Field Acoustic Testing; Recent Developments in Direct Field Acoustic Testing; Assembly, Integration and Test Centre in Malaysia: Integration between Building Construction Works and Equipment Installation; Complex Ground Support Equipment for Satellite Thermal Vacuum Test; Effect of Charging Electron Exposure on 1064nm Transmission through Bare Sapphire Optics and SiO2 over HfO2 AR-Coated Sapphire Optics; Environmental Testing Activities and Capabilities for Turkish Space Industry; Integrated Circuit Reliability Simulation in Space Environments; Micrometeoroid Impacts and Optical Scatter in Space Environment; Overcoming Unintended Consequences of Ambient Pressure Thermal Cycling Environmental Tests; Performance and Functionality Improvements to Next Generation

  4. STATE SPACE MODELING AND SIMULATION OF SENSORLESS PERMANENT MAGNET BLDC MOTOR

    Directory of Open Access Journals (Sweden)

    N. MURUGANANTHAM

    2010-10-01

    Full Text Available Brushless DC (BLDC) motor simulation can be implemented simply, with the required control scheme, using specialized Simulink built-in tools and blocksets such as the SimPowerSystems toolbox; however, this approach demands a powerful processor, large random-access memory, and long simulation times. To overcome these drawbacks, this paper presents state-space modeling, simulation, and control of a permanent-magnet brushless DC motor. By reading the instantaneous position of the rotor as an output, different variables of the motor can be controlled without the need for external sensors or position-detection techniques. Simulink, with the assistance of MATLAB, is used to provide a very flexible and reliable simulation. With the state-space model representation, motor performance can be analyzed for variations of the motor parameters.
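
The state-space approach the abstract describes can be sketched in a few lines. The matrices below are illustrative placeholders, not the paper's BLDC parameters:

```python
import numpy as np

# Minimal sketch of the state-space approach (illustrative matrices, not the
# paper's BLDC model): simulate x' = A x + B u, y = C x by forward Euler.
A = np.array([[-8.0, -4.0],
              [ 4.0, -1.0]])   # assumed stable system matrix
B = np.array([[2.0],
              [0.0]])
C = np.array([[0.0, 1.0]])     # observe the second state (e.g. speed)

def simulate(x0, u, dt=1e-3, steps=2000):
    """Forward-Euler integration of the state-space model."""
    x = np.array(x0, dtype=float).reshape(-1, 1)
    ys = []
    for _ in range(steps):
        x = x + dt * (A @ x + B * u)  # Euler step
        ys.append((C @ x).item())
    return ys

y = simulate([0.0, 0.0], u=1.0)
# For a constant input, the output settles at the DC gain -C A^-1 B * u.
```

Because parameters enter only through `A`, `B`, `C`, re-running the simulation for varied motor parameters (the paper's final point) amounts to rebuilding these matrices.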

  5. Modeling and Simulation of DC Power Electronics Systems Using Harmonic State Space (HSS) Method

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Bak, Claus Leth

    2015-01-01

    For efficiency and simplicity, dc-based power electronics systems are widely used in a variety of applications such as electric vehicles, ships, aircraft, and homes. In these systems, there can be a number of dynamic interactions between loads and other dc ... based on state-space averaging and generalized averaging, these also have limitations in reproducing the results of non-linear time-domain simulations. This paper presents a modeling and simulation method for a large dc power electronics system using Harmonic State Space (HSS) modeling. ... Through this method, the computation time and CPU memory required for large dc power electronics systems can be reduced. Moreover, the achieved results match those of the non-linear time-domain simulation, but with a faster simulation time, which is beneficial in a large network.

  6. European Space Agency's launcher multibody dynamics simulator used for system and subsystem level analyses

    Science.gov (United States)

    Baldesi, Gianluigi; Toso, Mario

    2012-06-01

    Virtual simulation is currently a key activity in the specification, design, verification and operations of space systems. System modelling and simulation in fact support a number of use cases across the spacecraft development life cycle, including activities such as system design validation, software verification and validation, spacecraft unit and sub-system test activities, etc. As the reliance on virtual modelling, simulation and justification has grown substantially in recent years, a more coordinated and consistent approach to the development of such simulation tools across project phases can bring substantial benefit in reducing the overall space programme schedule, risk and cost. By capitalizing on the strong dynamics (multibody software) expertise of the ESA (European Space Agency) Structures and Mechanisms division, a generic multibody flight simulator has been used since 2001 to simulate a wide variety of launch vehicle dynamics and control problems at system level. The backbone of the multibody dynamics simulator is DCAP (Dynamic and Control Analysis Package), a multibody software package developed by ESA together with industry, with more than 30 years of heritage in space applications. This software is a suite of fast, effective computer programs that provides the user with capabilities to model, simulate and analyze the dynamics and control performance of coupled rigid and flexible structural systems subjected to possibly time-varying structural characteristics and space environmental loads. The simulator's formulation of the dynamics of multi-rigid/flexible-body systems is based on an Order(n) algorithm. This avoids the explicit computation of a global mass matrix and its inversion, so the computational burden increases only linearly with the number n of the system's degrees of freedom. A dedicated symbolic manipulation pre-processor is then used for coding optimization.
With the implementation of dedicated interfaces to other specialised

  7. Concept verification of three dimensional free motion simulator for space robot

    Science.gov (United States)

    Okamoto, Osamu; Nakaya, Teruomi; Pokines, Brett

    1994-01-01

    In the development of automatic assembly technologies for space structures, it is indispensable to investigate and simulate the movements of robot satellites during mission operations. On the ground, such investigation and simulation can be effectively realized with a free motion simulator. Various types of ground systems for simulating free motion have been proposed and utilized, including neutral buoyancy systems, air or magnetic suspension systems, passive suspension balance systems, and free-flying aircraft or drop tower systems. In addition, systems can be simulated by computers using an analytical model. Each free motion simulation method has limitations and well-known problems: disturbance by water viscosity, a limited number of degrees of freedom, complex dynamics induced by the attachment of the simulation system, short experiment times, and the lack of high-speed supercomputer simulation systems, respectively. The basic idea presented here is to realize 3-dimensional free motion by combining a spherical air bearing, a cylindrical air bearing, and a flat air bearing. A conventional air bearing system has difficulty realizing free vertical motion suspension; here, a cylindrical air bearing and a counterbalance weight realize vertical free motion. This paper presents the design concept, configuration, and basic performance characteristics of an innovative free motion simulator. A prototype simulator verifies the feasibility of 3-dimensional free motion simulation.

  8. OPSMODEL, an on-orbit operations simulation modeling tool for Space Station

    Science.gov (United States)

    Davis, William T.; Wright, Robert L.

    1988-01-01

    The 'OPSMODEL' operations-analysis and planning tool simulates on-orbit crew operations for the NASA Space Station, furnishing a quantitative measure of the effectiveness of crew activities in various alternative Station configurations while supporting engineering and cost analyses. OPSMODEL is entirely data-driven; the top-down modeling structure of the software allows the user to control both the content and the complexity level of model definition during data base population. Illustrative simulation samples are given.

  9. Apu/hydraulic/actuator Subsystem Computer Simulation. Space Shuttle Engineering and Operation Support, Engineering Systems Analysis. [for the space shuttle

    Science.gov (United States)

    1975-01-01

    Major developments are examined which have taken place to date in the analysis of the power and energy demands on the APU/Hydraulic/Actuator Subsystem for the space shuttle during the entry-to-touchdown (not including rollout) flight regime. These developments are given in the form of two subroutines which were written for use with the Space Shuttle Functional Simulator. The first subroutine calculates the power and energy demand on each of the three hydraulic systems due to control surface (inboard/outboard elevons, rudder, speedbrake, and body flap) activity. The second subroutine incorporates the R. I. priority rate limiting logic, which limits control surface deflection rates as a function of the number of failed hydraulic systems. Typical results of this analysis are included, and listings of the subroutines are presented in appendices.
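
The first subroutine's bookkeeping can be illustrated with a toy calculation. All numbers below are assumptions for illustration, not Shuttle values:

```python
import math

# Toy version of hydraulic power/energy demand (all numbers are illustrative
# assumptions, not Shuttle values): actuator flow Q = piston_area * velocity,
# power P = supply_pressure * Q, and energy is the time integral of P.
SUPPLY_PRESSURE = 20.7e6  # Pa (~3000 psi, a typical aerospace system)
PISTON_AREA = 1.5e-3      # m^2, assumed actuator piston area
MOMENT_ARM = 0.12         # m, assumed horn arm (piston travel per radian)

def power_demand(surface_rate_deg_s):
    """Hydraulic power (W) drawn at a given surface deflection rate (deg/s)."""
    rate = math.radians(surface_rate_deg_s)      # rad/s
    piston_vel = MOMENT_ARM * rate               # m/s
    flow = PISTON_AREA * abs(piston_vel)         # m^3/s
    return SUPPLY_PRESSURE * flow                # W

# Energy over a simple deflection-rate profile, trapezoidal integration:
rates = [0.0, 5.0, 10.0, 10.0, 5.0, 0.0]  # deg/s, sampled every second
dt = 1.0
powers = [power_demand(r) for r in rates]
energy = sum((powers[i] + powers[i + 1]) * dt / 2.0
             for i in range(len(powers) - 1))
```

A per-surface sum of such terms, one for each of the three hydraulic systems, is the shape of the demand calculation the abstract describes.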

  10. Characterization of a lower-body exoskeleton for simulation of space-suited locomotion

    Science.gov (United States)

    Carr, Christopher E.; Newman, Dava J.

    2008-02-01

    In a previous analysis of suited and unsuited locomotion energetics, we found evidence that space suits act as springs during running. Video images from the lunar surface suggest that knee torques create, in large part, this spring effect. We hypothesized that a properly constructed lower-body exoskeleton could be used to simulate the knee torques of a range of space suits. Here we report the characterization of a lower-body exoskeleton. The equivalent spring stiffness of each exoskeleton leg varies as a function of exoskeleton knee angle and load, and the exoskeleton joint-torque relationship closely matches the knee torques of the current NASA space suit, the Extravehicular Mobility Unit, in form and magnitude. We have built an exoskeleton with two physical non-linear springs that achieve space-suit-like joint torques. Space-suit legs therefore act as springs, with this effect most pronounced when locomotion requires large changes in knee flexion, such as during running.
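
The angle-dependent stiffness idea can be sketched with a simple non-linear spring law. The coefficients below are hypothetical, not the measured EMU torque data:

```python
# Sketch of the "knee as a non-linear spring" idea with hypothetical
# coefficients (not the measured EMU torques): torque grows faster than
# linearly with flexion, so the equivalent stiffness depends on knee angle.
K1 = 0.6     # N*m/deg, assumed linear coefficient
K3 = 2.0e-4  # N*m/deg^3, assumed cubic coefficient

def knee_torque(flexion_deg):
    """Restoring torque (N*m) at a given knee flexion angle (deg)."""
    return K1 * flexion_deg + K3 * flexion_deg ** 3

def equivalent_stiffness(flexion_deg, d=1e-3):
    """Local stiffness d(torque)/d(angle) by central difference (N*m/deg)."""
    return (knee_torque(flexion_deg + d) - knee_torque(flexion_deg - d)) / (2 * d)

# Deeper flexion -> stiffer effective spring, matching the abstract's claim
# that the equivalent stiffness varies with knee angle.
```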

  11. Research on the Gender Roles in "Twenty-first Century University English" (《21世纪大学英语》)

    Institute of Scientific and Technical Information of China (English)

    王莉

    2013-01-01

    The gender images in a university English textbook can subtly influence the gender consciousness of college students, who are at a transitional stage of life. This paper uses content analysis to study the gender roles in "Twenty-first Century University English" and finds that gender differences and gender stereotypes exist in the textbook.

  12. Documentation of GEMASS entry to touchdown simulation. [space shuttle orbiter capability

    Science.gov (United States)

    Waibel, R. H.

    1977-01-01

    The entry-to-touchdown space shuttle orbiter simulation capability incorporated into GEMASS subprogram 33 (3-DOF) is documented. A digital autopilot provides the interface between GEMASS and the guidance. Vehicle attitude is determined by using the ability of GEMASS to integrate differential equations in addition to the equations of motion. Vehicle aerodynamic characteristics are obtained from an aerodynamic data tape, and the control surface deflections required to trim the vehicle and the trimmed aerodynamic coefficients are determined internally. Several indicators that allow evaluation of subsystem performance are included, as is the ability of the user to activate any of several error dispersion sources. The performance of the simulation compares well with more sophisticated simulations.

  13. Simulations of minor mergers - II. The phase-space structure of thick discs

    NARCIS (Netherlands)

    Villalobos, Alvaro; Helmi, Amina

    2009-01-01

    We analyse the phase-space structure of simulated thick discs that are the result of a 5:1 mass-ratio merger between a disc galaxy and a satellite. Our main goal is to establish what would be the imprints of a merger origin for the Galactic thick disc. We find that the spatial distribution predicted

  14. Simulations of minor mergers. II. The phase-space structure of thick discs

    NARCIS (Netherlands)

    Villalobos, Álvaro; Helmi, Amina

    2009-01-01

    We analyse the phase-space structure of simulated thick discs that are the result of a significant merger between a disc galaxy and a satellite. Our main goal is to establish what would be the characteristic imprints of a merger origin for the Galactic thick disc. We find that the spatial distributi

  15. Preparation, control, and use of standard operating procedures in a space simulation laboratory

    Science.gov (United States)

    Parish, R. P., Jr.

    1975-01-01

    The degree of success in the operation of a space simulation laboratory is a direct function of the role of its standard operating procedures. Their proper use in a thermal vacuum test makes for a well-run test program. Preparation and procedure control are discussed.

  16. Being an "Agent Provocateur": Utilising Online Spaces for Teacher Professional Development in Virtual Simulation Games

    Science.gov (United States)

    deNoyelles, Aimee; Raider-Roth, Miriam

    2016-01-01

    This article details the results of an action research study which investigated how teachers used online learning community spaces to develop and support their teaching and learning of the Jewish Court of All Time (JCAT), a web-mediated, character-playing, simulation game that engages participants with social, historical and cultural curricula.…

  17. Instrumentation for Ground-Based Testing in Simulated Space and Planetary Conditions

    Science.gov (United States)

    Kleiman, Jacob; Horodetsky, Sergey; Issoupov, Vitali

    This paper is an overview of instrumentation developed and created by ITL Inc. for simulated testing and performance evaluation of spacecraft materials, structures, mechanisms, assemblies and components in different space and planetary environments. The LEO Space Environment Simulator allows simulation of the synergistic effect of ultra-high vacuum conditions, 5 eV neutral atomic oxygen beams, Vacuum-Ultraviolet (VUV) and Near-Ultraviolet (NUV) radiation, and temperature conditions. The simulated space environmental conditions can be controlled in situ using a quadrupole mass spectrometer, a Time-of-Flight technique, and Quartz Crystal Microbalance sensors. The new NUV System is capable of delivering an NUV power intensity of up to 10 Equivalent Suns; its design uses a horizontal orientation of the 5 kW mercury lamp, with the NUV radiation focused by a parabolic reflector. To address the Lunar/Martian surface environments, the Planetary Environmental Simulator/Test Facility has been developed and built to allow physical evaluation of the effects of the Lunar/Martian dust environments in conjunction with other factors (ultra-high vacuum or planetary atmospheric conditions, VUV/NUV radiation, thermal cycling, and darkness). The ASTM E 595/ASTM E 1559 Outgassing Test Facility provides the means for outgassing tests of materials, with the objective of selecting materials with low outgassing properties for spacecraft use, and allows the following outgassing parameters to be determined: Total Mass Loss, Collected Volatile Condensable Materials, and Water Vapor Regained.

  18. The PLATO Simulator: Modelling of High-Precision High-Cadence Space-Based Imaging

    CERN Document Server

    Marcos-Arenal, P; De Ridder, J; Aerts, C; Huygen, R; Samadi, R; Green, J; Piotto, G; Salmon, S; Catala, C; Rauer, H

    2014-01-01

    Many aspects of the design trade-off of a space-based instrument and its performance can best be tackled through simulations of the expected observations. The complex interplay of various noise sources in the course of the observations makes such simulations an indispensable part of the assessment and design study of any space-based mission. We present a formalism to model and simulate photometric time series of CCD images by including models of the CCD and its electronics, the telescope optics, the stellar field, the jitter movements of the spacecraft, and all important natural noise sources. This formalism has been implemented in a versatile end-to-end simulation software tool, called the PLATO Simulator, specifically designed for the PLATO space mission to be operated from L2, but easily adaptable to similar types of missions. We provide a detailed description of several noise sources and discuss their properties, in connection with the optical design, the allowable level of jitter, the quantum efficiency of th...
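
The interplay of noise sources can be illustrated with a drastically simplified photometric time series. All numbers are assumed for illustration; the real simulator models the CCD, optics and stellar field in far more detail:

```python
import numpy as np

# Toy end-to-end photometric time series (assumed numbers, far simpler than
# the PLATO Simulator): a constant stellar flux modulated by pointing jitter,
# with photon (Poisson) shot noise on top.
rng = np.random.default_rng(1)

FLUX = 50_000.0     # e-/exposure, assumed mean stellar signal
JITTER_RMS = 0.002  # assumed fractional flux change from spacecraft jitter
N = 1000            # number of exposures

jitter = 1.0 + JITTER_RMS * rng.standard_normal(N)  # multiplicative jitter
counts = rng.poisson(FLUX * jitter)                 # shot noise on top

rel_scatter = counts.std() / counts.mean()
# Independent noise sources add in quadrature:
expected = np.hypot(1.0 / np.sqrt(FLUX), JITTER_RMS)
```

The quadrature sum is the basic reason such simulations matter for design trade-offs: whichever term dominates (shot noise vs. jitter) sets where engineering effort pays off.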

  19. SPH Simulation of Acoustic Waves: Effects of Frequency, Sound Pressure, and Particle Spacing

    Directory of Open Access Journals (Sweden)

    Y. O. Zhang

    2015-01-01

    Full Text Available Acoustic problems consisting of multiphase systems or with deformable boundaries are difficult to describe using mesh-based methods, while the meshfree, Lagrangian smoothed particle hydrodynamics (SPH) method can handle such complicated problems. In this paper, after solving the linearized acoustic equations with standard SPH theory, the feasibility of the SPH method for simulating sound propagation in the time domain is validated. The effects of sound frequency, maximum sound pressure amplitude, and particle spacing on numerical error and time cost are then discussed based on the sound propagation simulation. The discussion, based on a limited range of frequencies and sound pressures, demonstrates that raising the sound frequency increases the simulation error nonlinearly, whereas raising the sound pressure has limited effect on the error. In addition, decreasing the particle spacing reduces the numerical error while increasing the CPU time; both trends are close to linear on a logarithmic scale.
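
The particle-spacing effect has a simple static analogue that does not require the paper's full acoustic solver: the SPH summation interpolant of a smooth field becomes more accurate as the spacing shrinks. A minimal 1-D sketch, with the smoothing length tied to the spacing as h = 2*dx (an assumed ratio):

```python
import math

# Minimal 1-D SPH sketch of the particle-spacing effect: the SPH summation
# interpolant of a smooth field (here sin) gets more accurate as the particle
# spacing dx, and with it h = 2*dx, decreases.
def w_cubic(r, h):
    """Standard 1-D cubic-spline kernel, normalised to integrate to 1."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_error(dx, f=math.sin, xmin=0.0, xmax=math.pi):
    """Max SPH interpolation error of f at interior particles, h = 2*dx."""
    h = 2.0 * dx
    xs = [xmin + i * dx for i in range(int((xmax - xmin) / dx) + 1)]
    err = 0.0
    for xi in xs:
        if xi < xmin + 2.0 * h or xi > xmax - 2.0 * h:
            continue  # skip kernel-deficient boundary particles
        approx = sum(f(xj) * w_cubic(xi - xj, h) * dx for xj in xs)
        err = max(err, abs(approx - f(xi)))
    return err

# O(h^2) smoothing error: halving the spacing roughly quarters the error,
# while the cost per step grows with the particle count.
coarse, fine = sph_error(0.05), sph_error(0.025)
```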

  20. LABORATORY TESTING TO SIMULATE VAPOR SPACE CORROSION IN RADIOACTIVE WASTE STORAGE TANKS

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, B.; Garcia-Diaz, B.; Gray, J.

    2013-08-30

    Radioactive liquid waste has been stored in underground carbon steel tanks for nearly 70 years at the Hanford nuclear facility. Vapor space corrosion of the tank walls has emerged as an ongoing challenge to maintaining the structural integrity of these tanks. The interaction between corrosive and inhibitor species in condensates/supernates on the tank wall above the liquid level, and their interaction with vapor phase constituents as the liquid evaporates from the tank wall, influences the formation of corrosion products and the corrosion of the carbon steel. An effort is underway to gain an understanding of the mechanism of vapor space corrosion. Localized corrosion, in the form of pitting, is of particular interest in the vapor space. Cyclic potentiodynamic polarization (CPP) testing was utilized to determine the susceptibility of the steel in a simulated vapor space environment. The tests also investigated the impact of ammonia gas in the vapor space on the corrosion of the steel. Vapor space coupon tests were also performed to investigate the evolution of the corrosion products during longer term exposures. These tests were conducted at vapor space ammonia levels of 50 and 550 ppm NH₃ (0.005 and 0.055 vol.%) in air. Ammonia was shown to mitigate vapor space corrosion.

  1. Simulations of the MATROSHKA experiment at the international space station using PHITS.

    Science.gov (United States)

    Sihver, L; Sato, T; Puchalska, M; Reitz, G

    2010-08-01

    Concerns about the biological effects of space radiation are increasing rapidly due to the perspective of long-duration manned missions, both in relation to the International Space Station (ISS) and to future manned interplanetary missions to the Moon and Mars. As a preparation for these long-duration space missions, it is important to ensure an excellent capability to evaluate the impact of space radiation on human health, in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radiation load on the personnel both inside and outside the space vehicles and to certify that organ- and tissue-equivalent doses can be simulated as accurately as possible. In this paper, simulations are presented using the three-dimensional Monte Carlo Particle and Heavy-Ion Transport code System (PHITS) (Iwase et al. in J Nucl Sci Tech 39(11):1142-1151, 2002) of long-term dose measurements performed with the European Space Agency-supported MATROSHKA (MTR) experiment (Reitz and Berger in Radiat Prot Dosim 120:442-445, 2006). MATROSHKA is an anthropomorphic phantom containing over 6,000 radiation detectors, mimicking a human head and torso. The MTR experiment, led by the German Aerospace Center (DLR), was launched in January 2004 and has measured the absorbed doses from space radiation both inside and outside the ISS. Comparisons of simulations with measurements outside the ISS are presented. The results indicate that PHITS is a suitable tool for estimation of doses received from cosmic radiation and for study of the shielding of spacecraft against cosmic radiation.

  2. Energy content of stormtime ring current from phase space mapping simulations

    Science.gov (United States)

    Chen, Margaret W.; Schulz, Michael; Lyons, Larry R.

    1993-01-01

    We perform a phase space mapping study to estimate the enhancement in energy content that results from stormtime particle transport in the equatorial magnetosphere. Our pre-storm phase space distribution is based on a steady-state transport model. Using results from guiding-center simulations of ion transport during model storms having main phases of 3 hr, 6 hr, and 12 hr, we map phase space distributions of ring current protons from the pre-storm distribution in accordance with Liouville's theorem. We find that transport can account for the entire ten to twenty-fold increase in magnetospheric particle energy content typical of a major storm if a realistic stormtime enhancement of the phase space density f is imposed at the nightside tail plasma sheet (represented by an enhancement of f at the neutral line in our model).
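
The Liouville-theorem mapping at the heart of this study can be illustrated with a toy dynamical system. Here a 1-D phase-space rotation (a harmonic oscillator in scaled coordinates) stands in for the guiding-center transport model, and the initial distribution is an assumed Gaussian:

```python
import math

# Toy illustration of Liouville mapping: the density at a phase-space point
# equals the initial density at that point's trajectory traced back to t = 0.
# A phase-space rotation stands in for the guiding-center transport model.
OMEGA = 2.0  # assumed rotation rate in phase space

def f_initial(x, p):
    """Pre-storm phase-space density (assumed anisotropic Gaussian)."""
    return math.exp(-(x * x / 2.0 + p * p / 8.0))

def trace_back(x, p, t):
    """Trajectory through (x, p) at time t, traced back to t = 0
    (inverse of the forward rotation x0 -> (c*x0 + s*p0, -s*x0 + c*p0))."""
    c, s = math.cos(OMEGA * t), math.sin(OMEGA * t)
    return c * x - s * p, s * x + c * p

def f_at(x, p, t):
    """Liouville's theorem: f is constant along trajectories."""
    x0, p0 = trace_back(x, p, t)
    return f_initial(x0, p0)
```

The study's mapping works the same way, with the guiding-center simulations supplying the back-traced trajectories and the steady-state transport model supplying `f_initial`.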

  3. OPR1000 Control Rod Drop Accident Simulation using the SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chang Keun; Ha, Sang Jun; Moon, Chan Kook [Korea Hydro and Nuclear Power, Daejeon (Korea, Republic of)

    2012-05-15

    The Korean nuclear industry has developed a best-estimate two-phase, three-field thermal-hydraulic analysis code, SPACE (Safety and Performance Analysis Code for Nuclear Power Plants), for the safety analysis and design of PWRs (Pressurized Water Reactors). As the first phase, the demo version of the SPACE code was released in March 2010. As the second phase of development, the code has been verified and improved according to the Validation and Verification (V and V) matrix prepared for the SPACE code. In this study, a control rod drop accident has been simulated using the SPACE code as one aspect of the V and V work. The results from this test were compared with results from the RETRAN and CESEC codes.

  4. Standard Lunar Regolith Simulants for Space Resource Utilization Technologies Development: Effects of Materials Choices

    Science.gov (United States)

    Sibille, Laurent; Carpenter, Paul K.

    2006-01-01

    As NASA turns its exploration ambitions towards the Moon once again, the research and development of new technologies for lunar operations face the challenge of meeting the milestones of a fast-paced schedule, reminiscent of the 1960's Apollo program. While the lunar samples returned by the Apollo and Luna missions have revealed much about the Moon, these priceless materials exist in quantities too scarce to be used for technology development and testing. The need for mineral materials chosen to simulate the characteristics of lunar regoliths is a pressing issue that is being addressed today through the collaboration of scientists, engineers and NASA program managers. The issue of reproducing the properties of lunar regolith for research and technology development purposes was addressed by the 2005 Workshop on Lunar Regolith Simulant Materials, recently held at Marshall Space Flight Center. The workshop's recommendation to establish standard simulant materials for use in lunar technology development and testing is discussed here with an emphasis on space resource utilization. The variety of techniques and the complexity of functional interfaces make these simulant choices critical in space resource utilization.

  5. Monte Carlo simulations for the space radiation superconducting shield project (SR2S)

    Science.gov (United States)

    Vuolo, M.; Giraudo, M.; Musenich, R.; Calvelli, V.; Ambroglini, F.; Burger, W. J.; Battiston, R.

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members, and well-defined countermeasures do not yet exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, where only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.

  6. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    Science.gov (United States)

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members, and well-defined countermeasures do not yet exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, where only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.

  7. Experimental Trapped-ion Quantum Simulation of the Kibble-Zurek dynamics in momentum space

    Science.gov (United States)

    Cui, Jin-Ming; Huang, Yun-Feng; Wang, Zhao; Cao, Dong-Yang; Wang, Jian; Lv, Wei-Min; Luo, Le; del Campo, Adolfo; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can

    2016-01-01

    The Kibble-Zurek mechanism is the paradigm for accounting for the nonadiabatic dynamics of a system across a continuous phase transition. Its study in the quantum regime is hindered by the requirement of ground-state cooling. We report the experimental quantum simulation of critical dynamics in the transverse-field Ising model by a set of Landau-Zener crossings in pseudo-momentum space, which can be probed with high accuracy using a single trapped ion. We test the Kibble-Zurek mechanism in the quantum regime in momentum space and find that the measured scaling of excitations is in accordance with the theoretical prediction. PMID:27633087

  8. Experimental Trapped-ion Quantum Simulation of the Kibble-Zurek dynamics in momentum space

    Science.gov (United States)

    Cui, Jin-Ming; Huang, Yun-Feng; Wang, Zhao; Cao, Dong-Yang; Wang, Jian; Lv, Wei-Min; Luo, Le; Del Campo, Adolfo; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can

    2016-09-01

    The Kibble-Zurek mechanism is the paradigm for accounting for the nonadiabatic dynamics of a system across a continuous phase transition. Its study in the quantum regime is hindered by the requirement of ground-state cooling. We report the experimental quantum simulation of critical dynamics in the transverse-field Ising model by a set of Landau-Zener crossings in pseudo-momentum space, which can be probed with high accuracy using a single trapped ion. We test the Kibble-Zurek mechanism in the quantum regime in momentum space and find that the measured scaling of excitations is in accordance with the theoretical prediction.

  9. Simulation of Cascaded Longitudinal-Space-Charge Amplifier at the Fermilab Accelerator Science & Technology (FAST) Facility

    Energy Technology Data Exchange (ETDEWEB)

    Halavanau, A. [Northern Illinois U.; Piot, P. [Northern Illinois U.

    2015-12-01

    Cascaded Longitudinal Space Charge Amplifiers (LSCA) have been proposed as a mechanism to generate density modulation over a broad spectral range. The scheme has recently been demonstrated in the optical regime and has confirmed the production of broadband optical radiation. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at the Fermilab Accelerator Science & Technology (FAST) facility to produce broadband ultraviolet radiation. Our studies are carried out using elegant with its tree-based, gridless space-charge algorithm.

  10. Reinforcement Learning in Large State Spaces Simulated Robotic Soccer as a Testbed

    OpenAIRE

    Tuyls, Karl; Maes, Sam; Manderick, Bernard

    2003-01-01

    Large state spaces and incomplete information are two problems that stand out in learning in multi-agent systems. In this paper we tackle them both by using a combination of decision trees and Bayesian networks (BNs) to model the environment and the Q-function. Simulated robotic soccer is used as a testbed, since there agents are faced with both large state spaces and incomplete information. The long-term goal of this research is to define generic techniques that allow agents to learn in larg...
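
For contrast with the function-approximation approach the abstract motivates, the tabular baseline it improves on fits in a few lines. The chain world below is an invented toy problem; its Q-table is feasible only because the state space is tiny, which is exactly what breaks down in robotic soccer:

```python
import random

# Tabular Q-learning on a tiny chain world (invented toy problem): the table
# below is feasible only because the state space is tiny; decision trees and
# Bayesian networks replace it when the state space explodes.
random.seed(0)
N_STATES = 5
ACTIONS = (-1, 1)                   # step left / right
ALPHA, GAMMA, EPS = 0.2, 0.9, 0.5   # high exploration for this toy problem
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    s2 = min(max(s + a, 0), N_STATES - 1)
    done = (s2 == N_STATES - 1)     # goal: rightmost state
    return s2, (1.0 if done else 0.0), done

for _ in range(500):                # episodes
    s = 0
    for _ in range(60):             # step budget per episode
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        best_next = 0.0 if done else max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])  # Q-update
        s = s2
        if done:
            break

# The greedy policy should now prefer "right" in every non-goal state.
```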

  11. Preparing for N(f) = 2 simulations at small lattice spacings

    CERN Document Server

    Della Morte, M.; Leder, B.; Takeda, S.; Witzel, O.; Wolff, U.; Meyer, H.; Simma, H.; Sommer, R.

    2007-01-01

    We discuss some large effects of dynamical fermions. One is a cutoff effect; others concern the contribution of multi-pion states to correlation functions and are expected to survive the continuum limit. We then turn to the preparation for simulations at small lattice spacings, which we are planning down to around a = 0.04 fm in order to understand the size of O(a^2) effects of the standard O(a)-improved theory. The dependence of the lattice spacing on the bare coupling is determined through the Schrödinger functional renormalized coupling.

  12. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-level Rule-based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne; Uhrmacher, Adelinde

    2016-08-03

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking the Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and allows the spatial resolution of models to be adapted easily.
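
The stochastic simulation algorithm that ML-Space builds on can be sketched in its simplest, non-spatial form. The reaction and rate below are assumed for illustration:

```python
import random

# Minimal (non-spatial) Gillespie direct-method sketch of the stochastic
# simulation algorithm that ML-Space extends with space and hierarchy:
# a single first-order reaction A -> B with an assumed rate constant k.
random.seed(42)

def gillespie_decay(n_a=1000, k=0.5, t_end=20.0):
    t, n_b = 0.0, 0
    while n_a > 0:
        propensity = k * n_a                 # total reaction propensity
        t += random.expovariate(propensity)  # exponential waiting time
        if t > t_end:
            break
        n_a -= 1                             # fire A -> B
        n_b += 1
    return n_a, n_b

remaining, produced = gillespie_decay()
# Mean-field check: n_a(t_end) ~ 1000 * exp(-k * t_end) ~ 0.05, so nearly
# all A should be consumed by t_end.
```

Spatial variants subdivide the volume and add diffusion events to the propensity list; ML-Space further mixes in compartments and individually tracked particles.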

  13. Validated simulator for space debris removal with nets and other flexible tethers applications

    Science.gov (United States)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil

    2016-12-01

    In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows simulation of flexible threads or wires with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; owing to the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, flown on a parabolic flight, was a downscaled version of the Envisat capture process: the prepacked net was launched towards the satellite model, expanded, hit the model and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics to the respective simulations and thus to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough that it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind the validation. Results are presented and

  14. Simulations of beam emittance growth from the collective relaxation of space-charge nonuniformities

    Energy Technology Data Exchange (ETDEWEB)

    Lund, Steven M.; Grote, David P.; Davidson, Ronald C.

    2004-05-01

    Beams injected into a linear focusing channel typically have some degree of space-charge nonuniformity. For unbunched beams with high space-charge intensity propagating in linear focusing channels, Debye screening of self-field interactions tends to make the transverse density profile flat. An injected particle distribution with a large systematic charge nonuniformity will generally be far from an equilibrium of the focusing channel, and the initial condition will launch a broad spectrum of collective modes. These modes can phase-mix and experience nonlinear interactions which result in an effective relaxation to a more thermal-equilibrium-like distribution characterized by a uniform density profile. This relaxation transfers self-field energy from the initial space-charge nonuniformity to the local particle temperature, thereby increasing the beam phase space area (emittance growth). Here the authors employ two-dimensional electrostatic particle-in-cell (PIC) simulations to investigate the effects of initial transverse space-charge nonuniformities on the quality of beams with high space-charge intensity propagating in a continuous focusing channel. Results are compared to theoretical bounds on emittance growth developed in previous studies. Consistent with earlier theory, it is found that a high degree of initial distribution nonuniformity can be tolerated with only modest emittance growth and that beam control can be maintained. The simulations also provide information on the rate of relaxation and characteristic levels of fluctuations in the relaxed states. This research suggests that a surprising degree of initial space-charge nonuniformity can be tolerated in practical intense beam experiments.
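    The emittance-growth diagnostic discussed here is usually the statistical (rms) emittance, eps_rms = sqrt(&lt;x^2&gt;&lt;x'^2&gt; - &lt;x x'&gt;^2). A short sketch of that computation on a synthetic Gaussian bunch (not output of the PIC code):

```python
import numpy as np

def rms_emittance(x, xp):
    """Statistical emittance in the x-x' plane (centered second moments)."""
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1e-3, 100_000)    # positions [m], synthetic bunch
xp = rng.normal(0.0, 1e-4, 100_000)   # angles [rad], uncorrelated with x
print(rms_emittance(x, xp))           # close to 1e-3 * 1e-4 = 1e-7 m rad
```

In a PIC study, evaluating this quantity before and after the relaxation quantifies the emittance growth the abstract describes.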

  15. Development of the reentry flight dynamics simulator for evaluation of space shuttle orbiter entry systems

    Science.gov (United States)

    Rowell, L. F.; Powell, R. W.; Stone, H. W., Jr.

    1980-01-01

    A nonlinear, six-degree-of-freedom digital computer simulation was developed of a vehicle which has constant mass properties and whose attitude is controlled by both aerodynamic surfaces and reaction control system thrusters. A rotating, oblate Earth model was used to describe the gravitational forces which affect long-duration Earth entry trajectories. The program is executed in a non-real-time mode or connected to a simulation cockpit to conduct piloted and autopilot studies. The program includes the guidance and control software used by the space shuttle orbiter for its descent from approximately 121.9 km to touchdown on the runway.
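    An "oblate Earth" gravity model means adding at least the J2 zonal harmonic to the point-mass term. A sketch of such an acceleration model with standard constants (this is the textbook J2 formula, not the simulator's actual implementation):

```python
import numpy as np

MU = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
RE = 6378137.0        # equatorial radius [m]
J2 = 1.08263e-3       # second zonal harmonic (oblateness)

def gravity_j2(r):
    """Point-mass gravity plus the J2 oblateness term, r in an ECI frame [m]."""
    x, y, z = r
    rn = np.linalg.norm(r)
    k = 1.5 * J2 * MU * RE**2 / rn**5
    zr2 = 5.0 * z**2 / rn**2
    ax = -MU * x / rn**3 + k * x * (zr2 - 1.0)
    ay = -MU * y / rn**3 + k * y * (zr2 - 1.0)
    az = -MU * z / rn**3 + k * z * (zr2 - 3.0)
    return np.array([ax, ay, az])

# Acceleration at the quoted 121.9 km entry altitude, over the equator.
a = gravity_j2(np.array([RE + 121_900.0, 0.0, 0.0]))
print(a)  # radial, magnitude roughly 9.4 m/s^2 at this altitude
```

A full entry simulation integrates this acceleration together with aerodynamic forces in a rotating-Earth frame.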

  16. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    Science.gov (United States)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling the aircraft to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  17. Atmospheric effects on Quaternary polarization encoding for free space communication, laboratory simulation

    CERN Document Server

    Soorat, Ram

    2015-01-01

    We have simulated atmospheric effects such as fog and smoke in a laboratory environment in order to reproduce the depolarization they cause during free-space optical communication. This setup has been used to study noise in the two components of quaternary encoding for polarization shift keying. The individual components of the quaternary encoding, vertical and horizontal as well as 45° and 135°, were tested separately; the results indicate that the depolarization effects differ between the two situations. However, due to the differential method used to extract the information bits, the protocol shows extremely low bit error rates. The information obtained is useful for the deployment of a fully functional quaternary-encoded PolSK scheme in free space.

  18. Gyrokinetic simulations of fusion plasmas using a spectral velocity space representation

    CERN Document Server

    Parker, Joseph Thomas

    2016-01-01

    Magnetic confinement fusion reactors suffer severely from heat and particle losses through turbulent transport, which has inspired the construction of ever larger and more expensive reactors. Numerical simulations are vital to their design and operation, but particle collisions are too infrequent for fluid descriptions to be valid. Instead, strongly magnetised fusion plasmas are described by the gyrokinetic equations, a nonlinear integro-differential system for evolving the particle distribution functions in a five-dimensional position and velocity space, and the consequent electromagnetic field. Due to the high dimensionality, simulations of small reactor sections require hundreds of thousands of CPU hours on High Performance Computing platforms. We develop a Hankel-Hermite spectral representation for velocity space that exploits structural features of the gyrokinetic system. The representation exactly conserves discrete free energy in the absence of explicit dissipation, while our Hermite hypercollision ope...

  19. Resistance of Antarctic black fungi and cryptoendolithic communities to simulated space and Martian conditions.

    Science.gov (United States)

    Onofri, S; Barreca, D; Selbmann, L; Isola, D; Rabbow, E; Horneck, G; de Vera, J P P; Hatton, J; Zucconi, L

    2008-01-01

    Dried colonies of the Antarctic rock-inhabiting meristematic fungi Cryomyces antarcticus CCFEE 515, CCFEE 534 and C. minteri CCFEE 5187, as well as fragments of rocks colonized by the Antarctic cryptoendolithic community, were exposed to a set of ground-based experiment verification tests (EVTs) at the German Aerospace Center (DLR, Köln, Germany). These were carried out to test the tolerance of these organisms in view of their possible exposure to space conditions outside the International Space Station (ISS). Tests included single or combined simulated space and Martian conditions. Responses were analysed by both cultural and microscopic methods: colony formation capacities were measured, and cellular viability was assessed using the live/dead dyes FUN 1 and SYTOX Green. The results clearly suggest generally good resistance in all the samples investigated. C. minteri CCFEE 5187, C. antarcticus CCFEE 515 and colonized rocks were selected as suitable candidates to withstand space flight and long-term permanence in space on the ISS in the framework of the LIchens and Fungi Experiments (LIFE programme, European Space Agency).

  20. Portable Simulator for On-Board International Space Station Emergency Training

    Science.gov (United States)

    Bolt, Kathy; Root, Michael

    2014-01-01

    The crew on board the International Space Station (ISS) have to be prepared for any possible emergency. The emergencies of most concern are a fire, depressurization or a toxic atmosphere. The crew members train on the ground before launch but also need to practice their emergency response skills while they are on orbit for six months. On-Board Training (OBT) events for emergency response proficiency used to require the crew and ground teams to use paper "scripts" that showed the path through the emergency procedures. This was not very realistic, since the participants could read ahead and never deviated from the scripted path. The new OBT emergency simulator allows the crew to view dynamic information on an iPad only when it would become available during a real event. The simulator interface allows the crew member to indicate hatch closures, don and doff masks, read pressures, and sample smoke or atmosphere levels. As the crew executes their actions using the on-board simulator, the ground teams are able to monitor those actions via ground display data flowing through the ISS Ku-band communication system, which syncs the on-board simulator software with a ground simulator accessible in all the control centers. The OBT Working Group (OBT WG), led by the Chief Training Office (CTO) at Johnson Space Center, is a multilateral working group with partners in Russia, Japan, Germany and the U.S.A. The OBT WG worked together to create a simulator based on these principles: (a) create a dynamic simulation that gives real-time data feedback; (b) maintain a real-time interface between Mission Control Centers and crew during OBTs; (c) provide flexibility for decision making during drill execution; (d) materially reduce instructor and flight control team man-hour costs involved with developing, updating, and maintaining emergency OBT cases/scenarios; and (e) introduce an element of surprise to emergency scenarios so the team can't tell the outcome of the case by reading ahead in a

  1. On the simulation of tether-nets for space debris capture with Vortex Dynamics

    Science.gov (United States)

    Botta, Eleonora M.; Sharf, Inna; Misra, Arun K.; Teichmann, Marek

    2016-06-01

    Tether-nets are one of the more promising methods for the active removal of space debris. The aim of this paper is to study the dynamics of this type of system in space, which is still not well known and whose simulation has multiple outstanding issues. In particular, the focus is on the deployment and capture phases of a net-based active debris removal mission, and on the effect of including the bending stiffness of the net threads on the dynamical characteristics of the net and on the computational efficiency. Lumped-parameter modeling of the net in Vortex Dynamics, without bending stiffness representation, is introduced first and then validated against results obtained with an equivalent model in Matlab, using numerical simulations of the deployment phase. A model able to reproduce the bending stiffness of the net in Vortex Dynamics is then proposed, and the outcome of a net deployment simulation is compared to the results of the simulation without bending stiffness. A simulation of net-based capture of a derelict spacecraft is also analyzed to evaluate the effect of modeling the bending stiffness. From comparison of simulations with and without bending stiffness representation, it is found that bending stiffness has a significant influence on both the simulation results and the computation time. When bending stiffness is included, the net is more resistant to the changes in its shape caused both by the motion of the corner masses (during deployment) and by contact with the debris (during capture).
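    Lumped-parameter net modeling of the kind described represents each thread as point masses joined by spring-dampers, with bending stiffness omitted. A minimal sketch for a single clamped thread settling under gravity (parameters are illustrative, not the validated Vortex or Matlab settings):

```python
import numpy as np

# One thread as point masses + spring-dampers (stretching only, no bending).
N, L0, M = 10, 0.1, 0.01        # nodes, segment rest length [m], node mass [kg]
K, C, DRAG = 500.0, 0.05, 0.05  # spring stiffness, segment damping, air drag
G, DT, STEPS = 9.81, 1e-4, 20000

pos = np.zeros((N, 2)); pos[:, 0] = np.arange(N) * L0   # start horizontal
vel = np.zeros((N, 2))
for _ in range(STEPS):
    force = np.zeros((N, 2))
    force[:, 1] = -M * G                        # gravity on every node
    force -= DRAG * vel                         # simple drag so the thread settles
    seg = pos[1:] - pos[:-1]
    length = np.linalg.norm(seg, axis=1, keepdims=True)
    f = K * (length - L0) * seg / length + C * (vel[1:] - vel[:-1])
    force[:-1] += f                             # spring-damper acts on both ends
    force[1:] -= f
    vel += DT * force / M                       # semi-implicit Euler step
    vel[0] = 0.0                                # first node clamped
    pos += DT * vel
print(pos[-1])  # free end ends up hanging below the clamped node
```

A net is the same construction with segments joined in a 2D mesh topology; adding bending stiffness, the paper's subject, requires torque terms between adjacent segments.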

  2. NASA/IEEE MSST 2004 Twelfth NASA Goddard Conference on Mass Storage Systems and Technologies in cooperation with the Twenty-First IEEE Conference on Mass Storage Systems and Technologies

    Science.gov (United States)

    Kobler, Ben (Editor); Hariharan, P. C. (Editor)

    2004-01-01

    MSST2004, the Twelfth NASA Goddard / Twenty-first IEEE Conference on Mass Storage Systems and Technologies has as its focus long-term stewardship of globally-distributed storage. The increasing prevalence of e-anything brought about by widespread use of applications based, among others, on the World Wide Web, has contributed to rapid growth of online data holdings. A study released by the School of Information Management and Systems at the University of California, Berkeley, estimates that over 5 exabytes of data was created in 2002. Almost 99 percent of this information originally appeared on magnetic media. The theme for MSST2004 is therefore both timely and appropriate. There have been many discussions about rapid technological obsolescence, incompatible formats and inadequate attention to the permanent preservation of knowledge committed to digital storage. Tutorial sessions at MSST2004 detail some of these concerns, and steps being taken to alleviate them. Over 30 papers deal with topics as diverse as performance, file systems, and stewardship and preservation. A number of short papers, extemporaneous presentations, and works in progress will detail current and relevant research on the MSST2004 theme.

  3. Quantum simulations in phase-space: from quantum optics to ultra-cold physics

    Science.gov (United States)

    Drummond, Peter D.; Chaturvedi, Subhash

    2016-07-01

    As a contribution to the international year of light, we give a brief history of quantum optics in phase-space, with new directions including quantum simulations of multipartite Bell violations, opto-mechanics, ultra-cold atomic systems, matter-wave Bell violations, coherent transport and quantum fluctuations in the early Universe. We mostly focus on exact methods using the positive-P representation, and semiclassical truncated Wigner approximations.

  4. Surrogate models for identifying robust, high yield regions of parameter space for ICF implosion simulations

    Science.gov (United States)

    Humbird, Kelli; Peterson, J. Luc; Brandon, Scott; Field, John; Nora, Ryan; Spears, Brian

    2016-10-01

    Next-generation supercomputer architecture and in-transit data analysis have been used to create a large collection of 2-D ICF capsule implosion simulations. The database includes metrics for approximately 60,000 implosions, with x-ray images and detailed physics parameters available for over 20,000 simulations. To map and explore this large database, surrogate models for numerous quantities of interest are built using supervised machine learning algorithms. Response surfaces constructed using the predictive capabilities of the surrogates allow for continuous exploration of parameter space without requiring additional simulations. High performing regions of the input space are identified to guide the design of future experiments. In particular, a model for the yield built using a random forest regression algorithm has a cross-validation score of 94.3% and is consistently conservative for high yield predictions. The model is used to search for robust volumes of parameter space where high yields are expected, even given variations in other input parameters. Surrogates for additional quantities of interest relevant to ignition are used to further characterize the high yield regions. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, Lawrence Livermore National Security, LLC. LLNL-ABS-697277.
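    The surrogate workflow, fit a cheap model to a finite set of simulation outputs and then explore the fitted response surface continuously, can be sketched with a quadratic least-squares surface standing in for the paper's random-forest regressor; the "yield" function below is synthetic, not ICF physics:

```python
import numpy as np

rng = np.random.default_rng(1)

def fake_yield(x1, x2):
    # Hypothetical smooth response peaking at (0.3, -0.2); stands in for
    # an expensive implosion simulation.
    return np.exp(-((x1 - 0.3) ** 2 + (x2 + 0.2) ** 2))

X = rng.uniform(-1, 1, size=(200, 2))        # sampled input parameters
y = fake_yield(X[:, 0], X[:, 1])             # "simulation" outputs

def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Fit the quadratic response surface by least squares.
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Dense grid search on the surrogate = continuous exploration of the space
# with no further "simulations".
g = np.linspace(-1, 1, 101)
G = np.array(np.meshgrid(g, g)).reshape(2, -1).T
pred = features(G) @ coef
best = G[pred.argmax()]
print(best)  # near the true optimum (0.3, -0.2)
```

A random forest, as in the paper, replaces the polynomial fit when the response is non-smooth or high-dimensional; the search step is unchanged.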

  5. The performance of field scientists undertaking observations of early life fossils while in simulated space suit

    Science.gov (United States)

    Willson, D.; Rask, J. C.; George, S. C.; de Leon, P.; Bonaccorsi, R.; Blank, J.; Slocombe, J.; Silburn, K.; Steele, H.; Gargarno, M.; McKay, C. P.

    2014-01-01

    We conducted simulated Apollo Extravehicular Activity's (EVA) at the 3.45 Ga Australian 'Pilbara Dawn of life' (Western Australia) trail with field and non-field scientists using the University of North Dakota's NDX-1 pressurizable space suit to overview the effectiveness of scientist astronauts employing their field observation skills while looking for stromatolite fossil evidence. Off-world scientist astronauts will be faced with space suit limitations in vision, human sense perception, mobility, dexterity, the space suit fit, time limitations, and the psychological fear of death from accidents, causing physical fatigue reducing field science performance. Finding evidence of visible biosignatures for past life such as stromatolite fossils, on Mars, is a very significant discovery. Our preliminary overview trials showed that when in simulated EVAs, 25% stromatolite fossil evidence is missed with more incorrect identifications compared to ground truth surveys but providing quality characterization descriptions becomes less affected by simulated EVA limitations as the science importance of the features increases. Field scientists focused more on capturing high value characterization detail from the rock features whereas non-field scientists focused more on finding many features. We identified technologies and training to improve off-world field science performance. The data collected is also useful for NASA's "EVA performance and crew health" research program requirements but further work will be required to confirm the conclusions.

  6. Combining annual daylight simulation with photobiology data to assess the relative circadian efficacy of interior spaces

    Energy Technology Data Exchange (ETDEWEB)

    Pechacek, C.S.; Andersen, M. [Massachusetts Inst. of Technology, Cambridge, MA (United States). Dept. of Architecture, Building Technology; Lockley, S.W. [Harvard Medical School, Boston, MA (United States). Div. of Sleep Medicine, Brigham and Women' s Hospital

    2008-07-01

    This paper addressed the issue of hospital design and the role of daylight in patient health care. It presented a new approach for integrating empirical findings from photobiology into the performance assessment of a space, thus combining visual and health-related criteria. Previous studies have reported significant health care outcomes in daylit environments, although the mechanism and photoreceptor systems controlling these effects remain unknown. This study moved beyond earlier window-focused work to describe the characteristics of daylight that may promote human health, namely daylighting appropriate for the synchronization of circadian rhythms, and then made specific daylighting recommendations grounded in biological findings. In particular, the study investigated the use of daylight autonomy (DA) to simulate the probabilistic and temporal potential of daylight to meet human health needs. Results of photobiology research were used to define threshold values for lighting, which were then used as goals for simulations. These goals included the spectrum, intensity and timing of light at the human eye. The study investigated the variability of key architectural decisions in hospital room design to determine their influence on achieving the goals. The simulations showed how choices in building orientation, window size, user-window position and interior finishes affect the circadian efficacy of a space: design decisions can improve or degrade the health potential of the space considered. While the findings of this research are specific to hospitals, the results can be applied to other building types such as office buildings and residences. 33 refs., 7 figs.
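    Daylight autonomy, as used here, is the fraction of occupied hours in which daylight alone meets a target illuminance at a point. A sketch with synthetic hourly illuminances (a real assessment would take these from an annual daylight simulation, and the 300 lux threshold below is illustrative rather than a photobiology-derived value):

```python
import random

def daylight_autonomy(illuminance_lux, threshold_lux):
    """Fraction of occupied hours meeting the illuminance threshold."""
    hit = sum(1 for e in illuminance_lux if e >= threshold_lux)
    return hit / len(illuminance_lux)

rng = random.Random(7)
# One synthetic year of occupied hours (8 h/day * 365), lognormal-ish daylight.
hours = [rng.lognormvariate(5.5, 1.0) for _ in range(8 * 365)]
da = daylight_autonomy(hours, threshold_lux=300.0)
print(round(da, 3))  # fraction of occupied hours at or above 300 lux
```

The paper's contribution is choosing the threshold (spectrum, intensity, timing at the eye) from photobiology rather than from visual-task requirements.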

  7. Algorithm of Attitude Control and Its Simulation of Free-Flying Space Robot

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A reaction wheel or reaction thrusters are typically employed to keep the attitude of the base of a free-flying space robot fixed during attitude control. However, this method consumes a large amount of fuel, which shortens the on-orbit life span of the space robot, and it also vibrates the system and can make it unstable. A restricted minimum disturbance map (RMDM) based attitude control algorithm is presented to keep the attitude of the base fixed during the movement of the manipulator. In this method, base-attitude keeping is achieved by planning the motion trajectory of the manipulator's end-effector, without using a reaction wheel or reaction thrusters. In order to verify the feasibility and effectiveness of the attitude control algorithm presented in this paper, computer simulation experiments were performed, and the experimental results demonstrate that the algorithm is feasible.

  8. “Space, the Final Frontier”: How Good are Agent-Based Models at Simulating Individuals and Space in Cities?

    Directory of Open Access Journals (Sweden)

    Alison Heppenstall

    2016-01-01

    Cities are complex systems comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions of individuals and how, through such interactions, macro structures emerge in both the social and physical environment of cities. However, this paradigm has been hindered by limited computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography, to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.

  9. Time-Accurate Unsteady Pressure Loads Simulated for the Space Launch System at Wind Tunnel Conditions

    Science.gov (United States)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, William L.; Glass, Christopher E.; Streett, Craig L.; Schuster, David M.

    2015-01-01

    A transonic flow field about a Space Launch System (SLS) configuration was simulated with the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics (CFD) code at wind tunnel conditions. Unsteady, time-accurate computations were performed using second-order Delayed Detached Eddy Simulation (DDES) for up to 1.5 physical seconds. The surface pressure time history was collected at 619 locations, 169 of which matched locations on a 2.5 percent wind tunnel model that was tested in the 11 ft. x 11 ft. test section of the NASA Ames Research Center's Unitary Plan Wind Tunnel. Comparisons between computation and experiment showed that the peak surface pressure RMS level occurs behind the forward attach hardware, and good agreement in frequency and power was obtained in this region. Computational domain, grid resolution, and time step sensitivity studies were performed, including an investigation of pseudo-time sub-iteration convergence. Using these sensitivity studies and comparisons with experimental data, a set of best practices to date has been established for FUN3D simulations for SLS launch vehicle analysis. To the authors' knowledge, this is the first time DDES has been used in a systematic approach to establish the simulation time needed to analyze unsteady pressure loads on a space launch vehicle such as the NASA SLS.

  10. Large-Eddy Simulations of Magnetohydrodynamic Turbulence in Astrophysics and Space Physics

    CERN Document Server

    Miesch, Mark S; Brandenburg, Axel; Petrosyan, Arakel; Pouquet, Annick; Cambon, Claude; Jenko, Frank; Uzdensky, Dmitri; Stone, James; Tobias, Steve; Toomre, Juri; Velli, Marco

    2015-01-01

    We live in an age in which high-performance computing is transforming the way we do science. Previously intractable problems are now becoming accessible by means of increasingly realistic numerical simulations. One of the most enduring and most challenging of these problems is turbulence. Yet, despite these advances, the extreme parameter regimes encountered in astrophysics and space physics (as in atmospheric and oceanic physics) still preclude direct numerical simulation. Numerical models must take a Large Eddy Simulation (LES) approach, explicitly computing only a fraction of the active dynamical scales. The success of such an approach hinges on how well the model can represent the subgrid-scales (SGS) that are not explicitly resolved. In addition to the parameter regime, astrophysical and heliophysical applications must also face an equally daunting challenge: magnetism. The presence of magnetic fields in a turbulent, electrically conducting fluid flow can dramatically alter the coupling between large and...

  11. Agent-based simulation of pedestrian behaviour in closed spaces: a museum case study

    CERN Document Server

    Pluchino, Alessandro; Inturri, Giuseppe; Rapisarda, Andrea; Ignaccolo, Matteo

    2013-01-01

    In order to analyse the behaviour of pedestrians at a very fine scale, while they move along streets, in open spaces or inside buildings, simulation modelling becomes an essential tool. In these spatial environments, in the presence of unusual demand flows, simulation requires the ability to model the local dynamics of individual decision making and behaviour, which is strongly affected by the geometry, randomness, social preferences, and the local and collective behaviour of other individuals. The dynamics of people visiting and evacuating a museum offers an excellent case study along this line. In this paper we present an agent-based simulation of the Castello Ursino museum in Catania (Italy), evaluating its carrying capacity in terms of both the satisfaction of visitors under normal conditions and their safety under alarm conditions.
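    A toy version of such a pedestrian model conveys the mechanics: agents on a grid greedily step toward an exit, one agent per cell, so congestion emerges locally. The geometry and rules below are illustrative, not the Castello Ursino model:

```python
import random

W, H, EXIT = 20, 20, (0, 0)   # grid size and exit cell (illustrative layout)

def step(agents, rng):
    """One tick: every agent tries to move to a free cell closer to the exit."""
    occupied = set(agents)
    order = list(range(len(agents)))
    rng.shuffle(order)                          # random update order each tick
    for i in order:
        if agents[i] == EXIT:
            continue                            # already at the door
        x, y = agents[i]
        moves = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
        moves = [(a, b) for a, b in moves
                 if 0 <= a < W and 0 <= b < H and (a, b) not in occupied]
        if moves:                               # free neighbour nearest the exit
            nxt = min(moves, key=lambda p: p[0] + p[1])
            occupied.discard((x, y)); occupied.add(nxt)
            agents[i] = nxt
    return [a for a in agents if a != EXIT]     # whoever reached the door leaves

rng = random.Random(3)
agents = list({(rng.randrange(W), rng.randrange(H)) for _ in range(60)})
ticks = 0
while agents and ticks < 500:
    agents = step(agents, rng)
    ticks += 1
print(ticks, len(agents))  # ticks to evacuate; no agents remain
```

Evacuation time as a function of visitor count is exactly the kind of carrying-capacity curve such a model produces; real models add richer route choice and social behaviour.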

  12. Discrete Simulation of Flexible Plate Structure Using State-Space Formulation

    Institute of Scientific and Technical Information of China (English)

    S. Md. Salleh; M. O. Tokhi

    2008-01-01

    This paper presents the development of a dynamic simulation of a flexible plate structure with various boundary conditions. A flexible square plate is considered, and a finite-difference method is used to discretise the governing partial differential equation describing its dynamic behaviour. The model was validated against characteristic parameters of the plate and then formulated in a discrete state-space representation, to allow easy and fast simulation of the plate's dynamic behaviour under various boundary conditions. The resulting simulation algorithm was also validated, achieving accurate results that represent the first five modes of vibration of the plate. The algorithm is used in subsequent research as a platform for the development and verification of suitable control strategies for vibration suppression of flexible plate structures.
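    The discrete state-space form referred to, x[k+1] = A x[k] + B u[k] with output y[k] = C x[k], can be sketched for a single lightly damped vibration mode; the plate model stacks many such modes, and the numbers below are illustrative, not plate parameters:

```python
import numpy as np

# One lightly damped mode: 5 Hz natural frequency, 2% damping.
wn, zeta, dt = 2 * np.pi * 5.0, 0.02, 1e-3

# Continuous dynamics xdot = Ac x + Bc u, discretised by forward Euler
# (dt is small enough here for the Euler map to be stable).
Ac = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
Bc = np.array([[0.0], [1.0]])
A = np.eye(2) + dt * Ac
B = dt * Bc
C = np.array([[1.0, 0.0]])            # observe the deflection state

x = np.array([[1.0], [0.0]])          # initial deflection, zero velocity
y = []
for _ in range(2000):                 # 2 s of unforced response (u = 0)
    y.append((C @ x).item())
    x = A @ x
print(y[0], min(y))  # decaying oscillation starting from 1.0
```

Control strategies of the kind the paper develops operate directly on this (A, B, C) model, e.g. by feeding back an estimate of x.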

  13. Frequency Domain Modeling and Simulation of DC Power Electronic Systems Using Harmonic State Space Method

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Blaabjerg, Frede

    2016-01-01

    For efficiency and simplicity, dc power electronic systems are widely used in a variety of applications such as electric vehicles, ships, aircraft and also in homes. In these systems there can be a number of dynamic interactions and frequency couplings between networks, loads and other converters, so time-domain simulations are usually required to capture such complex system behavior. However, simulations in the time domain may increase the calculation time and the utilization of computer memory. Furthermore, frequency coupling driven by multiple converters with different switching frequencies, together with harmonics from ac-dc converters, makes harmonics and frequency coupling both a problem for the ac system and a challenge for the dc system. This paper presents a modeling and simulation method for a large dc power electronic system using Harmonic State Space (HSS) modeling.

  14. Numerical simulations of the electrodynamic interactions between the Tethered-Satellite-System and space plasma

    Science.gov (United States)

    Vashi, Bharat I.

    1992-01-01

    The first Tethered-Satellite-System (TSS-1), scheduled for flight in late 1992, is expected to provide relevant information related to the concept of generating an emf in a 20-km-long (or longer) conducting wire. This paper presents numerical simulations of the electrodynamic interactions between the TSS system and the space plasma, using 2D and 3D models of the system. The 2D code simulates the motion of a long cylinder past a plasma composed of electrons and H(+) ions; the system is solved by allowing the plasma to flow past the cylinder with an imposed magnetic field. The more complex 3D case is used to study the dynamics in greater detail. Results of the 2D simulation show that the interaction of a satellite with plasma flowing perpendicularly to the magnetic field results in an enhancement of the current collection.

  15. Using parallel computing for the display and simulation of the space debris environment

    Science.gov (United States)

    Möckel, M.; Wiedemann, C.; Flegel, S.; Gelhaus, J.; Vörsmann, P.; Klinkrad, H.; Krag, H.

    2011-07-01

    Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. In this paper the visualization software
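    The data-parallel idea, propagating every debris object independently, can be sketched on a CPU by vectorising a deliberately simplified circular two-body update over the whole population with NumPy; a GPU/OpenCL port would map each object to one work item. This is not the tool's actual analytical propagator:

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

rng = np.random.default_rng(0)
n = 100_000                                    # synthetic debris population
radius = rng.uniform(6900.0, 7400.0, n)        # LEO orbit radii [km]
theta0 = rng.uniform(0.0, 2 * np.pi, n)        # initial in-plane phases [rad]

def propagate(radius, theta0, t):
    """Advance all objects at once on circular, planar two-body orbits."""
    omega = np.sqrt(MU / radius**3)            # circular orbital rate [rad/s]
    theta = theta0 + omega * t
    return np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

xy = propagate(radius, theta0, 3600.0)         # all positions after one hour
print(xy.shape)  # (100000, 2)
```

Because each object's update reads only its own elements, the same kernel parallelises trivially, which is exactly why orbit propagation suits GPUs so well.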

  16. Peculiar velocities in redshift space: formalism, N-body simulations and perturbation theory

    CERN Document Server

    Okumura, Teppei; Vlah, Zvonimir; Desjacques, Vincent

    2013-01-01

    Direct measurements of peculiar velocities of galaxies and clusters of galaxies can in principle provide explicit information on the three-dimensional mass distribution, but this information is modulated by the fact that the velocity field is sampled at galaxy positions, and is thus probing the galaxy momentum. We derive expressions for the cross power spectrum between the density and momentum fields and the auto spectrum of the momentum field in redshift space, by extending the distribution function method to these statistics. The resulting momentum cross and auto power spectra in redshift space are expressed as infinite sums over velocity moment correlators in real space, as is the case for the density power spectrum in redshift space. We compare the predictions of the velocity statistics to those measured from N-body simulations for both dark matter and halos. We find that in redshift space linear theory predictions for the density-momentum cross power spectrum as well as for the momentum auto spectrum fail to pred...

  17. RENEW v3.2 user's manual, maintenance estimation simulation for Space Station Freedom Program

    Science.gov (United States)

    Bream, Bruce L.

    1993-04-01

    RENEW is a maintenance event estimation simulation program developed in support of the Space Station Freedom Program (SSFP). This simulation uses reliability and maintainability (R&M) and logistics data to estimate both average and time dependent maintenance demands. The simulation uses Monte Carlo techniques to generate failure and repair times as a function of the R&M and logistics parameters. The estimates are generated for a single type of orbital replacement unit (ORU). The simulation has been in use by the SSFP Work Package 4 prime contractor, Rocketdyne, since January 1991. The RENEW simulation gives closer estimates of performance since it uses a time dependent approach and depicts more factors affecting ORU failure and repair than steady state average calculations. RENEW gives both average and time dependent demand values. Graphs of failures over the mission period and yearly failure occurrences are generated. The average demand rate for the ORU over the mission period is also calculated. While RENEW displays the results in graphs, the results are also available in a data file for further use by spreadsheets or other programs. The process of using RENEW starts with keyboard entry of the R&M and operational data. Once entered, the data may be saved in a data file for later retrieval. The parameters may be viewed and changed after entry using RENEW. The simulation program runs the number of Monte Carlo simulations requested by the operator. Plots and tables of the results can be viewed on the screen or sent to a printer. The results of the simulation are saved along with the input data. Help screens are provided with each menu and data entry screen.
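
    The Monte Carlo scheme the abstract describes can be illustrated with a toy model. This is a hedged sketch, not RENEW itself: the exponential time-to-failure draw, the fixed repair delay, and all parameter values are assumptions for illustration only.

    ```python
    import random

    def simulate_oru_demands(mtbf_hours, repair_hours, mission_hours,
                             n_runs=1000, seed=1):
        """Monte Carlo estimate of ORU maintenance demands: draw exponential
        failure times, add a fixed repair delay after each failure, and count
        failures over the mission period, averaged over many runs."""
        rng = random.Random(seed)
        totals = []
        for _ in range(n_runs):
            t, failures = 0.0, 0
            while True:
                t += rng.expovariate(1.0 / mtbf_hours)  # time to next failure
                if t > mission_hours:
                    break
                failures += 1
                t += repair_hours                        # ORU down for repair
            totals.append(failures)
        avg = sum(totals) / n_runs
        return avg, avg / mission_hours  # average failures and demand rate

    # e.g. 1-year MTBF, 72-hour repairs, 10-year mission
    avg_failures, demand_rate = simulate_oru_demands(
        mtbf_hours=8760.0, repair_hours=72.0, mission_hours=10 * 8760.0)
    ```

    Collecting the per-run failure counts, rather than only the mean, is what allows time dependent demand plots like the ones RENEW produces.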

  19. MODELLING AND SIMULATION OF TWO-LEVEL SPACE VECTOR PWM INVERTER USING PHOTOVOLTAIC CELLS AS DC SOURCE

    Directory of Open Access Journals (Sweden)

    Ayse KOCALMIS BILHAN

    2013-01-01

    A space vector PWM method for a two-level inverter is proposed in this paper. A two-level inverter using a space vector modulation strategy has been modeled and simulated with a passive R-L load. Photovoltaic cells are used as the DC source for the input of the two-level inverter. Simulation results are presented for various operating conditions to verify the system model. In this paper, the MATLAB/Simulink package has been used for modeling and simulation of the PV cells and the two-level space vector pulse width modulation (SVPWM) inverter.
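
    The dwell-time calculation at the core of two-level SVPWM can be sketched with the standard textbook formulas (the function name and parameters are illustrative; the paper's actual MATLAB/Simulink model is not reproduced here):

    ```python
    import math

    def svpwm_times(v_ref, theta, v_dc, t_s):
        """Dwell times for one switching period of a two-level SVPWM inverter.
        v_ref: reference vector magnitude [V], theta: its angle [rad],
        v_dc: DC-link voltage (here, the PV array output), t_s: period [s]."""
        sector = int(theta // (math.pi / 3)) % 6 + 1   # sectors 1..6
        alpha = theta % (math.pi / 3)                  # angle within sector
        m = math.sqrt(3) * v_ref / v_dc                # modulation index
        t1 = t_s * m * math.sin(math.pi / 3 - alpha)   # first adjacent vector
        t2 = t_s * m * math.sin(alpha)                 # second adjacent vector
        t0 = t_s - t1 - t2                             # zero vectors
        return sector, t1, t2, t0

    # reference vector at 30 deg, maximum linear modulation, 10 kHz switching
    sector, t1, t2, t0 = svpwm_times(v_ref=100.0 / math.sqrt(3),
                                     theta=math.pi / 6, v_dc=100.0, t_s=1e-4)
    ```

    At the maximum linear modulation index (v_ref = v_dc/√3) and θ = 30°, the two active vectors share the period equally and the zero-vector time vanishes, which marks the boundary of the linear modulation range.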

  20. Time Accurate Unsteady Pressure Loads Simulated for the Space Launch System at a Wind Tunnel Condition

    Science.gov (United States)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, Bil; Streett, Craig L.; Glass, Christopher E.; Schuster, David M.

    2015-01-01

    Using the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics code, an unsteady, time-accurate flow field about a Space Launch System configuration was simulated at a transonic wind tunnel condition (Mach = 0.9). Delayed detached eddy simulation combined with Reynolds Averaged Navier-Stokes and a Spalart-Allmaras turbulence model was employed for the simulation. A second-order accurate time evolution scheme was used to simulate the flow field, with a minimum of 0.2 seconds of simulated time to as much as 1.4 seconds. Data were collected at 480 pressure tap locations, 139 of which matched those on a 3% wind tunnel model tested in the Transonic Dynamics Tunnel (TDT) facility at NASA Langley Research Center. Comparisons between computation and experiment showed agreement within 5% in terms of location for peak RMS levels, and 20% for frequency and magnitude of power spectral densities. Grid resolution and time step sensitivity studies were performed to identify methods for improved accuracy comparisons to wind tunnel data. With limited computational resources, accurate trends for reduced vibratory loads on the vehicle were observed. Exploratory methods such as determining minimized computed errors based on CFL number and sub-iterations, as well as evaluating frequency content of the unsteady pressures and evaluation of oscillatory shock structures, were used in this study to enhance computational efficiency and solution accuracy. These techniques enabled development of a set of best practices for the evaluation of future flight vehicle designs in terms of vibratory loads.

  1. Simulation of turbulences and fog effects on the free space optical link inside of experimental box

    Science.gov (United States)

    Latal, Jan; Vitasek, Jan; Hajek, Lukas; Vanderka, Ales; Koudelka, Petr; Kepak, Stanislav; Vasinek, Vladimir

    2016-12-01

    This paper deals with the problems of Free Space Optical (FSO) links. The theoretical part describes the effects of the atmospheric transmission environment on FSO connections. The practical part is focused on the creation of an appropriate experimental workplace for the simulation of turbulence (mechanical and thermal), fog effects, and the subsequent measurement of these effects. Statistical analysis and the Optiwave simulation software are used to quantify the impact of these effects on the FSO system. Three optical light sources, operating at wavelengths of 632.8 nm, 850 nm, and 1550 nm respectively, were tested, and the influence of the simulated atmospheric effects on signal attenuation was observed. Within the Optiwave simulations, the attenuation at the given wavelengths was studied in the form of the degradation of the FSO link transmission parameters. For the real measurements it was also necessary to fabricate an experimental box; it was constructed in lengths of 2.5 and 5 meters and was used to simulate the atmospheric environment.
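
    The wavelength dependence of fog attenuation that motivates testing 632.8 nm, 850 nm and 1550 nm sources is often estimated from visibility with the empirical Kruse model. The sketch below uses that textbook model, not the paper's measured data, and ignores geometric and pointing losses:

    ```python
    import math

    def kruse_attenuation_db_per_km(visibility_km, wavelength_nm):
        """Specific fog attenuation from visibility (empirical Kruse model)."""
        if visibility_km > 50:
            q = 1.6
        elif visibility_km > 6:
            q = 1.3
        else:
            q = 0.585 * visibility_km ** (1.0 / 3.0)   # low-visibility exponent
        return (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-q)

    def received_power_dbm(p_tx_dbm, visibility_km, wavelength_nm, link_km):
        """Link budget with fog attenuation only (no geometric losses)."""
        alpha = kruse_attenuation_db_per_km(visibility_km, wavelength_nm)
        return p_tx_dbm - alpha * link_km

    # 1 km visibility fog: attenuation per km at the two IR test wavelengths
    a_850 = kruse_attenuation_db_per_km(1.0, 850.0)
    a_1550 = kruse_attenuation_db_per_km(1.0, 1550.0)
    ```

    Under this model longer wavelengths are attenuated less in a given fog, one reason 1550 nm is attractive for FSO links beyond its eye-safety advantage.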

  2. Precision simulation of ground-based lensing data using observations from space

    CERN Document Server

    Mandelbaum, Rachel; Leauthaud, Alexie; Massey, Richard J; Rhodes, Jason

    2011-01-01

    Current and upcoming wide-field, ground-based, broad-band imaging surveys promise to address a wide range of outstanding problems in galaxy formation and cosmology. Several such uses of ground-based data, especially weak gravitational lensing, require highly precise measurements of galaxy image statistics with careful correction for the effects of the point-spread function (PSF). In this paper, we introduce the SHERA (SHEar Reconvolution Analysis) software to simulate ground-based imaging data with realistic galaxy morphologies and observing conditions, starting from space-based data (from COSMOS, the Cosmological Evolution Survey) and accounting for the effects of the space-based PSF. This code simulates ground-based data, optionally with a weak lensing shear applied, in a model-independent way using a general Fourier space formalism. The utility of this pipeline is that it allows for a precise, realistic assessment of systematic errors due to the method of data processing, for example in extracting weak len...

  3. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable to both inform research intended to mitigate potential risk as well as guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  4. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  5. Simulation of the space debris environment in LEO using a simplified approach

    Science.gov (United States)

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico

    2017-01-01

    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach, so in different scenarios different objects are fragmented, each run yielding a different realization of the space debris environment. The results of the single Monte-Carlo runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, in general the average of all MC runs together with its standard deviation is used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS) a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources. Natural decay and post mission disposal measures are the only sink mechanisms. This method reduces the computational costs tremendously. In order to achieve this benefit a few simplifications have been applied. The model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are also divided into diameter bins. As an extension to a previously presented model, the eccentricity has additionally been taken into account with 67 eccentricity bins. While the set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper, parameters have been derived so that the model is able to reflect the results of the numerical MC...
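
    The source-sink idea can be sketched for a single altitude shell as a pair of Euler-integrated rate equations. All coefficients below are illustrative placeholders, not the calibrated IRAS parameters, and the real model tracks many shells and diameter/eccentricity bins rather than two scalars:

    ```python
    def propagate_environment(n_intact, n_frag, years, dt=1.0,
                              launches=80.0, decay_intact=0.02, decay_frag=0.05,
                              collision_coeff=1e-9, frags_per_collision=100.0):
        """Euler integration of a toy source-sink model for one altitude shell:
        yearly launches add intact objects, collisions convert intact bodies
        into fragments, and atmospheric decay removes both populations."""
        for _ in range(int(years / dt)):
            collisions = collision_coeff * n_intact * (n_intact + n_frag)
            dn_intact = launches - decay_intact * n_intact - collisions
            dn_frag = frags_per_collision * collisions - decay_frag * n_frag
            n_intact += dt * dn_intact
            n_frag += dt * dn_frag
        return n_intact, n_frag

    # 50-year projection from an assumed initial population
    n_intact, n_frag = propagate_environment(3000.0, 10000.0, years=50.0)
    ```

    Because the state is a handful of bin populations instead of thousands of propagated orbits, one evaluation of such a model costs a negligible fraction of a full MC run, which is the computational benefit the abstract describes.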

  6. Graphical programming and the use of simulation for space-based manipulators

    Science.gov (United States)

    Mcgrath, Debra S.; Reynolds, James C.

    1989-01-01

    Robotic manipulators are difficult to program even without the special requirements of a zero-gravity environment. While attention should be paid to investigating the usefulness of industrial application programming methods to space manipulators, new methods with potential application to both environments need to be invented. These methods should allow various levels of autonomy and human-in-the-loop interaction and simple, rapid switching among them. For all methods simulation must be integrated to provide reliability and safety. Graphical programming of manipulators is a candidate for an effective robot programming method despite current limitations in input devices and displays. A research project in task-level robot programming has built an innovative interface to a state-of-the-art commercial simulation and robot programming platform. The prototype demonstrates simple augmented methods for graphical programming and simulation which may be of particular interest to those concerned with Space Station applications; its development has also raised important issues for the development of more sophisticated robot programming tools. Both aspects of the project are discussed.

  7. [Mathematical simulation support to the dosimetric monitoring on the Russian segment of the International Space Station].

    Science.gov (United States)

    Mitrikas, V G

    2014-01-01

    To ensure the radiation safety of cosmonauts, it is necessary not only to predict, but also to reconstruct, absorbed dose dynamics with the knowledge of how long cosmonauts stay in specific space vehicle compartments with different shielding properties and lacking equipment for dosimetric monitoring. In this situation, calculation is the only way to make a correct estimate of the radiation exposure of the cosmonaut's organism as a whole (tissue-averaged dose) and of separate systems and organs. The paper addresses the issues of mathematical simulation of the radiation environment measured by standard dosimetric instruments in the Russian segment of the International Space Station (ISS RS). Results of comparing the simulation and experimental data for the complement of dosimeters, including the ionization chamber-based radiometer R-16, DB8 dosimeters composed of semiconductor detectors, and Pille dosimeters composed of thermoluminescent detectors, show that the current methods of simulation in support of ISS RS radiation monitoring provide a sufficiently good agreement between the calculated and experimental data.

  8. Sensor Simulator Supporting the Pilot Data Centres for the Space Situational Awareness (SSA) Preparatory Programme

    Science.gov (United States)

    Sanchez-Ortiz, Noelia; Dominguez-Gonzalez, Raul; Guijarro-Lopez, Nuria; Parrilla-Eudrino, Esther; Rivera-Campos, Angela; Marina Perez, Eva; Pina Caballero, Fernando; Navarro, Vicente; Wright, Norrie

    2013-08-01

    This paper focuses on SSA's Sensor Simulator (SSIM), and how it is defined to support the testing and evaluation of Sensor Planning System and Data Processing Chain prior to the deployment of real sensors, in the frame of SSA programme. The Sensor Simulator for the Pilot Data Centres reproduces physical models for all system elements involved in the data generation process: observation constraints and strategies (tracking and survey), debris orbit propagation, Near Earth Objects (NEO) orbit propagation, generation of radar, ground based optical and space based optical measurements. A review of the capabilities, main models and associated algorithms is presented in this paper. Examples of the use of SSIM for the simulation of observations of both Space Surveillance and Tracking (SST) and NEO objects are provided, highlighting the differences between these two operational cases. SSIM is designed and implemented to make use of the ESA SIMULUS infrastructure and it will be deployed on top of the Common SSA Integration Framework. A brief description of the architecture of the system is provided.

  9. James Webb Space Telescope Optical Simulation Testbed II. Design of a Three-Lens Anastigmat Telescope Simulator

    CERN Document Server

    Choquet, Élodie; N'Diaye, Mamadou; Perrin, Marshall D; Soummer, Rémi

    2014-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop experiment designed to reproduce the main aspects of wavefront sensing and control (WFSC) for JWST. To replicate the key optical physics of JWST's three-mirror anastigmat (TMA) design at optical wavelengths we have developed a three-lens anastigmat optical system. This design uses custom lenses (plano-convex, plano-concave, and bi-convex) with fourth-order aspheric terms on powered surfaces to deliver the equivalent image quality and sampling of JWST NIRCam at the WFSC wavelength (633~nm, versus JWST's 2.12~micron). For active control, in addition to the segmented primary mirror simulator, JOST reproduces the secondary mirror alignment modes with five degrees of freedom. We present the testbed requirements and its optical and optomechanical design. We study the linearity of the main aberration modes (focus, astigmatism, coma) both as a function of field point and level of misalignments of the secondary mirror. We find that t...

  10. Revisiting Numerical Errors in Direct and Large Eddy Simulations of Turbulence: Physical and Spectral Spaces Analysis

    Science.gov (United States)

    Fedioun, Ivan; Lardjane, Nicolas; Gökalp, Iskender

    2001-12-01

    Some recent studies on the effects of truncation and aliasing errors on the large eddy simulation (LES) of turbulent flows via the concept of modified wave number are revisited. It is shown that all the results obtained for nonlinear partial differential equations projected and advanced in time in spectral space are not straightforwardly applicable to physical space calculations due to the nonequivalence by Fourier transform of spectral aliasing errors and numerical errors on a set of grid points in physical space. The consequences of spectral static aliasing errors on a set of grid points are analyzed in one dimension of space for quadratic products and their derivatives. The dynamical process that results through time stepping is illustrated on the Burgers equation. A method based on midpoint interpolation is proposed to remove in physical space the static grid point errors involved in divergence forms. It is compared to the sharp filtering technique on finer grids suggested by previous authors. Global performances resulting from combination of static aliasing errors and truncation errors are then discussed for all classical forms of the convective terms in Navier-Stokes equations. Some analytical results previously obtained on the relative magnitude of subgrid scale terms and numerical errors are confirmed with 3D realistic random fields. The physical space dynamical behavior and the stability of typical associations of numerical schemes and forms of nonlinear terms are finally evaluated on the LES of self-decaying homogeneous isotropic turbulence. It is shown that the convective form (if conservative properties are not strictly required) associated with highly resolving compact finite difference schemes provides the best compromise, which is nearly equivalent to dealiased pseudo-spectral calculations.
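
    The static aliasing of a quadratic product on a grid can be demonstrated in one dimension: the product sin(5x)·sin(4x) contains only wavenumbers 1 and 9, and on a 12-point grid (Nyquist wavenumber 6) the k = 9 mode folds back onto k = 3. This toy demonstration is not the paper's analysis, only the mechanism it studies:

    ```python
    import numpy as np

    # Quadratic products generate wavenumbers beyond the grid's Nyquist limit;
    # on the grid those modes fold back (alias) onto resolved wavenumbers.
    N = 12
    x = 2.0 * np.pi * np.arange(N) / N
    w = np.sin(5 * x) * np.sin(4 * x)   # exact content: 0.5*cos(x) - 0.5*cos(9x)
    c = np.fft.rfft(w) / N              # resolved bins k = 0..6 only
    # k = 9 exceeds Nyquist (6) and aliases onto k = 12 - 9 = 3, so half of
    # the product's spectral content appears at a spurious resolved mode
    ```

    Dealiasing strategies such as evaluating the product on a finer grid (the sharp-filtering approach mentioned above) remove exactly this folded-back contribution before it can feed the dynamics through time stepping.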

  11. Page mode reading with simulated scotomas: a modest effect of interline spacing on reading speed.

    Science.gov (United States)

    Bernard, Jean-Baptiste; Scherlen, Anne-Catherine; Castet, Eric

    2007-12-01

    Crowding is thought to be one potent limiting factor of reading in peripheral vision. While several studies investigated how crowding between horizontally adjacent letters or words can influence eccentric reading, little attention has been paid to the influence of vertically adjacent lines of text. The goal of this study was to examine the dependence of page mode reading performance (speed and accuracy) on interline spacing. A gaze-contingent visual display was used to simulate a visual central scotoma while normally sighted observers read meaningful French sentences following MNREAD principles. The sensitivity of this new material to low-level factors was confirmed by showing strong effects of perceptual learning, print size and scotoma size on reading performance. In contrast, reading speed was only slightly modulated by interline spacing even for the largest range tested: a 26% gain for a 178% increase in spacing. This modest effect sharply contrasts with the dramatic influence of vertical word spacing found in a recent RSVP study. This discrepancy suggests either that vertical crowding is minimized when reading meaningful sentences, or that the interaction between crowding and other factors such as attention and/or visuo-motor control is dependent on the paradigm used to assess reading speed (page vs. RSVP mode).

  12. Simulation of Space Charge Dynamic in Polyethylene Under DC Continuous Electrical Stress

    Science.gov (United States)

    Boukhari, Hamed; Rogti, Fatiha

    2016-10-01

    The space charge dynamic plays a very important role in the aging and breakdown of polymeric insulation materials under high voltage. This is due to the intensification of the local electric field and the attendant chemical-mechanical effects in the vicinity of the trapped charge. In this paper, we have investigated the space charge dynamic in low-density polyethylene under high direct-current voltage, evaluated under experimental conditions. The evaluation is based on a simulation using a bipolar charge transport model consisting of charge injection, transport, trapping, detrapping, and recombination phenomena. The theoretical formulation of the physical problem is based on the Poisson, continuity, and transport equations. Numerical results provide temporal and local distributions of the electric field, the space charge density for the different kinds of charges (net charge density, mobile and trapped electron density, mobile hole density), conduction and displacement current densities, and the external current. The results show the appearance of a negative packet-like space charge occupying a large part of the bulk under a dc electric field of 100 kV/mm; the induced distortion of the electric field is largest near the anode, about 39% higher than the initially applied electric field.
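
    The coupling between the net charge density and the field enters through Poisson's (Gauss's) equation. A minimal 1D sketch, assuming a planar sample and using the applied voltage as the boundary condition (signs and conventions simplified; this is not the paper's full bipolar transport model):

    ```python
    import numpy as np

    EPS0 = 8.854e-12   # vacuum permittivity [F/m]
    EPS_R = 2.3        # typical relative permittivity of LDPE (assumed)

    def field_from_charge(rho, thickness, v_applied):
        """Electric field profile in a planar dielectric from a net space
        charge profile rho(x) [C/m^3], constrained so that the field
        integrates to the applied voltage across the sample."""
        n = len(rho)
        dx = thickness / n
        eps = EPS0 * EPS_R
        # Gauss's law in 1D: dE/dx = rho/eps, integrated from one electrode
        e = np.cumsum(rho) * dx / eps
        # shift so that the integral of E over the thickness equals V
        e += (v_applied - np.sum(e) * dx) / thickness
        return e

    # 100 um LDPE film at 10 kV with a negative charge packet in the bulk
    rho = np.zeros(100)
    rho[40:60] = -5.0   # illustrative packet density [C/m^3]
    e_field = field_from_charge(rho, thickness=100e-6, v_applied=10e3)
    ```

    With zero charge this reduces to the uniform applied field V/d (here 100 kV/mm); a trapped packet redistributes the field while leaving its integral, the applied voltage, unchanged, which is the distortion mechanism the abstract quantifies.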

  13. Clustering in the Phase Space of Dark Matter Haloes. I. Results from the Aquarius simulations

    CERN Document Server

    Zavala, Jesus

    2013-01-01

    We present a novel perspective on the clustering of dark matter in phase space by defining the particle phase space average density (P2SAD) as a two-dimensional extension of the two-point correlation function averaged within a certain volume in phase space. This statistic is a very sensitive measure of the cold small-scale (sub)structure of dark matter haloes. By analysing the structure of P2SAD in Milky-Way-size haloes using the high resolution Aquarius simulations, we find it to be nearly universal at small scales (i.e. small separations in phase space), in the regime dominated by gravitationally bound substructures. This remarkable universality occurs across time and in regions of substantially different ambient densities (by nearly four orders of magnitude), with typical variations in P2SAD of a factor of a few. The maximum variations occur in regions where resolved substructures have been strongly disrupted (e.g. near the halo centre). The universality is also preserved across haloes of similar mass but div...

  14. Computer graphics testbed to simulate and test vision systems for space applications

    Science.gov (United States)

    Cheatham, John B.; Wu, Chris K.; Lin, Y. H.

    1991-01-01

    A system was developed for displaying computer graphics images of space objects, and its use was demonstrated as a testbed for evaluating vision systems for space applications. In order to evaluate vision systems, it is desirable to be able to control all factors involved in creating the images used for processing by the vision system. Considerable time and expense are involved in building accurate physical models of space objects; precise location of the model relative to the viewer and accurate location of the light source also require additional effort. As part of this project, graphics models of space objects such as the Solarmax satellite are created for which the user can control the light direction and the relative position of the object and the viewer. The work is also aimed at providing control of hue, shading, noise and shadows for use in demonstrating and testing image processing techniques. The simulated camera data can provide XYZ coordinates, pitch, yaw, and roll for the models. A physical model is also being used to provide a comparison of camera images with the graphics images.

  16. Simulation and observation of driven beam oscillations with space charge in the CERN PS Booster

    CERN Document Server

    McAteer, M; Benedetto, E; Carli, C; Findlay, A; Mikulec, B; Tomás, R

    2014-01-01

    As part of the LHC Injector Upgrade project, the CERN PS Booster will be required to operate at nearly doubled intensity with little allowable increase in emittance growth or beam loss. A campaign of nonlinear optics measurements from turn-by-turn trajectory measurements, with the goal of characterizing and then compensating for higher-order resonances, is planned for after Long Shutdown 1. The trajectory measurement system is expected initially to require high intensity beam in order to have good position measurement resolution, so understanding space charge effects will be important for optics analysis. We present the results of simulations of driven beam oscillations with space charge effects, and comparison with trial beam trajectory measurements.

  17. Contamination Control Assessment of the World's Largest Space Environment Simulation Chamber

    Science.gov (United States)

    Snyder, Aaron; Henry, Michael W.; Grisnik, Stanley P.; Sinclair, Stephen M.

    2012-01-01

    The Space Power Facility's thermal vacuum test chamber is the largest chamber in the world capable of providing an environment for space simulation. To improve performance and meet the stringent requirements of a wide customer base, significant modifications were made to the vacuum chamber. These include major changes to the vacuum system and numerous enhancements to the chamber's unique polar crane, with a goal of providing high cleanliness levels. The significance of these changes and modifications is discussed in this paper. In addition, the composition and arrangement of the pumping system and its impact on molecular back-streaming are discussed in detail. Molecular contamination measurements obtained with a TQCM and witness wafers during two recent integrated system tests of the chamber are presented and discussed. Finally, concluding remarks are presented.

  18. James Webb Space Telescope Optical Simulation Testbed I: Overview and First Results

    CERN Document Server

    Perrin, Marshall D; Choquet, Élodie; N'Diaye, Mamadou; Levecq, Olivier; Lajoie, Charles-Phillipe; Ygouf, Marie; Leboulleux, Lucie; Egron, Sylvain; Anderson, Rachel; Long, Chris; Elliott, Erin; Hartig, George; Pueyo, Laurent; van der Marel, Roeland; Mountain, Matt

    2014-01-01

The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop workbench for studying aspects of wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing optomechanical testbeds for JWST (e.g., the Ball Aerospace Testbed Telescope, TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science & Operations Center. We have developed an optical design that reproduces the physics of JWST's three-mirror anastigmat using three aspheric lenses; it provides image quality similar to that of JWST (80% Strehl ratio) over a field equivalent to a NIRCam module, but at HeNe wavelength. A segmented deformable mirror stands in for the segmented primary mirror and allows control of the 18 segments in piston, tip, and tilt, while the secondary can be controlled in tip, tilt, and x, y, z position. This will be sufficient to model many commissioning activities, to investigate field depende...
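The per-segment piston/tip/tilt control described above can be illustrated with a minimal optical path difference (OPD) model: each segment contributes a plane offset about its own centre. This is a generic sketch under small-angle assumptions, not JOST's actual wavefront code, and all names are hypothetical.

```python
import numpy as np

def segment_opd(xy, center, piston, tip, tilt):
    """OPD contributed by one segment at points xy (shape (..., 2)).

    piston is in metres; tip and tilt are small angles in radians,
    applied about the segment centre. Illustrative model only.
    """
    dx = xy[..., 0] - center[0]
    dy = xy[..., 1] - center[1]
    return piston + tip * dx + tilt * dy

# Example: a segment pistoned by 50 nm, evaluated at its own centre.
opd = segment_opd(np.array([[0.0, 0.0]]), (0.0, 0.0), 50e-9, 0.0, 0.0)
```

Summing such contributions over all 18 segments (each with its own centre and commands) gives a first-order model of the segmented-primary wavefront that commissioning-style sensing algorithms can be exercised against.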

  19. Survival of Deinococcus geothermalis in Biofilms under Desiccation and Simulated Space and Martian Conditions

    Science.gov (United States)

    Frösler, Jan; Panitz, Corinna; Wingender, Jost; Flemming, Hans-Curt; Rettberg, Petra

    2017-05-01

Biofilm formation represents a successful survival strategy for bacteria. In biofilms, cells are embedded in a matrix of extracellular polymeric substances (EPS). As they are often more stress-tolerant than single cells, biofilm cells might survive the conditions present in space and on Mars. To investigate this, the bacterium Deinococcus geothermalis was chosen as a model organism owing to its tolerance of desiccation and radiation. Biofilms cultivated on membranes and, for comparison, planktonically grown cells deposited on membranes were air-dried and exposed to individual stressors including prolonged desiccation, extreme temperatures, vacuum, simulated martian atmosphere, and UV irradiation, as well as to combinations of stressors simulating space (desiccation + vacuum + UV) or martian (desiccation + Mars atmosphere + UV) conditions. The effect of sulfatic Mars regolith simulant on cell viability during stress was investigated separately. The EPS produced by the biofilm cells contained mainly polysaccharides and proteins. To detect viable but nonculturable (VBNC) cells, cultivation-independent viability indicators (membrane integrity, ATP, 16S rRNA) were determined in addition to colony counts. Desiccation for 2 months resulted in a decrease in culturability, with minor changes in membrane integrity in biofilm cells and a major loss of membrane integrity in planktonic bacteria. Temperatures between -25°C and +60°C, vacuum, and Mars atmosphere affected neither culturability nor membrane integrity in either phenotype. Monochromatic (254 nm; ≥1 kJ m⁻²) and polychromatic (200-400 nm; >5.5 MJ m⁻² for planktonic cells and >270 MJ m⁻² for biofilms) UV irradiation significantly reduced the culturability of D. geothermalis but did not affect cultivation-independent viability markers, indicating the induction of a VBNC state in UV-irradiated cells. In conclusion, a substantial proportion of the D. geothermalis population remained viable under
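Culturability reductions of the kind reported above are conventionally expressed as log10 reductions in colony counts. A minimal helper shows the arithmetic; the numbers in the example are purely illustrative and are not data from this study.

```python
import math

def log_reduction(n0_cfu, survivors_cfu):
    """Log10 reduction in culturable counts after a stress exposure."""
    return math.log10(n0_cfu / survivors_cfu)

# Illustrative only: 1e8 CFU before UV exposure, 1e5 CFU after,
# corresponding to a 3-log reduction in culturability.
lr = log_reduction(1e8, 1e5)
```

Note that, as the abstract emphasizes, colony counts alone can overstate lethality: cells in a VBNC state contribute to the log reduction while still scoring as viable by membrane-integrity, ATP, or rRNA indicators.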

  20. Evaluation of an Electrochromic Device for Variable Emittance in Simulated Space Conditions

    Science.gov (United States)

    Puterbaugh, Rebekah L.; Mychkovsky, Alexander G.; Ponnappan, Rengasamy; Kislov, Nikolai

    2005-02-01

Unprotected skin and external surfaces of a spacecraft in Earth orbit may experience temperature variations from -50°C to +100°C during exposure to cold space or the sun. As a result, thermal management of spacecraft becomes extremely important. One recent trend is to provide flexibility and control in the thermal design through variable emittance surfaces consisting of electrochromic (EC) coatings. For investigational purposes, a sample electrochromic device is evaluated for variable emittance under simulated space conditions. A vacuum chamber with a liquid-nitrogen-circulated blackbody shroud is employed to simulate space conditions. The 63.5 × 63.5 mm test sample, supplied by a small business research firm, is mounted on an aluminum plate heated by an electrical resistance heater. The sample is thermally insulated by a heat shield from all surroundings except the active front surface facing the shroud. The heat shield is uniformly maintained at the sample temperature using an independent circuit of resistance heaters and temperature controllers. A steady-state energy balance is applied to the test sample to determine the emittance as a function of temperature and of the DC bias voltage applied across the anode and cathode. Tests were performed to verify the switchability from high- to low-emittance states and vice versa. The difference between the high and low emittance values (Δɛ) obtained in the present calorimetric measurement is compared with data from FTIR measurements performed by the supplier of the EC sample. Results obtained in the present experiments compare closely with the supplier's data and demonstrate the effectiveness of the variable emittance sample under space conditions. The validity of the calorimetric experiment is confirmed by testing materials with known emittances, such as black paint and polished metals. Error analysis of the system predicts an emittance accuracy of ±5% at sample temperatures in the range of -50°C to 100°C.
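The steady-state energy balance used to extract emittance can be sketched as follows: if the heat shield eliminates all losses except front-surface radiation to the shroud, the heater power equals the net radiated power. The 77 K shroud temperature and the heater power in the example are assumptions for illustration, not values from the paper.

```python
def emittance(q_in_w, area_m2, t_sample_k, t_shroud_k=77.0):
    """Calorimetric emittance from a steady-state energy balance.

    q_in_w     : heater power absorbed by the sample (W)
    t_shroud_k : LN2 blackbody shroud temperature (assumed ~77 K)
    Assumes the only heat-loss path is front-surface radiation.
    """
    sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    return q_in_w / (sigma * area_m2 * (t_sample_k**4 - t_shroud_k**4))

# Illustrative: a 63.5 mm x 63.5 mm sample at 300 K radiating 0.12 W
# to the shroud corresponds to a low-emittance state.
eps = emittance(0.12, 0.0635**2, 300.0)
```

In the experiment the same balance is evaluated at each temperature and DC bias setpoint, so the measured heater power maps directly to ɛ(T, V).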