Sample records for final approach spacing

  1. The ASLOTS concept: An interactive, adaptive decision support concept for Final Approach Spacing of Aircraft (FASA). FAA-NASA Joint University Program

    Simpson, Robert W.


    This presentation outlines a concept for an adaptive, interactive decision support system to assist controllers at a busy airport in achieving efficient use of multiple runways. The concept is being implemented as a computer code called FASA (Final Approach Spacing for Aircraft), and will be tested and demonstrated in ATCSIM, a high fidelity simulation of terminal area airspace and airport surface operations. Objectives are: (1) to provide automated cues to assist controllers in the sequencing and spacing of landing and takeoff aircraft; (2) to provide the controller with a limited ability to modify the sequence and spacings between aircraft, and to insert takeoffs and missed approach aircraft in the landing flows; (3) to increase spacing accuracy using more complex and precise separation criteria while reducing controller workload; and (4) to achieve higher operational takeoff and landing rates on multiple runways in poor visibility.

  2. Space tug applications. Final report


    This article is the final report of the conceptual design efforts for a 'space tug'. It includes preliminary efforts, mission analysis, configuration analysis, impact analysis, and conclusions. Of the several concepts evaluated, the nuclear bimodal tug was one of the top candidates, with the two options being the NEBA-1 and NEBA-3 systems. Several potential tug benefits were identified during the mission analysis. The tug enables delivery of large (>3,500 kg) payloads to the outer planets, and it increases the GSO delivery capability by 20% relative to current systems. By providing end-of-life disposal, the tug can be used to extend the life of existing space assets. It can also be used to reboost satellites which were not delivered to their final orbit by the launch system. A specific mission model is the key to validating the tug concept. Once a mission model can be established, mission analysis can be used to determine more precise propellant quantities and burn times. In addition, the specific payloads can be evaluated for mass and volume capability with the launch systems. Results of the economic analysis will be dependent on the total years of operations and the number of missions in the mission model. The mission applications evaluated during this phase drove the need for large propellant quantities and thus did not allow the payloads to step down to smaller and less expensive launch systems.

  3. Space Solar Power Program. Final report

    Arif, Humayun; Barbosa, Hugo; Bardet, Christophe; Baroud, Michel; Behar, Alberto; Berrier, Keith; Berthe, Phillipe; Bertrand, Reinhold; Bibyk, Irene; Bisson, Joel; Bloch, Lawrence; Bobadilla, Gabriel; Bourque, Denis; Bush, Lawrence; Carandang, Romeo; Chiku, Takemi; Crosby, Norma; De Seixas, Manuel; De Vries, Joha; Doll, Susan; Dufour, Francois; Eckart, Peter; Fahey, Michael; Fenot, Frederic; Foeckersperger, Stefan; Fontaine, Jean-Emmanuel; Fowler, Robert; Frey, Harald; Fujio, Hironobu; Gasa, Jaume Munich; Gleave, Janet; Godoe, Jostein; Green, Iain; Haeberli, Roman; Hanada, Toshiya; Harris, Peter; Hucteau, Mario; Jacobs, Didier Fernand; Johnson, Richard; Kanno, Yoshitsugu; Koenig, Eva Maria; Kojima, Kazuo; Kondepudi, Phani; Kottbauer, Christian; Kulper, Doede; Kulagin, Konstantin; Kumara, Pekka; Kurz, Rainer; Laaksonen, Jyrki; Lang, Andrew Neill; Lathan, Corinna; Le Fur, Thierry; Lewis, David; Louis, Alain; Mori, Takeshi; Morlanes, Juan; Murbach, Marcus; Nagatomo, Hideo; O' brien, Ivan; Paines, Justin; Palaszewski, Bryan; Palmnaes, Ulf; Paraschivolu, Marius; Pathare, Asmin; Perov, Egor; Persson, Jan; Pessoa-Lopes, Isabel; Pinto, Michel; Porro, Irene; Reichert, Michael; Ritt-Fischer, Monika; Roberts, Margaret; Robertson II, Lawrence; Rogers, Keith; Sasaki, Tetsuo; Scire, Francesca; Shibatou, Katsuya; Shirai, Tatsuya; Shiraishi, Atsushi; Soucaille, Jean-Francois; Spivack, Nova; St. Pierre, Dany; Suleman, Afzal; Sullivan, Thomas; Theelen, Bas Johan; Thonstad, Hallvard; Tsuji, Masatoshi; Uchiumi, Masaharu; Vidqvist, Jouni; Warrell, David; Watanabe, Takafumi; Willis, Richard; Wolf, Frank; Yamakawa, Hiroshi; Zhao, Hong


    Information pertaining to the Space Solar Power Program is presented on energy analysis; markets; overall development plan; organizational plan; environmental and safety issues; power systems; space transportation; space manufacturing, construction, operations; design examples; and finance.

  4. Space Sustainment: A New Approach for America in Space


    This second-place Schriever Essay winner, published in the November–December 2014 issue of Air & Space Power Journal, proposes a new approach for America in space: moving the international community toward promoting market incentives in international space law, which would open up the competitive space for new entrants at a time when U.S. space assets face a growing threat.

  5. HLW Tank Space Management, Final Report

    Sessions, J.


    The HLW Tank Space Management Team (SM Team) was chartered to select and recommend an HLW Tank Space Management Strategy (Strategy) for the HLW Management Division of Westinghouse Savannah River Co. (WSRC) until an alternative salt disposition process is operational. Because the alternative salt disposition process will not be available to remove soluble radionuclides in HLW until 2009, the selected Strategy must assure that HLW is safely received and stored at least until 2009 while sludge slurry continues to be supplied to the DWPF vitrification process.

  6. Small space object imaging : LDRD final report.

    Ackermann, Mark R.; Valley, Michael T.; Kearney, Sean Patrick


    We report the results of an LDRD effort to investigate new technologies for the identification of small-sized (mm to cm) debris in low-earth orbit. This small-yet-energetic debris presents a threat to the integrity of space assets worldwide and represents a significant security challenge to the international community. We present a nonexhaustive review of recent US and Russian efforts to meet the challenges of debris identification and removal and then provide a detailed description of joint US-Russian plans for sensitive, laser-based imaging of small debris at distances of hundreds of kilometers and relative velocities of several kilometers per second. Plans for the upcoming experimental testing of these imaging schemes are presented and a preliminary path toward system integration is identified.

  7. Constructive approaches to the space NPP designing

    Eremin, A.G.; Korobkov, L.S.; Matveev, A.V.; Trukhanov, Yu.L.; Pyshko, A.P.


    An example of designing a space NPP intended for the power supply of a telecommunication satellite is considered. It is shown that a design approach based on the introduction of a leading criterion and the division of the design problems into two independent groups (reactor with radiation shield, and equipment module) permits development of an optimal space NPP design.

  8. AI Techniques for Space: The APSI Approach

    Steel, R.; Niézette, M.; Cesta, A.; Verfaille, G.; Lavagna, M.; Donati, A.


    This paper will outline the framework and tools developed under the Advanced Planning and Scheduling Initiative (APSI) study performed by VEGA for the European Space Agency in collaboration with three academic institutions, ISTC-CNR, ONERA, and Politecnico di Milano. We will start by illustrating the background history to APSI and why it was needed, giving a brief summary of all the partners within the project and the roles they played within it. We will then take a closer look at what APSI actually consists of, showing the techniques that were used and detailing the framework that was developed within the scope of the project. We will follow this with an elaboration on the three demonstration test scenarios that have been developed as part of the project, illustrating the re-use and synergies between the three cases along the way. We will finally conclude with a summary of some pros and cons of the approach devised during the project and outline future directions to be further investigated and expanded on within the context of the work performed within the project.

  9. Space Processing Applications Rocket project, SPAR 1. Final report

    Reeves, F.; Chassay, R.


    The experiment objectives, design/operational concepts, and final results of each of nine scientific experiments conducted during the first Space Processing Applications Rocket (SPAR) flight are summarized. The individual SPAR experiments, covering a wide and varied range of scientific materials processing objectives, were entitled: solidification of Pb-Sb eutectic, feasibility of producing closed-cell metal foams, characterization of rocket vibration environment by measurement of mixing of two liquids, uniform dispersions of crystallization processing, direct observation of solidification as a function of gravity levels, casting thoria dispersion-strengthened interfaces, contained polycrystalline solidification, and preparation of a special alloy for manufacturing of magnetic hard superconductor under zero-g environment.

  10. Space Processing Applications rocket project SPAR III. Final report

    Reeves, F.


    This document presents the engineering report and science payload III test report and summarizes the experiment objectives, design/operational concepts, and final results of each of five scientific experiments conducted during the third Space Processing Applications Rocket (SPAR) flight flown by NASA in December 1976. The five individual SPAR experiments, covering a wide and varied range of scientific materials processing objectives, were entitled: Liquid Mixing, Interaction of Bubbles with Solidification Interfaces, Epitaxial Growth of Single Crystal Film, Containerless Processing of Beryllium, and Contact and Coalescence of Viscous Bodies.

  11. Brain Extracellular Space: The Final Frontier of Neuroscience.

    Nicholson, Charles; Hrabětová, Sabina


    Brain extracellular space is the narrow microenvironment that surrounds every cell of the central nervous system. It contains a solution that closely resembles cerebrospinal fluid with the addition of extracellular matrix molecules. The space provides a reservoir for ions essential to the electrical activity of neurons and forms an intercellular chemical communication channel. Attempts to reveal the size and structure of the extracellular space using electron microscopy have had limited success; however, a biophysical approach based on diffusion of selected probe molecules has proved useful. A point-source paradigm, realized in the real-time iontophoresis method using tetramethylammonium, as well as earlier radiotracer methods, have shown that the extracellular space occupies ∼20% of brain tissue and small molecules have an effective diffusion coefficient that is two-fifths that in a free solution. Monte Carlo modeling indicates that geometrical constraints, including dead-space microdomains, contribute to the hindrance to diffusion. Imaging the spread of macromolecules shows them increasingly hindered as a function of size and suggests that the gaps between cells are predominantly ∼40 nm with wider local expansions that may represent dead-spaces. Diffusion measurements also characterize interactions of ions and proteins with the chondroitin and heparan sulfate components of the extracellular matrix; however, the many roles of the matrix are only starting to become apparent. The existence and magnitude of bulk flow and the so-called glymphatic system are topics of current interest and controversy. The extracellular space is an exciting area for research that will be propelled by emerging technologies. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  12. Will space actually be the final frontier of humankind?

    Genta, Giancarlo; Rycroft, Michael


    Science fiction gave us the idea of space as the final frontier. Strongly supported by the pioneers of spaceflight, this was first questioned in the 1970s. The Apollo landings on the Moon did not lead to a permanent human presence on our satellite, the environment of even the most Earth-like planet (Mars) turned out to be more hostile, and the technical difficulties and the cost of spaceflight were worse than expected. So humankind seemed for ever to be bound to its own planet. These rather pessimistic views are re-examined here, in the light of recent technological advances, scientific discoveries and new perspectives. It is suggested that they result from a lack of vision. Thus the ‘final frontier’ myth is found still to hold, but with a much more stretched out timetable for future space programmes than was envisaged in the 1960s. The present generation can take its first faltering steps on the path towards a spacefaring civilization, but the outcome will depend on social, political and economic issues rather than technological and scientific ones.

  13. An innovative approach to space education

    Marton, Christine; Berinstain, Alain B.; Criswick, John


    At present, Canada does not have enough scientists to be competitive in the global economy, which is rapidly changing from a reliance on natural resources and industry to information and technology. Space is the final frontier and it is a multidisciplinary endeavor. It requires a knowledge of science and math, as well as non-science areas such as architecture and law. Thus, it can attract a large number of students with a diverse range of interests and career goals. An overview is presented of the space education program designed by Canadian Alumni of the International Space University (CAISU) to encourage students to pursue studies and careers in science and technology and to improve science literacy in Canada.

  14. Space: the final frontier in the learning of science?

    Milne, Catherine


    In Space, relations, and the learning of science, Wolff-Michael Roth and Pei-Ling Hsu use ethnomethodology to explore high school interns learning shopwork and shoptalk in a research lab that is located in a world class facility for water quality analysis. Using interaction analysis they identify how spaces, like a research laboratory, can be structured as smart spaces to create a workflow (learning flow) so that shoptalk and shopwork can projectively organize the actions of interns even in new and unfamiliar settings. Using these findings they explore implications for the design of curriculum and learning spaces more broadly. The Forum papers of Erica Blatt and Cassie Quigley complement this analysis. Blatt expands the discussion on space as an active component of learning with an examination of teaching settings, beyond laboratory spaces, as active participants of education. Quigley examines smart spaces as authentic learning spaces while acknowledging how internship experiences embody the empirical elements of authentic learning, including open-ended inquiry and empowerment. In this paper I synthesize these ideas and propose that a narrative structure might better support workflow, student agency and democratic decision making.

  15. Approaches to radiation guidelines for space travel

    Fry, R.J.M.


    There are obvious risks in space travel that have loomed larger than any risk from radiation. Nevertheless, NASA has maintained a radiation program that has involved maintenance of records of radiation exposure, and planning so that the astronauts' exposures are kept as low as possible, and not just within the current guidelines. These guidelines are being reexamined currently by NCRP Committee 75 because new information is available, for example, risk estimates for radiation-induced cancer and about the effects of HZE particles. Furthermore, no estimates of risk or recommendations were made for women in 1970, and these must now be considered. The current career limit is 400 rem. The appropriateness of this limit and its basis are being examined as well as the limits for specific organs. There is now considerably more information about age-dependency for radiation effects and this will be taken into account. Work has been carried out on the so-called microlesions caused by HZE particles and on the relative carcinogenic effect of heavy ions, including iron. A remaining question is whether the fluence of HZE particles could reach levels of concern in missions under consideration. Finally, it is the intention of the committee to indicate clearly the areas requiring further research. 21 references, 1 figure, 7 tables

  16. Approaches to radiation guidelines for space travel

    Fry, R.J.M.


    There are obvious risks in space travel that have loomed larger than any risk from radiation. Nevertheless, NASA has maintained a radiation program that has involved maintenance of records of radiation exposure, and planning so that the astronauts' exposures are kept as low as possible, and not just within the current guidelines. These guidelines are being reexamined currently by NCRP Committee 75 because new information is available, for example, risk estimates for radiation-induced cancer and about the effects of HZE particles. The current career limit is 400 rem to the blood forming organs. The appropriateness of this limit and its basis are being examined as well as the limits for specific organs. There is now considerably more information about age-dependency for radiation effects and this will be taken into account. In 1973 a committee of the National Research Council made a separate study of HZE particle effects and it was concluded that the attendant risks did not pose a hazard for low inclination near-earth orbit missions. Since that time work has been carried out on the so-called microlesions caused by HZE particles and on the relative carcinogenic effect of heavy ions, including iron. A remaining question is whether the fluence of HZE particles could reach levels of concern in missions under consideration. Finally, it is the intention of the committee to indicate clearly the areas requiring further research. 26 references, 1 figure, 7 tables

  17. 2009 ESMD Space Grant Faculty Project Final Report

    Murphy, Gloria; Ghanashyam, Joshi; Guo, Jiang; Conrad, James; Bandyopadhyay, Alak; Cross, William


    The Constellation Program is the medium by which we will maintain a presence in low Earth orbit, return to the moon for further exploration and develop procedures for Mars exploration. The foundation for its presence and success is built by the many individuals that have given of their time, talent and even lives to help propel the mission and objectives of NASA. The Exploration Systems Mission Directorate (ESMD) Faculty Fellows Program is a direct contributor to the success of directorate and Constellation Program objectives. It is through programs such as the ESMD Space Grant program that students are inspired and challenged to achieve the technological heights that will propel us to meet the goals and objectives of ESMD and the Constellation Program. It is through ESMD Space Grant programs that future NASA scientists, engineers, and mathematicians begin to dream of taking America to newer heights of space exploration. The ESMD Space Grant program is to be commended for taking the initiative to develop and implement programs that help solidify the mission of NASA. With the concerted efforts of the Kennedy Space Center educational staff, the 2009 ESMD Space Grant Summer Faculty Fellows Program allowed faculty to become more involved with NASA personnel relating to exploration topics for the senior design projects. The 2009 Project was specifically directed towards NASA's Strategic Educational Outcome 1. In-situ placement of Faculty Fellows at the NASA field Centers was essential; this allowed personal interactions with NASA scientists and engineers. In particular, this was critical to better understanding the NASA problems and begin developing a senior design effort to solve the problems. The Faculty Fellows are pleased that the ESMD Space Grant program is taking interest in developing the Senior Design courses at the university level. These courses are needed to help develop the NASA engineers and scientists of the very near future. 

  18. Space Nuclear Thermal Propulsion Test Facilities Subpanel. Final report

    Allen, G.C.; Warren, J.W.; Martinell, J.; Clark, J.S.; Perkins, D.


    On 20 Jul. 1989, in commemoration of the 20th anniversary of the Apollo 11 lunar landing, President George Bush proclaimed his vision for manned space exploration. He stated, 'First for the coming decade, for the 1990's, Space Station Freedom, the next critical step in our space endeavors. And next, for the new century, back to the Moon. Back to the future. And this time, back to stay. And then, a journey into tomorrow, a journey to another planet, a manned mission to Mars.' On 2 Nov. 1989, the President approved a national space policy reaffirming the long range goal of the civil space program: to 'expand human presence and activity beyond Earth orbit into the solar system.' And on 11 May 1990, he specified the goal of landing Astronauts on Mars by 2019, the 50th anniversary of man's first steps on the Moon. To safely and even permanently venture beyond the near-Earth environment as charged by the President, mankind must bring to bear extensive new technologies. These include heavy lift launch capability from Earth to low-Earth orbit, automated space rendezvous and docking of large masses, zero gravity countermeasures, and closed loop life support systems. One technology enhancing, and perhaps enabling, the piloted Mars missions is nuclear propulsion, with great benefits over chemical propulsion. Asserting the potential benefits of nuclear propulsion, NASA has sponsored workshops in Nuclear Electric Propulsion and Nuclear Thermal Propulsion and has initiated a tri-agency planning process to ensure that appropriate resources are engaged to meet this exciting technical challenge. At the core of this planning process, NASA, DOE, and DOD established six Nuclear Propulsion Technical Panels in 1991 to provide groundwork for a possible tri-agency Nuclear Propulsion Program and to address the President's vision by advocating an aggressive program in nuclear propulsion. To this end the Nuclear Electric Propulsion Technology Panel has focused its energies.

  19. Urban green spaces assessment approach to health, safety and environment

    B. Akbari Neisiani


    The city is alive with dynamic systems, in which parks and urban green spaces have high strategic importance, helping to improve living conditions. Urban parks serve as visual landscapes with many benefits, such as reducing stress, reducing air pollution and producing oxygen, creating opportunities for people to participate in physical activities, providing an optimal environment for children, and decreasing noise pollution. Parks are important to such an extent that they are discussed as an indicator of urban development. Accordingly, the design and maintenance of urban green spaces requires an integrated management system based on international standards of health, safety and the environment. In this study, Nezami Ganjavi Park (District 6 of Tehran) was analyzed with an integrated management systems approach. In order to identify the status of the park with respect to the requirements of the management system, a checklist was prepared, based on previous studies and Tehran Municipality's considerations, and completed through a park survey and interviews with green space experts. The results showed that the utility of the health indicators was 92.33% (the highest), while the environmental and safety indicators were 72% and 84%, respectively. According to a SWOT analysis of Nezami Ganjavi Park, strengths include fire extinguishers, first aid boxes, and annual testing of drinking water; an important weakness is the use of unseparated trash bins; among the opportunities, there are attractions for children and parents to spend free time; and the most important threat is park facilities unsuitable for the disabled.

  20. Solid state neutron dosimeter for space applications. Final Report

    Entine, G.; Nagargar, V.; Sharif, D.


    Personnel engaged in space flight are exposed to a significant flux of high energy neutrons arising from both primary and secondary sources of ionizing radiation. Presently, there exists no compact neutron sensor capable of being integrated into a flight instrument to provide real-time measurement of this radiation flux. A proposal was made to construct such an instrument using a special PIN silicon diode which has the property of being insensitive to the other forms of ionizing radiation. Studies were performed to determine the design and construction of a better reading system to allow the PIN diode to be read with high precision. The physics of the device was studied, especially with respect to those factors which affect the sensitivity and reproducibility of the neutron response. This information was then used to develop methods to achieve high sensitivity at low neutron doses. The feasibility was shown of enhancing the PIN diode sensitivity to make possible the measurement of the low doses of neutrons encountered in space flights. The new PIN diode will make possible the development of a very compact, accurate, personal neutron dosimeter.

  1. Final report of the SPS space transportation workshop


    After a brief description of space power system concepts and the current status of the SPS program, issues relevant to earth-surface-to-low-earth-orbit (ESLEO) and orbit-to-orbit transport are discussed. For ESLEO, vehicle concepts include shuttle transportation systems, heavy lift launch vehicles, and single-stage-to-orbit vehicles. Orbit transfer vehicle missions include transport of cargo and the SPS module from low earth orbit to geosynchronous earth orbit as well as personnel transport. Vehicles discussed for such missions include chemical rocket orbital transfer vehicles, and electric orbital transfer vehicles. Further discussions include SPS station-keeping and attitude control, intra-orbit transport, and advanced propulsion and vehicle concepts.

  2. Stirling Space Engine Program. Volume 1; Final Report

    Dhar, Manmohan


    The objective of this program was to develop the technology necessary for operating Stirling power converters in a space environment and to demonstrate this technology in full-scale engine tests. Hardware development focused on the Component Test Power Converter (CTPC), a single cylinder, 12.5-kWe engine. Design parameters for the CTPC were 150 bar operating pressure, 70 Hz frequency, and hot- and cold-end temperatures of 1050 K and 525 K, respectively. The CTPC was also designed for integration with an annular sodium heat pipe at the hot end, which incorporated a unique "Starfish" heater head that eliminated highly stressed brazed or weld joints exposed to liquid metal and used a shaped-tubed electrochemical milling process to achieve precise positional tolerances. Selection of materials that could withstand high operating temperatures with long life were another focus. Significant progress was made in the heater head (Udimet 700 and Inconel 718 and a sodium-filled heat pipe); the alternator (polyimide-coated wire with polyimide adhesive between turns and a polyimide-impregnated fiberglass overwrap and samarium cobalt magnets); and the hydrostatic gas bearings (carbon graphite and aluminum oxide for wear couple surfaces). Tests on the CTPC were performed in three phases: cold end testing (525 K), engine testing with slot radiant heaters, and integrated heat pipe engine system testing. Each test phase was successful, with the integrated engine system demonstrating a power level of 12.5 kWe and an overall efficiency of 22 percent in its maiden test. A 1500-hour endurance test was then successfully completed. These results indicate the significant achievements made by this program that demonstrate the viability of Stirling engine technology for space applications.

  3. Real Space Approach to CMB deboosting

    Yoho, Amanda; Starkman, Glenn D.; Pereira, Thiago S.


    The effect of our Galaxy's motion through the Cosmic Microwave Background rest frame, which aberrates and Doppler shifts incoming photons measured by current CMB experiments, has been shown to produce mode-mixing in the multipole space temperature coefficients. However, multipole space determinations are subject to many difficulties, and a real-space analysis can provide a straightforward alternative. In this work we describe a numerical method for removing Lorentz-boost effects from real-space temperature maps. We show that to deboost a map so that one can accurately extract the temperature power spectrum requires calculating the boost kernel at a finer pixelization than one might naively expect. In idealized cases that allow for easy comparison to analytic results, we have confirmed that there is indeed mode mixing among the spherical harmonic coefficients of the temperature. We find that using a boost kernel calculated at Nside=8192 leads to a 1% bias in the binned boosted power spectrum at l~2000, while ...

  4. Space Station overall management approach for operations

    Paules, G.


    An Operations Management Concept developed by NASA for its Space Station Program is discussed. The operational goals, themes, and design principles established during program development are summarized. The major operations functions are described, including: space systems operations, user support operations, prelaunch/postlanding operations, logistics support operations, market research, and cost/financial management. Strategic, tactical, and execution levels of operational decision-making are defined.

  5. Space: The Final Frontier-Research Relevant to Mars.

    Boice, John D


    A critically important gap in knowledge surrounds the health consequences of exposure to radiation received gradually over time. Much is known about the health effects of brief high-dose exposures, such as from the atomic bombings in Japan, but the concerns today focus on the frequent low-dose exposures received by members of the public, workers, and, as addressed in this paper, astronauts. Additional guidance is needed by the National Aeronautics and Space Administration (NASA) for planning long-term missions where the rate of radiation exposure is gradual over years and the cumulative amounts high. The direct study of low doses and low-dose rates is of immeasurable value in understanding the possible range of health effects from gradual exposures and in providing guidance for radiation protection, not only of workers and the public but also of astronauts. The ongoing Million Person Study (MPS) is 10 times larger than the study of the 86,000 Japanese atomic bomb survivors with estimated doses. The number of workers with >100 mSv career dose is substantially greater. The large study size, broad range of doses, and long follow-up indicate substantial statistical ability to quantify the risk of exposures that are received gradually over time. The study consists of 360,000 U.S. Department of Energy workers from the Manhattan Project; 150,000 nuclear utility workers from the inception of the nuclear age; 115,000 atomic veterans who participated in above-ground atmospheric tests at the Nevada Test Site and the Bikini and Enewetak Atolls and Johnston Island in the Pacific Proving Grounds (PPG); 250,000 radiologists and medical workers; and 130,000 industrial radiographers. NASA uses an individual risk-based system for radiation protection in contrast to the system of dose limits for occupational exposures used by terrestrial-based organizations. The permissible career exposure limit set by NASA for each astronaut is a 3% risk of exposure-induced death (REID).

  6. A Psychosocial Approach to Understanding Underground Spaces

    Eun H. Lee


    With a growing need for usable land in urban areas, subterranean development has been gaining attention. While construction of large underground complexes is not a new concept, our understanding of the various socio-cultural aspects of staying underground is still at an early stage. With the projected emergence of underground built environments, future populations may spend much more of their working, transit, and recreational time in underground spaces. It is therefore essential to understand the challenges and advantages of such environments in order to improve the future welfare of their users. The current paper discusses various psycho-social aspects of underground spaces, the impact they can have on the culture shared among the occupants, and possible solutions to overcome some of these challenges.

  7. Phase space approach to quantum dynamics

    Leboeuf, P.


    The Schroedinger equation for the time propagation of states of a quantised two-dimensional spherical phase space is replaced by the dynamics of a system of N particles lying in phase space. This is done through factorization formulae of analytic function theory arising in the coherent-state representation, the 'particles' being the zeros of the quantum state. For linear Hamiltonians, like a spin in a uniform magnetic field, the motion of the particles is classical. However, non-linear terms induce interactions between the particles. Their time propagation is studied and it is shown that, in contrast to integrable systems, for chaotic maps they tend to fill the whole phase space, as their classical counterparts do. (author) 13 refs., 3 figs
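The zeros-as-particles picture can be sketched numerically: in the spin (Bargmann) coherent-state representation, a spin-j state corresponds to a polynomial of degree 2j in the coherent-state label z, and its 2j roots are the phase-space "particles". A minimal sketch for a generic random state (the function name and normalization are illustrative, not from the paper):

```python
import numpy as np
from math import comb

def stellar_zeros(c):
    """Zeros of the Bargmann polynomial of a spin-j state.

    c: complex amplitudes c_m for m = 0..2j (length 2j + 1).  The overlap
    with a coherent state |z> is proportional to
    sum_m c_m * sqrt(C(2j, m)) * z**m, a degree-2j polynomial whose 2j
    roots are the 'particles' that move in phase space.
    """
    n = len(c) - 1                                  # n = 2j
    coeffs = [c[m] * np.sqrt(comb(n, m)) for m in range(n + 1)]
    return np.roots(coeffs[::-1])                   # numpy wants highest degree first

# Example: a random spin-3 state (2j = 6) has six zeros.
rng = np.random.default_rng(0)
c = rng.normal(size=7) + 1j * rng.normal(size=7)
zeros = stellar_zeros(c)
print(len(zeros))  # 6
```

For a linear Hamiltonian each zero would follow a classical trajectory; the non-linear terms discussed in the abstract couple the motion of the zeros.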

  8. Final Environmental Assessment for the California Space Center at Vandenberg Air Force Base, California


    Only fragments of the assessment text survive extraction. They are drawn from Chapter 3 (Affected Environment) of the Final Environmental Assessment for the California Space Center, Vandenberg Air Force Base, and mention rooted, mesophylic plant species, the root and debris zone, and site hazards including protruding objects, slippery soils or mud, and biological hazards such as vegetation (e.g., poison oak and stinging nettle) and animals (e.g., insects).

  9. Operator space approach to steering inequality

    Yin, Zhi; Marciniak, Marcin; Horodecki, Michał


    In Junge and Palazuelos (2011 Commun. Math. Phys. 306 695–746) and Junge et al (2010 Commun. Math. Phys. 300 715–39) operator space theory was applied to study bipartite Bell inequalities. The aim of this paper is to follow that line of research and use operator space techniques to analyze the steering scenario. We obtain a bipartite steering functional with unbounded largest violation of the steering inequality, and we construct all ingredients explicitly. It turns out that the unbounded largest violation is attained by a non-maximally entangled state. Moreover, we focus on the bipartite dichotomic case, where we construct a steering functional with unbounded largest violation of the steering inequality. This phenomenon differs from the Bell scenario, where only a bounded largest violation can be obtained by any bipartite dichotomic Bell functional. (paper)

  10. A vector space approach to geometry

    Hausner, Melvin


    The effects of geometry and linear algebra on each other receive close attention in this examination of geometry's correlation with other branches of math and science. In-depth discussions include a review of systematic geometric motivations in vector space theory and matrix theory; the use of the center of mass in geometry, with an introduction to barycentric coordinates; axiomatic development of determinants in a chapter dealing with area and volume; and a careful consideration of the particle problem. 1965 edition.

  11. Stochastic inflation: Quantum phase-space approach

    Habib, S.


    In this paper a quantum-mechanical phase-space picture is constructed for coarse-grained free quantum fields in an inflationary universe. The appropriate stochastic quantum Liouville equation is derived. Explicit solutions for the phase-space quantum distribution function are found for the cases of power-law and exponential expansions. The expectation values of dynamical variables with respect to these solutions are compared to the corresponding cutoff regularized field-theoretic results (we do not restrict ourselves only to ⟨Φ²⟩). Fair agreement is found provided the coarse-graining scale is kept within certain limits. By focusing on the full phase-space distribution function rather than a reduced distribution it is shown that the thermodynamic interpretation of the stochastic formalism faces several difficulties (e.g., there is no fluctuation-dissipation theorem). The coarse graining does not guarantee an automatic classical limit as quantum correlations turn out to be crucial in order to get results consistent with standard quantum field theory. Therefore, the method does not by itself constitute an explanation of the quantum to classical transition in the early Universe. In particular, we argue that the stochastic equations do not lead to decoherence
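For orientation, the stochastic formalism examined here is often summarized by a Starobinsky-type Langevin equation for the coarse-grained field — a drastic simplification of the paper's full phase-space treatment, shown with purely illustrative parameter values:

```python
import numpy as np

# Langevin equation for the coarse-grained field phi in e-fold time N:
#   dphi/dN = -m^2 phi / (3 H^2) + (H / 2 pi) xi(N)
# Illustrative parameters, not tied to any specific inflation model:
H, m2 = 1.0, 0.3
gamma = m2 / (3 * H**2)          # drift rate per e-fold
sigma = H / (2 * np.pi)          # noise amplitude per sqrt(e-fold)

rng = np.random.default_rng(42)
n_paths, n_efolds, dN = 20_000, 100.0, 0.05
steps = int(n_efolds / dN)

# Euler-Maruyama integration of an ensemble of trajectories:
phi = np.zeros(n_paths)
for _ in range(steps):
    phi += -gamma * phi * dN + sigma * np.sqrt(dN) * rng.normal(size=n_paths)

# Stationary variance of this Ornstein-Uhlenbeck process: 3 H^4 / (8 pi^2 m^2)
var_eq = 3 * H**4 / (8 * np.pi**2 * m2)
print(phi.var(), var_eq)   # the two should roughly agree after relaxation
```

Matching the free-field equilibrium variance is only the consistency check; the paper's point is precisely that such a reduced stochastic description discards the quantum correlations needed for a genuine classical limit.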

  12. Innovative approaches to inertial confinement fusion reactors: Final report

    Bourque, R.F.; Schultz, K.R.


    Three areas of innovative approaches to inertial confinement fusion (ICF) reactor design are described. First, issues pertaining to the Cascade reactor concept are discussed. Then, several innovative concepts are presented which attempt to directly recover the blast energy from a fusion target. Finally, the Turbostar concept for direct recovery of that energy is evaluated. The Cascade issues discussed are combustion of the carbon granules in the event of air ingress, the use of alternate granule materials, and the effect of changes in carbon flow on details of the heat exchanger. Carbon combustion turns out to be a minor problem. Four ICF innovative concepts were considered: a turbine with ablating surfaces, a liquid piston system, a wave generator, and a resonating pump. In the final analysis, none shows any real promise. The Turbostar concept of direct recovery is a very interesting idea and appears technically viable. However, it shows no efficiency gain or any decrease in capital cost compared to reactors with conventional thermal conversion systems. Attempts to improve it by placing a close-in lithium sphere around the target to increase gas generation increased efficiency only slightly. It is concluded that these direct conversion techniques require thermalization of the x-ray and debris energy and are Carnot limited. They therefore offer no advantage over existing and proposed methods of thermal energy conversion or direct electrical conversion

  13. Autotracking from space - The TDRSS approach

    Spearing, R. E.; Harper, W. R.

    The TDRSS will provide telecommunications support to near-earth orbiting satellites through the 1980s and into the 1990s. The system incorporates two operational satellites at geostationary altitude and a single ground station at White Sands, NM. Of the many tasks facing the engineering team in development of this system, one of the most challenging was K-band autotrack. An approach not previously attempted placed the error detection, processing, and feedback elements for automatic control of the TDR satellite antennas on the ground. This approach offered several advantages to the designers but posed a number of interesting questions during the development program. The autotrack system design and its test program are described with emphasis given to areas of special interest in developing a working K-band service.

  14. Application of Bayesian approach to estimate average level spacing

    Huang Zhongfu; Zhao Zhixiang


    A method is given to estimate the average level spacing from a set of resolved resonance parameters using a Bayesian approach. Using the information contained in the distributions of both level spacings and neutron widths, levels missing from the measured sample can be corrected for more precisely, so that a better estimate of the average level spacing can be obtained. The calculation has been carried out for s-wave resonances and compared with other work
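The missed-level correction can be illustrated with a toy simulation: weak resonances whose Porter-Thomas-distributed neutron widths fall below a detection threshold are lost, biasing the naive spacing estimate upward. The moment-style correction below is a simplification of the full Bayesian treatment, with illustrative units, and uses a Poisson ladder rather than a Wigner spacing distribution for brevity:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
D_true, gamma_mean, thresh = 10.0, 1.0, 0.2     # illustrative values (e.g. eV)

# Simulate a ladder of resonances with Porter-Thomas neutron widths.
n = 5000
energies = np.cumsum(rng.exponential(D_true, n))  # Poisson ladder (sketch)
widths = gamma_mean * rng.chisquare(1, n)         # Porter-Thomas: chi^2, 1 dof

observed = energies[widths > thresh]              # weak levels are missed
D_naive = np.mean(np.diff(observed))              # biased high

# Fraction of levels expected above threshold under Porter-Thomas:
#   P(Gamma > t) = erfc( sqrt( t / (2 * <Gamma>) ) )
f = erfc(sqrt(thresh / (2.0 * gamma_mean)))
D_corrected = D_naive * f
print(D_naive, D_corrected)   # corrected value ≈ D_true
```

The real analysis infers the mean width and spacing jointly from the observed sample instead of assuming them known, which is where the Bayesian machinery enters.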

  15. The +vbar breakout during approach to Space Station Freedom

    Dunham, Scott D.


    A set of burn profiles was developed to provide bounding jet firing histories for a +vbar breakout during approaches to Space Station Freedom. The delta-v sequences were designed to place the Orbiter on a safe trajectory under worst case conditions and to try to minimize plume impingement on Space Station Freedom structure.

  16. Approach to developing reliable space reactor power systems

    Mondt, J.F.; Shinbrot, C.H.


    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top down systems approach which includes a point design based on a detailed technical specification of a 100 kW power system

  17. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.


    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  18. Exploring the triplet parameters space to optimise the final focus of the FCC-hh

    Abelleira, Jose; Seryi, Andrei; Cruz Alaniz, Emilia


    One of the main challenges when designing final focus systems of particle accelerators is maximising the beam stay-clear in the strong quadrupole magnets of the inner triplet. Moreover, it is desirable to keep the quadrupoles in the triplet as short as possible, for space and cost reasons but also to reduce chromaticity and simplify correction schemes. An algorithm that explores the triplet parameter space to optimise both these aspects was written. It uses thin lenses as a first approximation and MADX for more precise calculations. In cooperation with radiation studies, this algorithm was then applied to design an alternative triplet for the final focus of the Future Circular Collider (FCC-hh).
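A thin-lens first pass of the kind described can be sketched as follows. The numbers (β*, distances, scan range) are purely illustrative and not FCC-hh parameters, and only a single quadrupole in one plane is scanned, whereas the real algorithm must balance both transverse planes across the full triplet:

```python
import numpy as np

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole of focal length f (f < 0 would defocus)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def propagate_beta(M, beta0, alpha0=0.0):
    """Twiss beta after transfer matrix M, starting from (beta0, alpha0)."""
    gamma0 = (1.0 + alpha0**2) / beta0
    (a, b), (c, d) = M
    return a * a * beta0 - 2.0 * a * b * alpha0 + b * b * gamma0

# Illustrative numbers: beta* at the IP, IP-to-first-quad distance, quad gap.
beta_star, l_star, gap = 0.3, 40.0, 10.0   # m

# Scan the focal length of the first (thin) quad and pick the value that
# minimizes beta at the second quad location in the focusing plane.  The
# beam size there is sigma = sqrt(beta * emittance), so smaller beta means
# larger beam stay-clear for a fixed aperture.
scan = np.linspace(5.0, 60.0, 500)
betas = [propagate_beta(drift(gap) @ thin_quad(f) @ drift(l_star), beta_star)
         for f in scan]
f_best = scan[int(np.argmin(betas))]
print(f_best)   # close to 8 m for these numbers
```

A full optimization would add the remaining triplet quads, apertures, and both planes, then hand the thin-lens optimum to MADX for refinement, as the abstract describes.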

  19. Novel Approaches to Cellular Transplantation from the US Space Program

    Pellis, Neal R.; Homick, Jerry L. (Technical Monitor)


    Research in the treatment of type I diabetes is entering a new era that takes advantage of our knowledge in an ever-increasing variety of scientific disciplines. Some contributions may originate from very diverse sources, one of which is the Space Program at the National Aeronautics and Space Administration (NASA). The Space Program contributes to diabetes-related research in several treatment modalities. As part of the ongoing effort in medical monitoring of personnel involved in space exploration activities, NASA and the extramural scientific community investigate strategies for noninvasive estimation of blood glucose levels. Part of the effort in the space protein crystal growth program is high-resolution structural analysis of insulin as a means to better understand its interaction with its receptor and with host immune components, and as a basis for rational design of a "better" insulin molecule. The Space Program is also developing laser technology for potential early cataract detection, as well as noninvasive analyses for addressing preclinical diabetic retinopathy. Finally, NASA developed an exciting cell culture system that affords some unique advantages in the propagation and maintenance of mammalian cells in vitro. The cell culture system was originally designed to maintain cell suspensions with a minimum of hydrodynamic and mechanical shear while awaiting launch into microgravity. Currently the commercially available NASA bioreactor (Synthecon, Inc., Houston, TX) is used as a research tool in basic and applied cell biology. In recent years there has been continued strong interest in cellular transplantation as a treatment for type I diabetes. The advantages are the potential for successful long-term amelioration and a minimum risk of morbidity in the event of rejection of the transplanted cells. The pathway to successful application of this strategy is accompanied by several substantial hurdles: (1) isolation and propagation of a suitable uniform donor cell population; (2) management of

  20. A Web Based Approach to Integrate Space Culture and Education

    Gerla, F.


    , who can use it to prepare their lessons, retrieve information, and organize the didactic material in order to support their lessons. We think it important to use a user-centered "psychology" based on UM: we have to know the needs and expectations of the students. Our intent is to use usability tests not just to prove the site's effectiveness and clearness, but also to investigate the aesthetic preferences of children and young people. Physics, mathematics, and chemistry are just some of the difficult learning fields connected with space technologies. Space culture is a potentially never-ending field, and our scope will be to lead students by the hand through this universe of knowledge. This paper will present MARS activities in the framework of the above methodologies, aimed at implementing a web-based approach to integrate space culture and education. The activities are already in progress and some results will be presented in the final paper.

  1. Toward a global space exploration program: A stepping stone approach

    Ehrenfreund, Pascale; McKay, Chris; Rummel, John D.; Foing, Bernard H.; Neal, Clive R.; Masson-Zwaan, Tanja; Ansdell, Megan; Peter, Nicolas; Zarnecki, John; Mackwell, Steve; Perino, Maria Antionetta; Billings, Linda; Mankins, John; Race, Margaret


    In response to the growing importance of space exploration in future planning, the Committee on Space Research (COSPAR) Panel on Exploration (PEX) was chartered to provide independent scientific advice to support the development of exploration programs and to safeguard the potential scientific assets of solar system objects. In this report, PEX elaborates a stepwise approach to achieve a new level of space cooperation that can help develop world-wide capabilities in space science and exploration and support a transition that will lead to a global space exploration program. The proposed stepping stones are intended to transcend cross-cultural barriers, leading to the development of technical interfaces and shared legal frameworks and fostering coordination and cooperation on a broad front. Input for this report was drawn from expertise provided by COSPAR Associates within the international community and via the contacts they maintain in various scientific entities. The report provides a summary and synthesis of science roadmaps and recommendations for planetary exploration produced by many national and international working groups, aiming to encourage and exploit synergies among similar programs. While science and technology represent the core and, often, the drivers for space exploration, several other disciplines and their stakeholders (Earth science, space law, and others) should be more robustly interlinked and involved than they have been to date. The report argues that a shared vision is crucial to this linkage, and to providing a direction that enables new countries and stakeholders to join and engage in the overall space exploration effort. Building a basic space technology capacity within a wider range of countries, ensuring new actors in space act responsibly, and increasing public awareness and engagement are concrete steps that can provide a broader interest in space exploration, worldwide, and build a solid basis for program sustainability. 
By engaging

  2. a Web Service Approach for Linking Sensors and Cellular Spaces

    Isikdag, U.


    More and more devices are starting to be connected to the Internet. In the future the Internet will not only be a communication medium for people; it will in fact be a communication environment for devices. The connected devices, also referred to as Things, will have the ability to interact with other devices over the Internet: i) providing information in interoperable form and ii) consuming/utilizing such information with the help of sensors embedded in them. This overall concept is known as the Internet of Things (IoT). It requires new approaches to system architectures to be investigated for establishing relations between spaces and sensors. The research presented in this paper elaborates on an architecture developed with this aim, i.e. linking spaces and sensors using a RESTful approach. The objective is making spaces aware of (sensor-embedded) devices, and making devices aware of spaces, in a loosely coupled way (i.e. a state/usage/function change in the spaces would not have an effect on sensors; similarly, a location/state/usage/function change in sensors would not have any effect on spaces). The proposed architecture also enables the automatic assignment of sensors to spaces depending on space geometry and sensor location.
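The geometry-based assignment step can be sketched with a point-in-polygon test over space footprints. The names and coordinates below are hypothetical; a deployed system would expose spaces and sensors as RESTful resources and store the resulting links:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point pt inside the polygon (list of (x, y))?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray's level
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

# Hypothetical space footprints (2D floor plans) and sensor locations:
spaces = {
    "room-101": [(0, 0), (5, 0), (5, 4), (0, 4)],
    "room-102": [(5, 0), (9, 0), (9, 4), (5, 4)],
}
sensors = {"temp-1": (2.0, 1.0), "co2-7": (7.5, 3.0)}

# Automatic assignment of each sensor to the space that contains it:
assignment = {
    sid: next((name for name, poly in spaces.items()
               if point_in_polygon(loc, poly)), None)
    for sid, loc in sensors.items()
}
print(assignment)  # {'temp-1': 'room-101', 'co2-7': 'room-102'}
```

Because the link is derived from geometry on demand, moving a sensor only changes the computed assignment, keeping spaces and sensors loosely coupled as the abstract requires.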

  3. An Open and Holistic Approach for Geo and Space Sciences

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Toshihiko, Iyemori; Yatagai, Akiyo; Koyama, Yukinobu; Murayama, Yasuhiro; King, Todd; Hughes, Steve; Fung, Shing; Galkin, Ivan; Hapgood, Mike; Belehaki, Anna


    references in preparation of the publishing process. In addition, references to well-documented earth and space science data are available via an increasing number of data publications. This approach serves both the institutional geo and space data centers, which increase their visibility and importance, and the scientists, who will find the right, already DOI-referenced data in the appropriate data journals. The Open Data and Open Archive approaches finally merge in the concept of Open Science. Open Science emphasizes an open sharing of knowledge of all kinds, based on transparent multi-disciplinary and cross-domain scientific work. But Open Science is not just an idea; it also stands for a variety of projects which follow the rules of Open Science, such as open methodology, open source, open data, open access, open peer review, and open educational resources. Open Science also demands a new culture of scientific collaboration based on social media and the use of shared cloud technology for data storage and computing. But we should not forget that the WWW is not a one-way road. The more data, methods, and software for science research become freely available on the Internet, the more chances are opened for a commercial or even destructive use of scientific data. Already now, the giant search engine providers, such as Google or Microsoft and others, are collecting, storing, and analyzing all data which is available on the net. The usage of Deep Learning for the detection of semantic coherence of data, e.g. for the creation of personalized on-time and on-location predictions using neural networks and artificial intelligence methods, should not be reserved for them but also used within Open Science for the creation of new scientific knowledge. Open Science does not mean just dumping our scientific data, information, and knowledge onto the Web. Far from it: we are still responsible for a sustainable handling of our data for the benefit of humankind. 
The usage of the

  4. Generalized Wigner functions in curved spaces: A new approach

    Kandrup, H.E.


    It is well known that, given a quantum field in Minkowski space, one can define Wigner functions f_W^N(x_1, p_1, ..., x_N, p_N) which (a) are convenient to analyze since, unlike the field itself, they are c-number quantities and (b) can be interpreted in a limited sense as ''quantum distribution functions.'' Recently, Winter and Calzetta, Habib and Hu have shown one way in which these flat-space Wigner functions can be generalized to a curved-space setting, deriving thereby approximate kinetic equations which make sense ''quasilocally'' for ''short-wavelength modes.'' This paper suggests a completely orthogonal approach for defining curved-space Wigner functions which generalizes instead an object such as the Fourier-transformed f_W^1(k,p), which is effectively a two-point function viewed in terms of the ''natural'' creation and annihilation operators a†(p - k/2) and a(p + k/2). The approach suggested here lacks the precise phase-space interpretation implicit in the approach of Winter or Calzetta, Habib, and Hu, but it is useful in that (a) it is geared to handle any ''natural'' mode decomposition, so that (b) it can facilitate exact calculations at least in certain limits, such as for a source-free linear field in a static spacetime
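For reference, the flat-space object being generalized is the standard Wigner function; in the one-particle case it takes the textbook form

```latex
f_W(x,p) \;=\; \frac{1}{2\pi\hbar}\int_{-\infty}^{\infty}\! dy\;
  e^{-ipy/\hbar}\,
  \left\langle x+\tfrac{y}{2}\right|\hat\rho\left|x-\tfrac{y}{2}\right\rangle ,
```

whose Fourier transform in x is the two-point object f_W^1(k,p), built from a†(p - k/2) and a(p + k/2), which is the form the paper carries over to curved space.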

  5. Deep-Inelastic Final States in a Space-Time Description of Shower Development and Hadronization

    Ellis, John; Geiger, Klaus; Kowalski, Henryk


    We extend a quantum kinetic approach to the description of hadronic showers in space, time and momentum space to deep-inelastic $ep$ collisions, with particular reference to experiments at HERA. We follow the history of hard scattering events back to the initial hadronic state and forward to the formation of colour-singlet pre-hadronic clusters and their decays into hadrons. The time evolution of the space-like initial-state shower and the time-like secondary partons are treated similarly, an...

  6. A simple coordinate space approach to three-body problems ...

    We show how to treat the dynamics of an asymmetric three-body system consisting of one heavy and two identical light particles in a simple coordinate space variational approach. The method is constructive and gives an efficient way of resolving a three-body system to an effective two-body system. It is illustrated by ...

  7. Learning Approaches - Final Report Sub-Project 4

    Dirckinck-Holmfeld, Lone; Rodríguez Illera, José Luis; Escofet, Anna


    The overall aim of Subproject 4 is to apply learning approaches that are appropriate and applicable using ICT. The task is made up of two components: 4.1, dealing with learning approaches (see deliverable 4.1), and 4.2, application of ICT (see deliverable 4.2, deliverable 4.3 & deliverable...

  8. Space-time uncertainty and approaches to D-brane field theory

    Yoneya, Tamiaki


    In connection with the space-time uncertainty principle which gives a simple qualitative characterization of non-local or non-commutative nature of short-distance space-time structure in string theory, the author's recent approaches toward field theories for D-branes are briefly outlined, putting emphasis on some key ideas lying in the background. The final section of the present report is devoted partially to a tribute to Yukawa on the occasion of the centennial of his birth. (author)

  9. A Proposal for the Common Safety Approach of Space Programs

    Grimard, Max


    For all applications, business, and systems related to space programs, quality is mandatory and is a key factor in technical as well as economic performance. Up to now, the differences between applications (launchers, manned spaceflight, science, telecommunications, Earth observation, planetary exploration, etc.) and the differences in technical culture and background of the leading countries (USA, Russia, Europe) have generally led to different approaches in terms of standards and processes for quality. At a time when international cooperation is quite usual for institutional programs and globalization is the key word for commercial business, it is considered of prime importance to aim at common standards and approaches for quality in space programs. For that reason, the International Academy of Astronautics has set up a Study Group whose mandate is to "Make recommendations to improve the Quality, Reliability, Efficiency, and Safety of space programmes, taking into account the overall environment in which they operate: economical constraints, harsh environments, space weather, long life, no maintenance, autonomy, international co-operation, norms and standards, certification." The paper will introduce the activities of this Study Group, describing a first list of topics which should be addressed. Through this paper it is expected to open the discussion to update/enlarge this list of topics and to call for contributors to this Study Group.

  10. Truncated conformal space approach to scaling Lee-Yang model

    Yurov, V.P.; Zamolodchikov, Al.B.


    A numerical approach to 2D relativistic field theories is suggested. Considering a field theory model as an ultraviolet conformal field theory perturbed by a suitable relevant scalar operator, one studies it in finite volume (on a circle). The perturbed Hamiltonian acts in the conformal field theory space of states, and its matrix elements can be extracted from the conformal field theory. Truncation of the space at a reasonable level results in a finite-dimensional problem for numerical analysis. The nonunitary field theory with the ultraviolet region controlled by the minimal conformal theory μ(2/5) is studied in detail. 9 refs.; 17 figs
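The truncation strategy — build the perturbed Hamiltonian as a matrix in a finite slice of the unperturbed space of states, then diagonalize numerically — can be illustrated in a much simpler setting than the conformal one: a quartic perturbation of the harmonic oscillator, diagonalized in a truncated oscillator basis (all parameters illustrative):

```python
import numpy as np

def ground_energy(n_max, lam):
    """Ground-state energy of H = p^2/2 + x^2/2 + lam * x^4,
    diagonalized in a harmonic-oscillator basis truncated at n_max states."""
    n = np.arange(n_max)
    x = np.zeros((n_max, n_max))
    off = np.sqrt((n[:-1] + 1) / 2.0)            # <n|x|n+1> = sqrt((n+1)/2)
    x[n[:-1], n[:-1] + 1] = off
    x[n[:-1] + 1, n[:-1]] = off
    h = np.diag(n + 0.5) + lam * np.linalg.matrix_power(x, 4)
    return np.linalg.eigvalsh(h)[0]

# The estimate converges rapidly as the truncation level is raised:
for n_max in (4, 8, 16, 32):
    print(n_max, ground_energy(n_max, lam=0.1))
```

The truncated conformal space approach works the same way, except that the unperturbed basis consists of conformal states on the circle and the perturbing matrix elements come from the conformal field theory itself.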

  11. Approach to an Affordable and Sustainable Space Transportation System

    McCleskey, Caey M.; Rhodes, R. E.; Robinson, J. W.; Henderson, E. M.


    This paper describes an approach and a general procedure for creating space transportation architectural concepts that are at once affordable and sustainable. Previous papers by the authors and other members of the Space Propulsion Synergy Team (SPST) focused on a functional system breakdown structure for an architecture and definition of high-payoff design techniques with a technology integration strategy. This paper follows up by using a structured process that derives architectural solutions focused on achieving life cycle affordability and sustainability. Further, the paper includes an example concept that integrates key design techniques discussed in previous papers.

  12. Requirements and approach for a space tourism launch system

    Penn, Jay P.; Lindley, Charles A.


    Market surveys suggest that a viable space tourism industry will require flight rates about two orders of magnitude higher than those required for conventional spacelift. Although enabling round-trip cost goals for a viable space tourism business are about $240/pound ($529/kg), or $72,000/passenger round-trip, goals should be about $50/pound ($110/kg), or approximately $15,000 for a typical passenger and baggage. The lower price will probably open space tourism to the general population. Vehicle reliabilities must approach those of commercial aircraft as closely as possible. This paper addresses the development of spaceplanes optimized for the ultra-high flight rate and high reliability demands of the space tourism mission. It addresses the fundamental operability, reliability, and cost drivers needed to satisfy this mission need. Figures of merit similar to those used to evaluate the economic viability of conventional commercial aircraft are developed, including items such as payload/vehicle dry weight, turnaround time, propellant cost per passenger, and insurance and depreciation costs, which show that infrastructure can be developed for a viable space tourism industry. A reference spaceplane design optimized for space tourism is described. Subsystem allocations for reliability, operability, and costs are made and a route to developing such a capability is discussed. The vehicle's ability to satisfy the traditional spacelift market is also shown.
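The quoted figures are mutually consistent under the assumption, made explicit here, of roughly 300 lb (about 136 kg) of passenger plus baggage per round trip; a quick check:

```python
LB_PER_KG = 2.2046   # pounds per kilogram

for cost_lb, cost_kg, per_pax in [(240, 529, 72_000), (50, 110, 15_000)]:
    # The $/lb and $/kg figures quoted in the abstract should agree...
    assert abs(cost_lb * LB_PER_KG - cost_kg) < 1
    # ...and the per-passenger figure implies ~300 lb of passenger + baggage.
    print(per_pax / cost_lb)   # 300.0 in both cases
```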

  13. Report on Approaches to Database Translation. Final Report.

    Gallagher, Leonard; Salazar, Sandra

    This report describes approaches to database translation (i.e., transferring data and data definitions from a source, either a database management system (DBMS) or a batch file, to a target DBMS), and recommends a method for representing the data structures of newly-proposed network and relational data models in a form suitable for database…

  14. Applied approach slab settlement research, design/construction : final report.


    Approach embankment settlement is a pervasive problem in Oklahoma and many other states. The bump and/or abrupt slope change poses a danger to traffic and can cause increased dynamic loads on the bridge. Frequent and costly maintenance may be needed ...

  15. Embodied Space: a Sensorial Approach to Spatial Experience

    Durão, Maria João


    A reflection is presented on the significance of the role of the body in the interpretation and future creation of spatial living structures. The paper draws on the body as cartography of sensorial meaning that includes vision, touch, smell, hearing, orientation and movement to discuss possible relationships with psychological and sociological parameters of 'sensorial space'. The complex dynamics of body-space is further explored from the standpoint of perceptual variables such as color, light, materialities, texture and their connections with design, technology, culture and symbology. Finally, the paper discusses the integration of knowledge and experimentation in the design of future habitats where body-sensitive frameworks encompass flexibility, communication, interaction and cognitive-driven solutions.

  16. Approach to transaction management for Space Station Freedom

    Easton, C. R.; Cressy, Phil; Ohnesorge, T. E.; Hector, Garland


    The Space Station Freedom Manned Base (SSFMB) will support the operation of the many payloads that may be located within the pressurized modules or on external attachment points. The transaction management (TM) approach presented provides a set of overlapping features that will assure the effective and safe operation of the SSFMB and provide a schedule that makes potentially hazardous operations safe, allocates resources within the capability of the resource providers, and maintains an environment conducive to the operations planned. This approach provides for targets of opportunity and schedule adjustments that give the operators the flexibility to conduct a vast majority of their operations with no conscious involvement with the TM function.

  17. State space approach to mixed boundary value problems.

    Chen, C. F.; Chen, M. M.


    A state-space procedure for the formulation and solution of mixed boundary value problems is established. This procedure is a natural extension of the method used in initial value problems; however, certain special theorems and rules must be developed. The scope of the applications of the approach includes beam, arch, and axisymmetric shell problems in structural analysis, boundary layer problems in fluid mechanics, and eigenvalue problems for deformable bodies. Many classical methods in these fields developed by Holzer, Prohl, Myklestad, Thomson, Love-Meissner, and others can be either simplified or unified under new light shed by the state-variable approach. A beam problem is included as an illustration.
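The extension of initial-value methods to mixed boundary value problems can be sketched with a shooting calculation for the beam case: a simply supported, uniformly loaded Euler-Bernoulli beam written as a first-order system in the state (deflection, slope, moment, shear). The numbers are illustrative, and because the problem is linear, two unit solutions replace iterative shooting:

```python
import numpy as np

EI, q, L = 1.0, 1.0, 1.0          # illustrative stiffness, load, span

def rhs(state, load):
    """State (y, theta, M, V) for an Euler-Bernoulli beam, y'''' = q / EI."""
    y, th, M, V = state
    return np.array([th, M / EI, V, load])

def integrate(state0, load, n=200):
    """RK4 from x = 0 to x = L; returns the full trajectory."""
    h = L / n
    s = np.array(state0, float)
    traj = [s]
    for _ in range(n):
        k1 = rhs(s, load)
        k2 = rhs(s + h / 2 * k1, load)
        k3 = rhs(s + h / 2 * k2, load)
        k4 = rhs(s + h * k3, load)
        s = s + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(s)
    return np.array(traj)

# Mixed boundary conditions: y(0) = M(0) = 0 are known, theta(0) and V(0)
# are unknown, and y(L) = M(L) = 0 must hold at the far end.  Linearity
# lets us solve for the unknowns from one particular and two unit runs.
end = lambda tr: tr[-1][[0, 2]]                   # (y(L), M(L))
b = end(integrate([0, 0, 0, 0], q))               # particular part
A = np.column_stack([end(integrate([0, 1, 0, 0], 0.0)),
                     end(integrate([0, 0, 0, 1], 0.0))])
theta0, V0 = np.linalg.solve(A, -b)

traj = integrate([0, theta0, 0, V0], q)
y_mid = traj[100][0]                              # midspan deflection
print(y_mid)   # ≈ 5*q*L**4/(384*EI) = 0.0130208...
```

A nonlinear problem would iterate this correction (e.g. with Newton's method on the end-condition residual), which is the general shooting form of the state-space procedure.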

  18. Approach to transaction management for Space Station Freedom

    Easton, C. R.; Cressy, Phil; Ohnesorge, T. E.; Hector, Garland


    An approach to managing the operations of the Space Station Freedom based on their external effects is described. It is assumed that there is a conflict-free schedule that, if followed, will allow only appropriate operations to occur. The problem is then reduced to that of ensuring that the operations initiated are within the limits allowed by the schedule, or that the external effects of such operations are within those allowed by the schedule. The main features of the currently adopted transaction management approach are discussed.

  19. General background and approach to multibody dynamics for space applications

    Santini, Paolo; Gasbarri, Paolo


    Multibody dynamics for space applications is dictated by the space environment: space-varying gravity forces, orbital and attitude perturbations, and control forces, if any. Several methods and formulations for modeling flexible bodies undergoing large overall motions have been developed in recent years. Most of these formulations address one of the main problems in the analysis of spacecraft dynamics, namely the reduction of computer simulation time. To this end, symbolic manipulation, recursive formulations and parallel processing algorithms have been proposed. All these approaches fall into two categories: those based on Newton-Euler methods and those based on Lagrangian methods. Both have their advantages and disadvantages, although in general Newtonian approaches lead to a better understanding of the physics of the problem, in particular of the magnitude of the reactions and of the corresponding structural stresses. Another important issue which must be addressed carefully in multibody space dynamics is the correct choice of kinematic variables. When dealing with a flexible multibody system, the resulting equations include two different types of state variables: those associated with large (rigid) displacements and those associated with elastic deformations. These two sets of variables generally have two different time scales: consider the attitude motion of a satellite, whose period of oscillation due to gravity-gradient effects is of the same order of magnitude as the orbital period, which in turn is much longer than the periods associated with the structural vibration of the satellite itself. The numerical integration of the equations of the system therefore represents a challenging problem.
This was the abstract, and some of the arguments, that Professor Paolo Santini intended to present for the Breakwell Lecture; sadly, a fatal illness struck him and soon took him
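
    The two-time-scale point can be made concrete with rough numbers (the orbit altitude and structural frequency below are assumed, purely for illustration): gravity-gradient libration evolves on the orbital time scale of roughly ninety minutes, while structural modes oscillate on the order of seconds, so thousands of vibration cycles elapse per orbit, which is precisely the stiffness that challenges the integrator.

```python
import math

# Rough time-scale comparison; the altitude and structural frequency are assumed.
mu = 3.986004418e14            # Earth's gravitational parameter, m^3/s^2
a = 6778e3                     # ~400 km circular orbit radius (assumed), m
T_orbit = 2 * math.pi * math.sqrt(a**3 / mu)   # orbital ~ libration time scale, s
f_struct = 1.0                 # assumed fundamental structural mode, Hz
cycles_per_orbit = T_orbit * f_struct          # vibration cycles per orbit
```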

  20. Conceptual design of jewellery: a space-based aesthetics approach

    Tzintzi Vaia


    Conceptual design is a field that offers various aesthetic approaches to the generation of nature-based product design concepts. Essentially, Conceptual Product Design (CPD) uses similarities based on geometrical forms and functionalities. Furthermore, the CAD-based freehand sketch is a primary conceptual tool in the early stages of the design process. The proposed CPD concept deals with jewellery inspired by space. Specifically, a number of galactic features, such as galaxy shapes, wormholes and graphical representations of planetary magnetic fields, are used as inspirations. These space-based design ideas at a conceptual level can lead to further opportunities for research and for the economic success of the jewellery industry. A number of illustrative case studies are presented.

  1. Implementing CDIO Approach in preparing engineers for Space Industry

    Daneykin Yury


    The necessity to train highly qualified specialists has led to the development of a trajectory for training specialists for the space industry. Several steps have been undertaken to reach this purpose. First, the University founded the Space Instrument Design Center, which promotes a wide range of initiatives in educating specialists, retraining specialists, carrying out research and collaborating with profiled enterprises. The University also introduced an Elite Engineering Education system to attract talented students and help them follow individual trajectories toward becoming unique specialists. The paper discusses the targets that must be achieved to train such specialists. Moreover, the paper presents the compliance of these efforts with the CDIO Approach, which is widely used in leading universities to improve engineering programs.

  2. Hybrid x-space: a new approach for MPI reconstruction.

    Tateo, A; Iurino, A; Settanni, G; Andrisani, A; Stifanelli, P F; Larizza, P; Mazzia, F; Mininni, R M; Tangaro, S; Bellotti, R


    Magnetic particle imaging (MPI) is a new medical imaging technique capable of recovering the distribution of superparamagnetic particles from their measured induced signals. In the literature there are two main MPI reconstruction techniques: measurement-based (MB) and x-space (XS). The MB method is expensive because it requires a long calibration procedure as well as a reconstruction phase that can be numerically costly. On the other hand, the XS method is simpler than MB, but exact knowledge of the field-free-point (FFP) motion is essential for its implementation. Our simulation work focuses on the implementation of a new approach for MPI reconstruction, called hybrid x-space (HXS), a combination of the previous methods. Specifically, our approach is based on XS reconstruction, which requires knowledge of the FFP position and velocity at each time instant. The difference with respect to the original XS formulation is how the FFP velocity is computed: we estimate it from the experimental measurements of the calibration scans, typical of the MB approach. Moreover, a compressive sensing technique is applied in order to reduce the calibration time by using fewer sampling positions. Simulations highlight that the HXS and XS methods give similar results. Furthermore, an appropriate use of compressive sensing is crucial for obtaining a good balance between time reduction and reconstructed image quality. Our proposal is suitable for open-geometry configurations of human-size devices, where incidental factors could make the currents, the fields and the FFP trajectory irregular.
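
    A toy 1-D sketch of the x-space step (velocity compensation followed by gridding at the FFP position) may help fix the idea; the trajectory, particle position and grid below are all assumed, and in the hybrid scheme the velocity would come from calibration scans rather than the analytic gradient used here:

```python
import numpy as np

# 1-D toy: drive the FFP sinusoidally, record a signal proportional to FFP
# speed times the particle response, then recover the image by velocity
# compensation and gridding at the FFP position (the core x-space step).
t = np.linspace(0.0, 1.0, 2000)
x_ffp = 0.05 * np.sin(2 * np.pi * 5 * t)     # assumed FFP trajectory (m)
v_ffp = np.gradient(x_ffp, t)                # in HXS this velocity would be
                                             # estimated from calibration scans
signal = v_ffp * np.exp(-(x_ffp - 0.01) ** 2 / (2 * 0.005**2))  # toy particle at 1 cm

native = signal / np.where(np.abs(v_ffp) > 1e-6, v_ffp, np.inf)  # velocity compensation
bins = np.linspace(-0.05, 0.05, 64)
idx = np.digitize(x_ffp, bins)
image = np.array([native[idx == i].mean() if np.any(idx == i) else 0.0
                  for i in range(1, bins.size)])
peak_x = 0.5 * (bins[np.argmax(image)] + bins[np.argmax(image) + 1])  # recovered position
```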

  3. An Intelligent Systems Approach to Reservoir Characterization. Final Report

    Shahab D. Mohaghegh; Jaime Toro; Thomas H. Wilson; Emre Artun; Alejandro Sanchez; Sandeep Pyakurel


    Today, the major challenge in reservoir characterization is integrating data coming from different sources at varying scales, in order to obtain an accurate and high-resolution reservoir model. The role of seismic data in this integration is often limited to providing a structural model for the reservoir. Its relatively low resolution usually limits its further use. However, its areal coverage and availability suggest that it has the potential of providing valuable data for more detailed reservoir characterization studies through the process of seismic inversion. In this paper, a novel intelligent seismic inversion methodology is presented to achieve a desirable correlation between relatively low-frequency seismic signals and the much higher frequency wireline-log data. The vertical seismic profile (VSP) is used as an intermediate step between the well logs and the surface seismic. A synthetic seismic model is developed by using real data and seismic interpretation. In the example presented here, the model represents the Atoka and Morrow formations, and the overlying Pennsylvanian sequence, of the Buffalo Valley Field in New Mexico. Generalized regression neural networks (GRNN) are used to build two independent correlation models between: (1) surface seismic and VSP; (2) VSP and well logs. After generating virtual VSPs from the surface seismic, well logs are predicted by using the correlation between VSP and well logs. The values of the density log, which is a surrogate for reservoir porosity, are predicted for each seismic trace along the seismic line with a classification approach having a correlation coefficient of 0.81. The same methodology is then applied to real data taken from the Buffalo Valley Field, to predict inter-well gamma ray and neutron porosity logs along the seismic line of interest. The same procedure can be applied to a complete 3D seismic block to obtain 3D distributions of reservoir properties with less uncertainty than geostatistical methods alone.
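
    A GRNN is essentially a Gaussian-kernel-weighted average of training targets (the Nadaraya-Watson estimator), which a few lines can sketch; the 1-D mapping below is a stand-in for the VSP-to-log correlation, not the study's data:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_test, sigma=0.05):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma**2))          # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)        # normalized summation layer

# toy 1-D mapping standing in for a VSP-to-log correlation (assumed data)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (200, 1))
y = np.sin(2.0 * np.pi * X[:, 0])
pred = grnn_predict(X, y, np.array([[0.25], [0.75]]))
```

    The single smoothing parameter sigma is the reason GRNNs train quickly, which makes them attractive for building the two correlation models the abstract describes.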

  4. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.


    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. 
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
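
    For the 2x2 contingency-table verification mentioned above, the standard scores can be computed directly; the counts below are illustrative, not MOSWOC results:

```python
# Hits (a), false alarms (b), misses (c) and correct rejections (d) for an
# event forecast such as "CME arrives within the chosen error window".
def skill_scores(a, b, c, d):
    pod = a / (a + c)                                        # probability of detection
    far = b / (a + b)                                        # false alarm ratio
    n = a + b + c + d
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n   # chance agreement
    hss = (a + d - expected) / (n - expected)                # Heidke skill score
    return pod, far, hss

pod, far, hss = skill_scores(a=18, b=7, c=4, d=21)           # illustrative counts only
```

    The Heidke score measures accuracy relative to random chance, which is what makes the comparison against climatology and persistence benchmarks meaningful.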

  5. Deep-inelastic final states in a space-time description of shower development and hadronization

    Ellis, J.


    We extend a quantum kinetic approach to the description of hadronic showers in space, time and momentum space to deep-inelastic ep collisions, with particular reference to experiments at HERA. We follow the history of hard scattering events back to the initial hadronic state and forward to the formation of colour-singlet pre-hadronic clusters and their decays into hadrons. The time evolution of the space-like initial-state shower and the time-like secondary partons are treated similarly, and cluster formation is treated using a spatial criterion motivated by confinement and a non-perturbative model for hadronization. We calculate the time evolution of particle distributions in rapidity, transverse and longitudinal space. We also compare the transverse hadronic energy flow and the distribution of observed hadronic masses with experimental data from HERA, finding encouraging results, and discuss the background to large-rapidity-gap events. The techniques developed in this paper may be applied in the future to more complicated processes such as eA, pp, pA and AA collisions. (orig.)

  6. Space and place concepts analysis based on semiology approach in residential architecture

    Mojtaba Parsaee


    Space and place are among the fundamental concepts in architecture, about which many discussions have been held owing to their complexity and importance. This research introduces an approach to better cognition of architectural concepts based on the theory and method of semiology in linguistics. Hence, the research first investigates the concepts of space and place and explains their characteristics in architecture. Then, it reviews semiological theory and explores its concepts and ideas. After obtaining the principles and method of semiology, they are redefined in an architectural system based on an adaptive method. Finally, the research offers a conceptual model, called the semiology approach, which considers the architectural system as a system of signs. The approach can be used to decode the content of meanings and forms and to analyze the architectural mechanism in order to obtain its meanings and concepts. Based on this approach, the residential architecture of the traditional city of Bushehr, Iran, was analyzed as a case study and its concepts were extracted. The results of this research demonstrate the effectiveness of this approach in structure detection and identification of an architectural system. Besides, this approach can be used in processes of sustainable development and can serve as a basis for the deconstruction of architectural texts. The research methods of this study are qualitative, based on comparative and descriptive analyses.

  7. A Mellin space approach to the conformal bootstrap

    Gopakumar, Rajesh [International Centre for Theoretical Sciences (ICTS-TIFR), Survey No. 151, Shivakote, Hesaraghatta Hobli, Bangalore North 560 089 (India); Kaviraj, Apratim [Centre for High Energy Physics, Indian Institute of Science, C.V. Raman Avenue, Bangalore 560012 (India); Sen, Kallol [Centre for High Energy Physics, Indian Institute of Science, C.V. Raman Avenue, Bangalore 560012 (India); Kavli Institute for the Physics and Mathematics of the Universe (WPI), The University of Tokyo Institutes for Advanced Study, Kashiwa, Chiba 277-8583 (Japan); Sinha, Aninda [Centre for High Energy Physics, Indian Institute of Science, C.V. Raman Avenue, Bangalore 560012 (India)


    We describe in more detail our approach to the conformal bootstrap which uses the Mellin representation of CFT_d four-point functions and expands them in terms of crossing-symmetric combinations of AdS_{d+1} Witten exchange functions. We consider arbitrary external scalar operators and set up the conditions for consistency with the operator product expansion. Namely, we demand cancellation of spurious powers (of the cross ratios, in position space) which translate into spurious poles in Mellin space. We discuss two contexts in which we can immediately apply this method by imposing the simplest set of constraint equations. The first is the epsilon expansion. We mostly focus on the Wilson-Fisher fixed point as studied in an epsilon expansion about d=4. We reproduce Feynman diagram results for operator dimensions to O(ε^3) rather straightforwardly. This approach also yields new analytic predictions for OPE coefficients to the same order which fit nicely with recent numerical estimates for the Ising model (at ε=1). We will also mention some leading order results for scalar theories near three and six dimensions. The second context is a large spin expansion, in any dimension, where we are able to reproduce and go a bit beyond some of the results recently obtained using the (double) light cone expansion. We also have a preliminary discussion about numerical implementation of the above bootstrap scheme in the absence of a small parameter.

  8. A phase space approach to wave propagation with dispersion.

    Ben-Benjamin, Jonathan S; Cohen, Leon; Loughlin, Patrick J


    A phase space approximation method for linear dispersive wave propagation with arbitrary initial conditions is developed. The results expand on a previous approximation in terms of the Wigner distribution of a single mode. In contrast to this previously considered single-mode case, the approximation presented here is for the full wave and is obtained by a different approach. This solution requires one to obtain (i) the initial modal functions from the given initial wave, and (ii) the initial cross-Wigner distribution between different modal functions. The full wave is the sum of modal functions. The approximation is obtained for general linear wave equations by transforming the equations to phase space, and then solving in the new domain. It is shown that each modal function of the wave satisfies a Schrödinger-type equation where the equivalent "Hamiltonian" operator is the dispersion relation corresponding to the mode and where the wavenumber is replaced by the wavenumber operator. Application to the beam equation is considered to illustrate the approach.
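
    The exact modal evolution that such phase-space approximations target can be sketched spectrally: each Fourier mode of the initial wave acquires the phase exp(-iω(k)t). The beam-like dispersion relation ω = k² below is an assumed normalization, not the paper's exact equation:

```python
import numpy as np

# Spectral propagation of a dispersive wave: transform the initial condition,
# advance each mode by exp(-i*omega(k)*t), transform back. A beam-like
# dispersion omega = k^2 is assumed here (normalized units).
N, Lx = 1024, 200.0
x = np.linspace(-Lx / 2, Lx / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=Lx / N)
u0 = np.exp(-x**2)                                      # initial Gaussian pulse
u_t = np.fft.ifft(np.fft.fft(u0) * np.exp(-1j * (k**2) * 5.0))  # wave at t = 5
```

    Because each mode only picks up a unit-modulus phase, the spectral energy is conserved exactly, a useful sanity check when comparing against the phase-space approximation.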

  9. Comparison of two Minkowski-space approaches to heavy quarkonia

    Leitao, Sofia; Biernat, Elmar P. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Li, Yang [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); College of William and Mary, Department of Physics, Williamsburg, VA (United States); Maris, Pieter; Vary, James P. [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); Pena, M.T. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Lisboa, Departamento de Fisica, Instituto Superior Tecnico, Lisbon (Portugal); Stadler, Alfred [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Evora, Departamento de Fisica, Evora (Portugal)


    In this work we compare mass spectra and decay constants obtained from two recent, independent, and fully relativistic approaches to the quarkonium bound-state problem: the Basis Light-Front Quantization approach, where light-front wave functions are naturally formulated; and, the Covariant Spectator Theory (CST), based on a reorganization of the Bethe-Salpeter equation. Even though conceptually different, both solutions are obtained in Minkowski space. Comparisons of decay constants for more than ten states of charmonium and bottomonium show favorable agreement between the two approaches as well as with experiment where available. We also apply the Brodsky-Huang-Lepage prescription to convert the CST amplitudes into functions of light-front variables. This provides an ideal opportunity to investigate the similarities and differences at the level of the wave functions. Several qualitative features are observed in remarkable agreement between the two approaches even for the rarely addressed excited states. Leading-twist distribution amplitudes as well as parton distribution functions of heavy quarkonia are also analyzed. (orig.)

  10. Final Report on the Fuel Saving Effectiveness of Various Driver Feedback Approaches

    Gonder, J.; Earleywine, M.; Sparks, W.


    This final report quantifies the fuel-savings opportunities from specific driving behavior changes, identifies factors that influence drivers' receptiveness to adopting fuel-saving behaviors, and assesses various driver feedback approaches.

  11. A study of space shuttle energy management, approach and landing analysis

    Morth, R.


    The steering system of the space shuttle vehicle is presented for the final several hundred miles of flight preceding landing. The guidance scheme is characterized by a spiral turn to dissipate excess potential energy (altitude) prior to a standard straight-in final approach. In addition, the system features pilot-oriented control, drag brakes, phugoid damping, and a navigational capacity founded upon an inertial measurement unit and an on-board computer. Analytic formulas are used to calculate, represent, and ensure the workability of the system's specifications
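
    The energy-management idea can be put in rough numbers: the guidance must dissipate the excess specific energy gh + v²/2 over an unpowered glide. Every figure below is assumed for illustration and is not taken from the study:

```python
# Rough specific-energy bookkeeping for an unpowered approach; all numbers
# below are assumed for illustration.
g = 9.81
h0, v0 = 12000.0, 250.0        # terminal-area entry: altitude (m), speed (m/s)
hf, vf = 0.0, 90.0             # runway threshold: altitude (m), speed (m/s)
excess = (g * h0 + v0**2 / 2) - (g * hf + vf**2 / 2)   # J/kg to dissipate

# For an equilibrium glide the drag work per unit range is ~ g / (L/D), so
# the no-thrust range that dissipates this energy is roughly (L/D)*excess/g;
# a spiral turn packs that range into a compact ground track.
LD = 4.5                       # assumed average lift-to-drag ratio
glide_range_km = LD * excess / g / 1000.0
```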

  12. Space Culture: Innovative Cultural Approaches To Public Engagement With Astronomy, Space Science And Astronautics

    Malina, Roger F.


    In recent years a number of cultural organizations have established ongoing programs of public engagement with astronomy, space science and astronautics. Many involve elements of citizen science initiatives, artists' residencies in scientific laboratories and agencies, art and science festivals, and social network projects, as well as more traditional exhibition venues. Recognizing these programs, several agencies and organizations have established mechanisms for facilitating public engagement with astronomy and space science through cultural activities. The International Astronautics Federation has established a Technical Activities Committee for the Cultural Utilization of Space. Over the past year the NSF and NEA have organized disciplinary workshops to develop recommendations relating to art-science interaction and community building efforts. Rationales for encouraging public engagement via cultural projects range from theories of creativity, innovation and invention to cultural appropriation in the context of 'socially robust science' as advocated by Helga Nowotny of the European Research Council. Public engagement with science, as opposed to science education and outreach initiatives, requires different approaches. Just as organizations have employed education professionals to lead education activities, so they must employ cultural professionals if they wish to develop public engagement projects via arts and culture. One outcome of the NSF and NEA workshops has been the development of a rationale for converting STEM to STEAM by including the arts in STEM methodologies, particularly for K-12, where students can access science via arts and cultural contexts. Often these require new kinds of informal education approaches that exploit locative media, gaming platforms, artists' projects and citizen science. Incorporating astronomy and space science content in art and cultural projects requires new skills in 'cultural translation' and 'trans-mediation' and new kinds

  13. Numerical Identification of Multiparameters in the Space Fractional Advection Dispersion Equation by Final Observations

    Dali Zhang


    This paper deals with an inverse problem of identifying multiple parameters in the 1D space fractional advection dispersion equation (FADE) on a finite domain from final observations. The parameters to be identified are the fractional order, the diffusion coefficient, and the average velocity in the FADE. The forward problem is solved by a finite difference scheme, and an optimal perturbation regularization algorithm is then introduced to determine the three parameters simultaneously. Numerical inversions are performed with both accurate and noisy data, and several factors influencing the realization of the algorithm are discussed. The inversion solutions are in good agreement with the exact solutions, demonstrating the efficiency of the proposed algorithm.
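
    Although the paper's finite difference scheme is not reproduced here, space-fractional derivatives of order α in a FADE are commonly discretized with Grünwald-Letnikov weights, which follow a simple recursion:

```python
import numpy as np

def gl_weights(alpha, n):
    """First n Grunwald-Letnikov coefficients w_k = (-1)^k * binom(alpha, k),
    computed via the recursion w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

# e.g. a super-diffusive order alpha = 1.8, typical of FADE studies
w = gl_weights(1.8, 6)
```

    In a forward solver these weights multiply the grid values in a one-sided sum approximating the fractional derivative, and the fractional order α enters the inversion through them.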

  14. Multiscale Analysis of Time Irreversibility Based on Phase-Space Reconstruction and Horizontal Visibility Graph Approach

    Zhang, Yongping; Shang, Pengjian; Xiong, Hui; Xia, Jianan

    Time irreversibility is an important property of nonequilibrium dynamic systems. A visibility graph approach was recently proposed, and this approach is generally effective for measuring the time irreversibility of time series. However, its results may be unreliable when dealing with high-dimensional systems. In this work, we consider the joint concept of time irreversibility and adopt the phase-space reconstruction technique to improve this visibility graph approach. Compared with the previous approach, the improved approach gives a more accurate estimate of the irreversibility of time series, and is more effective at distinguishing irreversible from reversible stochastic processes. We also use this approach to extract multiscale irreversibility, to account for the multiple inherent dynamics of time series. Finally, we apply the approach to detect the multiscale irreversibility of financial time series, and succeed in distinguishing periods of financial crisis from plateau periods. In addition, the separation of Asian stock indexes from the other indexes is clearly visible at higher time scales. Simulations and real data support the effectiveness of the improved approach in detecting time irreversibility.
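
    A minimal version of the directed horizontal visibility graph, together with a Kullback-Leibler measure of out- versus in-degree asymmetry, can be sketched as follows; the asymmetric sawtooth is a toy irreversible series, and this KLD form is one common choice rather than necessarily the authors' exact estimator:

```python
import numpy as np

def hvg_degrees(x):
    """Directed horizontal visibility graph: i < j are linked when every
    intermediate value lies strictly below min(x[i], x[j])."""
    n = len(x)
    k_out, k_in = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        top = -np.inf                      # running max of intermediate values
        for j in range(i + 1, n):
            if top < min(x[i], x[j]):
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if x[j] >= x[i]:
                break                      # a value >= x[i] blocks all further visibility
    return k_out, k_in

def kld(p, q, eps=1e-12):
    """Kullback-Leibler divergence between two degree histograms."""
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

x = (np.arange(400) % 7).astype(float)     # slow rise, sharp drop: irreversible
ko, ki = hvg_degrees(x)
m = int(max(ko.max(), ki.max())) + 1
po = np.bincount(ko, minlength=m).astype(float)
pin = np.bincount(ki, minlength=m).astype(float)
D = kld(po, pin)                           # > 0 signals time irreversibility
```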

  15. Deep-inelastic final states in a space-time description of shower development and hadronization

    Ellis, J.; Geiger, K.; Kowalski, H.


    We extend a quantum kinetic approach to the description of hadronic showers in space, time, and momentum space to deep-inelastic ep collisions, with particular reference to experiments at DESY HERA. We follow the history of hard scattering events back to the initial hadronic state and forward to the formation of color-singlet prehadronic clusters and their decays into hadrons. The time evolution of the spacelike initial-state shower and the timelike secondary partons are treated similarly, and cluster formation is treated using a spatial criterion motivated by confinement and a nonperturbative model for hadronization. We calculate the time evolution of particle distributions in rapidity, transverse, and longitudinal space. We also compare the transverse hadronic energy flow and the distribution of observed hadronic masses with experimental data from HERA, finding encouraging results, and discuss the background to large-rapidity-gap events. The techniques developed in this paper may be applied in the future to more complicated processes such as eA, pp, pA, and AA collisions. copyright 1996 The American Physical Society

  16. “Space, the Final Frontier”: How Good are Agent-Based Models at Simulating Individuals and Space in Cities?

    Alison Heppenstall


    Cities are complex systems comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions of individuals and how, through such interactions, macro structures emerge, both in the social and physical environment of cities. However, this paradigm has been hindered by limits on computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography, to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.

  17. Understanding space weather with new physical, mathematical and philosophical approaches

    Mateev, Lachezar; Velinov, Peter; Tassev, Yordan


    The actual problems of solar-terrestrial physics, in particular of space weather, are related to the prediction of the state of the space environment and are solved by means of different analyses and models. The development of these investigations can also be considered from another side: the philosophical and mathematical approach to this physical reality. What does it constitute? We have a set of physical processes which occur in the Sun and in interplanetary space. All these processes interact with each other and simultaneously participate in the general process which forms the space weather. Let us now consider Leibniz's monads (G.W. von Leibniz, 1714, Monadologie, Wien; Id., 1710, Théodicée, Amsterdam) and use some of their properties. There are 90 theses for monads in Leibniz's work (1714) in total, e.g.: "(1) The Monad, of which we shall here speak, is nothing but a simple substance, which enters into compounds. By 'simple' is meant 'without parts'. (Theod. 10.); … (56) Now this connexion or adaptation of all created things to each and of each to all, means that each simple substance has relations which express all the others, and, consequently, that it is a perpetual living mirror of the universe. (Theod. 130, 360.); (59) … this universal harmony, according to which every substance exactly expresses all others through the relations it has with them. (63) … every Monad is, in its own way, a mirror of the universe, and the universe is ruled according to a perfect order. (Theod. 403.)", etc. Let us now substitute the word "process" for the word "monad" in these properties. We obtain the following statement: each process reflects all other processes, and all other processes reflect this process. This analogy is not formal at all; it accurately reflects the relation between the physical processes and their unity. The category of monad, which in Leibniz's Monadology carries a generally philosophical sense, is fully identical with the

  18. Extension of Space Food Shelf Life Through Hurdle Approach

    Cooper, M. R.; Sirmons, T. A.; Froio-Blumsack, D.; Mohr, L.; Young, M.; Douglas, G. L.


    The processed and prepackaged space food system is the main source of crew nutrition, and hence central to astronaut health and performance. Unfortunately, space food quality and nutrition degrade to unacceptable levels in two to three years with current food stabilization technologies. Future exploration missions will require a food system that remains safe, acceptable and nutritious through five years of storage within vehicle resource constraints. The potential of stabilization technologies (alternative storage temperatures, processing, formulation, ingredient source, packaging, and preparation procedures), when combined in a hurdle approach, to mitigate quality and nutritional degradation is being assessed. Sixteen representative foods from the International Space Station food system were chosen for production and analysis; they will be evaluated initially and at one, three, and five years, with the potential for analysis at seven years if necessary. Analysis includes changes in color, texture, nutrition, sensory quality, and rehydration ratio when applicable. The food samples will be stored at -20 C, 4 C, and 21 C. Select food samples will also be evaluated at -80 C to determine the impacts of ultra-cold storage after one and five years. Packaging film barrier properties and mechanical integrity will be assessed before and after processing and storage. At the study conclusion, if the tested hurdles are adequate, formulation, processing, and storage combinations will be uniquely identified for processed food matrices to achieve a five-year shelf life. This study will provide one of the most comprehensive investigations of long-duration food stability ever completed, and the achievement of extended food system stability will have profound impacts on health and performance for spaceflight crews and for relief efforts and military applications on Earth.

  19. Space nuclear reactor system diagnosis: Knowledge-based approach

    Ting, Y.T.D.


    SP-100 space nuclear reactor system development is a joint effort by the Department of Energy, the Department of Defense, and the National Aeronautics and Space Administration. The system is designed to operate in isolation for many years, possibly with little or no remote maintenance. This dissertation proposes a knowledge-based diagnostic system which, in principle, can diagnose the faults that can either cause reactor shutdown or lead to other serious problems. The framework can in general be applied to the fully specified system once detailed design information becomes available. The set of faults considered herein is identified from heuristic knowledge about system operation. A suitable approach to diagnostic problem solving is proposed after investigating the most prevalent methodologies in artificial intelligence as well as a causal analysis of the system. Deep causal knowledge modeling based on digraph, fault-tree, or logic flowgraph methodologies requires a knowledge representation that can handle time-dependent system behavior. A qualitative temporal knowledge modeling methodology, using rules with specified time delays among the process variables, is therefore proposed and used to develop a sufficient diagnostic rule set. The rule set has been made robust by means of a time-zone approach, and the sufficient rule set is transformed into a sufficient and necessary one by searching the whole knowledge base. Qualitative data analysis is proposed for analyzing measured data in a real-time situation. An expert system shell, Intelligence Compiler, is used to develop the prototype system: frames represent the process variables, forward-chaining rules are used in monitoring, and backward-chaining rules are used in diagnosis.
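    The temporal rule formalism described in the abstract (rules with specified time delays among process variables, applied by forward chaining) can be sketched as follows. The variable names and delays are hypothetical placeholders, not taken from the dissertation:

```python
# Illustrative temporal rules: each maps a cause variable to an effect
# variable after a specified delay (in discrete time steps). The fault
# names and delay values are invented for illustration only.
RULES = [
    # (cause, effect, delay_in_steps)
    ("coolant_flow_low", "core_temp_high", 2),
    ("core_temp_high", "reactor_shutdown", 1),
]

def forward_chain(observations, horizon):
    """Propagate observed symptoms forward in time via the delay rules.

    observations: dict mapping time step -> set of variables observed true.
    Returns a dict mapping time step -> set of observed/inferred variables.
    """
    state = {t: set(vs) for t, vs in observations.items()}
    for t in range(horizon):
        for cause, effect, delay in RULES:
            if cause in state.get(t, set()):
                state.setdefault(t + delay, set()).add(effect)
    return state

state = forward_chain({0: {"coolant_flow_low"}}, horizon=5)
# A low-coolant-flow symptom at t=0 implies high core temperature at t=2
# and a predicted shutdown at t=3, under these toy rules.
```

Backward chaining for diagnosis would run the same rule set in the opposite direction, searching for causes whose delayed effects match the observed symptom pattern.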

  20. Religion and Communication Spaces. A Semio-pragmatic Approach

    Roger Odin


    Following the reflection initiated in his book The Spaces of Communication, Roger Odin suggests a new distinction between physical communication spaces and mental communication spaces (spaces that we carry inside us). The suggestion is exemplified by three film analyses dedicated to the relationships between religion and communication.

  1. The algebraic approach to space-time geometry

    Heller, M.; Multarzynski, P.; Sasin, W.


    A differential manifold can be defined in terms of the smooth real functions carried by it. By rejecting the postulate, in such a definition, demanding local diffeomorphism of the manifold to Euclidean space, one obtains the so-called differential space concept. Every subset of R^n turns out to be a differential space. Extensive parts of differential geometry on differential spaces, developed by Sikorski, are reviewed and adapted to relativistic purposes. The differential space is proposed as a new model of space-time. The Lorentz structure and Einstein's field equations on differential spaces are discussed.

  2. Coordination between Subway and Urban Space: A Networked Approach

    Lei Mao


    This paper selects Changsha as a case study and constructs models of the subway network and the urban spatial network from planning data. In the network models, the districts of Changsha are regarded as nodes and the connections between each pair of districts as edges. The method is based on quantitative analysis of the node weights and edge weights, as defined in complex network theory, and the structures of subway and urban space are visualized in the form of networks. Then, by analyzing the discrepancy coefficients of the corresponding nodes and edges, the paper compares the two networks to evaluate their coordination. The results indicate that only 21.4% of districts and 13.2% of district connections are rationally coordinated. Finally, optimization strategies are put forward for the uncoordinated parts: adjusting subway transit density, regulating land-use intensity, and planning new mass transit.
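    The node-by-node comparison described above can be sketched in miniature. The weights and the discrepancy definition below are illustrative assumptions; the paper's actual weight formulas come from complex network theory and its planning data:

```python
# Hypothetical node weights for three districts in the two networks
# (subway vs. urban space). Values are placeholders, not data from the study.
subway_w = {"A": 0.8, "B": 0.3, "C": 0.5}
urban_w  = {"A": 0.7, "B": 0.9, "C": 0.5}

def discrepancy(w1, w2):
    """One plausible discrepancy coefficient: |w1 - w2| / max(w1, w2) per node."""
    return {n: abs(w1[n] - w2[n]) / max(w1[n], w2[n]) for n in w1}

def coordinated(disc, threshold=0.2):
    """Nodes whose discrepancy falls below a chosen coordination threshold."""
    return [n for n, d in disc.items() if d <= threshold]

d = discrepancy(subway_w, urban_w)
# District "C" has identical weights in both networks, so its discrepancy
# is zero; district "B" is strongly uncoordinated under this toy definition.
```

The same computation applied to edge weights yields the district-connection coordination figures; the threshold is a design choice that the study would calibrate.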

  3. The NASA Heliophysics Active Final Archive at the Space Physics Data Facility

    McGuire, Robert E.


    The 2009 NASA Heliophysics Science Data Management Policy re-defined and extended the responsibilities of the Space Physics Data Facility (SPDF) project. Building on SPDF's established capabilities, the new policy assigned SPDF the role of active "Final Archive" for non-solar NASA Heliophysics data. The policy also recognized and formalized SPDF's responsibilities as a source of critical infrastructure services, such as VSPO, for the overall Heliophysics Data Environment (HpDE), and as a Center of Excellence for existing SPDF science-enabling services and software, including CDAWeb, SSCWeb/4D Orbit Viewer, OMNIweb, and CDF. This talk focuses on the principles, strategies, and planned SPDF architecture for performing these roles effectively and efficiently, with special emphasis on how SPDF will ensure long-term preservation of, and ongoing online community access to, all the data entrusted to it. We will lay out our archival philosophy, what we advocate in our work with current and future NASA missions and with potential providers of NASA and NASA-relevant archival data, and how we make the data and metadata held by SPDF accessible to other systems and services within the overall HpDE. We will also briefly review our current services, their metrics, and our current plans and priorities for their evolution.

  4. Next Generation Space Interconnect Standard (NGSIS): a modular open standards approach for high performance interconnects for space

    Collier, Charles Patrick


    The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration effort to define a set of standards for interconnects between space system components with the goal of cost effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represents a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serves as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.

  5. Analysis of Life Histories: A State Space Approach

    Rajulton, Fernando


    The computer package LIFEHIST, written by the author, is meant for analyzing life histories through a state-space approach. The basic ideas on which the various programs have been built are described in this paper in non-mathematical language. Users can use the various programs for multistate analyses based on Markov and semi-Markov frameworks and on the sequences of transitions implied in life histories. The package is under constant revision, and programs for a few specific models the author thinks will be useful for analyzing longitudinal data will be incorporated in the near future.
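    The Markov side of such a multistate analysis reduces to estimating transition probabilities from observed sequences of states. The states and histories below are invented for illustration and are not LIFEHIST data or its API:

```python
from collections import Counter, defaultdict

# Hypothetical life-history sequences over marital states; a minimal sketch
# of the kind of multistate (Markov) tabulation a package like LIFEHIST performs.
histories = [
    ["single", "married", "divorced", "married"],
    ["single", "married", "married"],
    ["single", "single", "married"],
]

def transition_matrix(seqs):
    """Estimate P(next state | current state) from observed transitions."""
    counts = defaultdict(Counter)
    for seq in seqs:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

P = transition_matrix(histories)
# P["single"]["married"] is the estimated probability of the
# single -> married transition (3 of the 4 observed moves out of "single").
```

A semi-Markov extension would additionally record the time spent in each state before the transition, not just the transition counts.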

  6. A Reparametrization Approach for Dynamic Space-Time Models

    Lee, Hyeyoung; Ghosh, Sujit K.


    Researchers in diverse areas such as environmental and health sciences are increasingly working with data collected across space and time. The space-time processes that are generally used in practice are often complicated in the sense that the auto-dependence structure across space and time is non-trivial, often non-separable and non-stationary in space and time. Moreover, the dimension of such data sets across both space and time can be very large leading to computational difficulties due to...

  7. Field-theoretic approach to gravity in the flat space-time

    Cavalleri, G [Centro Informazioni Studi Esperienze, Milan (Italy); Milan Univ. (Italy). Ist. di Fisica); Spinelli, G [Istituto di Matematica del Politecnico di Milano, Milano (Italy)


    This paper discusses how the field-theoretic approach to gravity starting from flat space-time is wider than the Einstein approach. The flat approach is able to predict the structure of the observable space as a consequence of the behaviour of the particle proper masses. The field equations are formally equal to Einstein's equations without the cosmological term.

  8. Mapping the Hot Spots: A Zoning Approach to Space Analysis and Design

    Bunnell, Adam; Carpenter, Russell; Hensley, Emily; Strong, Kelsey; Williams, ReBecca; Winter, Rachel


    This article examines a preliminary approach to space design developed and implemented in Eastern Kentucky University's Noel Studio for Academic Creativity. The approach discussed here is entitled "hot spots," which has allowed the research team to observe trends in space usage and composing activities among students. This approach has…

  9. HI-STAR. Health Improvements Through Space Technologies and Resources: Final Report

    Finarelli, Margaret G.


    The purpose of this document is to describe a global strategy to integrate the use of space technology in the fight against malaria. Given the well-documented relationship between the vector and its environment, and the ability of existing space technologies to monitor environmental factors, malaria is a strong candidate for the application of space technology. The concept of a malaria early warning system has been proposed in the past, and pilot studies have been conducted. The HI-STAR project (Health Improvement through Space Technologies and Resources) seeks to build on this concept and enhance the space elements of the suggested framework. As such, the mission statement for this International Space University design project has been defined as follows: "Our mission is to develop and promote a global strategy to help combat malaria using space technology". A general overview of malaria, aspects of how space technology can be useful, and an outline of the HI-STAR strategy are presented.

  10. Review of the Space Mapping Approach to Engineering Optimization and Modeling

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj


    We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically-based models is exploited. S......-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM) and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development....

  11. Repository documentation rethought. A comprehensive approach from untreated waste to waste packages for final disposal

    Anthofer, Anton Philipp; Schubert, Johannes [VPC GmbH, Dresden (Germany)


    The German Act on Reorganization of Responsibility for Nuclear Disposal (Entsorgungsuebergangsgesetz (EntsorgUebG)) adopted in June 2017 provides the energy utilities with the new option of transferring responsibility for their waste packages to the Federal Government. This is conditional on the waste packages being approved for delivery to the Konrad final repository. A comprehensive approach starts with the dismantling of nuclear facilities and extends from waste disposal and packaging planning to final repository documentation. Waste package quality control measures are planned and implemented as early as in the process qualification stage so that the production of waste packages that are suitable for final deposition can be ensured. Optimization of cask and loading configuration can save container and repository volume. Workflow planning also saves time, expenditure and exposure time for personnel at the facilities. VPC has evaluated this experience and developed it into a comprehensive approach.

  12. A risk-based approach to flammable gas detector spacing.

    Defriend, Stephen; Dejmek, Mark; Porter, Leisa; Deshotels, Bob; Natvig, Bernt


    Flammable gas detectors allow an operating company to address leaks before they become serious, by automatically alarming and by initiating isolation and safe venting. Without effective gas detection, there is very limited defense against a flammable gas leak developing into a fire or explosion that could cause loss of life or escalate to cascading failures of nearby vessels, piping, and equipment. While it is commonly recognized that some gas detectors are needed in a process plant containing flammable gas or volatile liquids, there is usually a question of how many are needed. The areas that need protection can be determined by dispersion modeling from potential leak sites. Within the areas that must be protected, the spacing of detectors (or, alternatively, the number of detectors) should be based on risk. Detector design can be characterized by spacing criteria, which are convenient for design, or by the number of detectors, which is convenient for cost reporting. The factors that influence the risk are site-specific, including process conditions, chemical composition, number of potential leak sites, piping design standards, arrangement of plant equipment and structures, design of isolation and depressurization systems, and frequency of detector testing. Site-specific factors such as these affect the size of flammable gas cloud that must be detected (within a specified probability) by the gas detection system. A probability of detection must be specified that gives a design with a tolerable risk of fires and explosions. To determine the optimum spacing of detectors, it is important to consider the probability that a detector will fail at some time and remain inoperative until replaced or repaired. A cost-effective approach is based on the combined risk from a representative selection of leakage scenarios, rather than a worst-case evaluation. This means that the probability and severity of leak consequences must be evaluated together.
In marine and
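    The core of the risk logic above (detection probability degraded by detector unavailability, combined over representative scenarios rather than a worst case) can be sketched as follows. All numerical values are hypothetical placeholders, not figures from the study:

```python
# Illustrative sketch: probability that at least one detector sees a leak,
# allowing for detectors being failed until repaired or replaced.
def detection_probability(p_detect, availability, n_detectors):
    """P(at least one working detector responds) for one leak scenario.

    p_detect: chance an operative detector within range responds.
    availability: fraction of time a detector is operative (testing frequency
    and repair time drive this figure).
    """
    p_single = p_detect * availability          # detects AND is operative
    return 1.0 - (1.0 - p_single) ** n_detectors

def combined_risk(scenarios):
    """Frequency-weighted undetected-leak risk over representative scenarios,
    rather than a single worst-case evaluation.

    scenarios: iterable of (leak_frequency, severity, detection_probability).
    """
    return sum(freq * severity * (1.0 - p_det)
               for freq, severity, p_det in scenarios)

p = detection_probability(p_detect=0.9, availability=0.95, n_detectors=3)
# Adding detectors (i.e. tighter spacing) raises p; more frequent testing
# raises availability; both trade off against cost in meeting a risk target.
```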

  13. Phase-space densities and effects of resonance decays in a hydrodynamic approach to heavy ion collisions

    Akkelin, S.V.; Sinyukov, Yu.M.


    A method allowing analysis of the overpopulation of phase space in heavy ion collisions in a model-independent way is proposed within the hydrodynamic approach. It makes it possible to extract a chemical potential of thermal pions at freeze-out, irrespective of the form of freeze-out (isothermal) hypersurface in Minkowski space and transverse flows on it. The contributions of resonance (with masses up to 2 GeV) decays to spectra, interferometry volumes, and phase-space densities are calculated and discussed in detail. The estimates of average phase-space densities and chemical potentials of thermal pions are obtained for SPS and RHIC energies. They demonstrate that multibosonic phenomena at those energies might be considered as a correction factor rather than as a significant physical effect. The analysis of the evolution of the pion average phase-space density in chemically frozen hadron systems shows that it is almost constant or slightly increases with time while the particle density and phase-space density at each space point decreases rapidly during the system's expansion. We found that, unlike the particle density, the average phase-space density has no direct link to the freeze-out criterion and final thermodynamic parameters, being connected rather to the initial phase-space density of hadronic matter formed in relativistic nucleus-nucleus collisions

  14. Approaching space-time through velocity in doubly special relativity

    Aloisio, R.; Galante, A.; Grillo, A.F.; Luzio, E.; Mendez, F.


    We discuss the definition of velocity as dE/d|p|, where E and p are the energy and momentum of a particle, in doubly special relativity (DSR). If this definition matches dx/dt appropriate for the space-time sector, then space-time can in principle be built consistently with the existence of an invariant length scale. We show that, within different possible velocity definitions, a space-time compatible with momentum-space DSR principles cannot be derived

  15. Space Station - An integrated approach to operational logistics support

    Hosmer, G. J.


    Development of an efficient and cost-effective operational logistics system for the Space Station will require logistics planning early in the program's design and development phase. This paper focuses on Integrated Logistics Support (ILS) Program techniques and their application to the Space Station program's design, production, and deployment phases to assure development of an effective and cost-efficient operational logistics system. The paper provides the methodology and time-phased programmatic steps required to establish a Space Station ILS Program that will deliver an operational logistics system based on planned Space Station program logistics support.

  16. Space-Wise approach for airborne gravity data modelling

    Sampietro, D.; Capponi, M.; Mansi, A. H.; Gatti, A.; Marchetti, P.; Sansò, F.


    Regional gravity field modelling by means of the remove-compute-restore procedure is nowadays widely applied in different contexts: it is the most used technique for regional gravimetric geoid determination, and it is also used in exploration geophysics to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.), which are useful for understanding and mapping geological structures in a specific region. For this last application, given the required accuracy and resolution, airborne gravity observations are usually adopted. However, due to the relatively high acquisition velocity, atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are usually contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations in both the low and high frequencies should be applied to recover valuable information. In this work, software to filter and grid raw airborne observations is presented: the proposed solution consists of a combination of an along-track Wiener filter and a classical least squares collocation technique. Basically, the proposed procedure is an adaptation to airborne gravimetry of the Space-Wise approach developed by Politecnico di Milano to process data from the ESA satellite mission GOCE. Among the main differences with respect to the satellite application of this approach is the fact that, while in processing GOCE data the stochastic characteristics of the observation error can be considered well known a priori, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and must be retrieved from the dataset itself. The presented solution is suited for airborne data analysis, quickly filtering and gridding gravity observations in an easy way. Some innovative theoretical aspects, focusing in particular on covariance modelling, are presented too
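    An along-track Wiener filter of the kind mentioned above can be sketched in the frequency domain. The signal and noise spectra below are assumed toy shapes; the actual Space-Wise procedure estimates them from the data themselves:

```python
import numpy as np

# Minimal sketch of an along-track Wiener filter: attenuate frequencies
# where the (assumed) noise spectrum dominates the (assumed) signal spectrum.
def wiener_filter(track, signal_psd, noise_psd):
    """Apply the frequency-domain Wiener gain W = S / (S + N) to one track."""
    gain = signal_psd / (signal_psd + noise_psd)
    return np.fft.irfft(np.fft.rfft(track) * gain, n=len(track))

n = 512
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 128.0)            # low-frequency "gravity" signal
noisy = signal + 0.5 * np.random.default_rng(0).standard_normal(n)

freqs = np.fft.rfftfreq(n)
signal_psd = 1.0 / (1.0 + (freqs / 0.02) ** 2)    # assumed low-pass signal spectrum
noise_psd = np.full_like(freqs, 0.25)             # assumed white observation noise

filtered = wiener_filter(noisy, signal_psd, noise_psd)
# The filtered track is closer in RMS to the clean signal than the raw one.
```

Gridding the filtered tracks by least squares collocation would then use the estimated signal covariance to interpolate between flight lines.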

  17. Space commerce in a global economy - Comparison of international approaches to commercial space

    Stone, Barbara A.; Kleber, Peter


    A historical perspective, current status, and comparison of national government/commercial space industry relationships in the United States and Europe are presented. It is noted that space technology has been developed and used primarily to meet the needs of civil and military government initiatives. Two future trends of space technology development include new space enterprises, and the national drive to achieve a more competitive global economic position.

  18. [Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].

    Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin


    In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
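    The modelling step described above (a second-order polynomial linking critical process parameters to a critical quality attribute, then screening parameter settings) can be sketched with synthetic data. The data, response shape, and parameter ranges below are invented for illustration and are not the paper's measurements:

```python
import numpy as np

# Illustrative sketch: fit a full quadratic model relating two critical
# process parameters (dropping distance x1 in cm, dropping speed x2 in
# drops/min) to a quality attribute y, on synthetic data.
rng = np.random.default_rng(1)
x1 = rng.uniform(4.0, 8.0, 30)
x2 = rng.uniform(50.0, 70.0, 30)
# Hypothetical true response with an optimum near (6 cm, 60 drops/min)
y = 90 - (x1 - 6.0) ** 2 - 0.05 * (x2 - 60.0) ** 2 + rng.normal(0, 0.5, 30)

# Design matrix for the second-order polynomial model
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(d, s):
    """Predicted quality attribute at dropping distance d, speed s."""
    return np.array([1.0, d, s, d * s, d**2, s**2]) @ coef

best = predict(6.0, 60.0)   # near the synthetic optimum
edge = predict(8.0, 70.0)   # a corner of the explored region
```

The probability-based design space would then be obtained by sweeping (d, s) over a grid and keeping the settings whose predicted attributes meet specification with sufficiently high probability, accounting for model uncertainty.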

  19. A Systems Approach to Developing an Affordable Space Ground Transportation Architecture using a Commonality Approach

    Garcia, Jerry L.; McCleskey, Carey M.; Bollo, Timothy R.; Rhodes, Russel E.; Robinson, John W.


    This paper presents a structured approach for achieving a compatible Ground System (GS) and Flight System (FS) architecture that is affordable, productive and sustainable. This paper is an extension of the paper titled "Approach to an Affordable and Productive Space Transportation System" by McCleskey et al. This paper integrates systems engineering concepts and operationally efficient propulsion system concepts into a structured framework for achieving GS and FS compatibility in the mid-term and long-term time frames. It also presents a functional and quantitative relationship for assessing system compatibility called the Architecture Complexity Index (ACI). This paper: (1) focuses on systems engineering fundamentals as it applies to improving GS and FS compatibility; (2) establishes mid-term and long-term spaceport goals; (3) presents an overview of transitioning a spaceport to an airport model; (4) establishes a framework for defining a ground system architecture; (5) presents the ACI concept; (6) demonstrates the approach by presenting a comparison of different GS architectures; and (7) presents a discussion on the benefits of using this approach with a focus on commonality.

  20. Applying the system engineering approach to devise a master’s degree program in space technology in developing countries

    Jazebizadeh, Hooman; Tabeshian, Maryam; Taheran Vernoosfaderani, Mahsa


    Although more than half a century has passed since space technology was first developed, developing countries are just beginning to enter the arena, focusing mainly on educating professionals. Space technology is by itself an interdisciplinary science, is costly, and is developing at a fast pace. Moreover, a fruitful education system needs to remain dynamic if the quality of education is the main concern, making it a complicated system. This paper makes use of the systems engineering approach and the experiences of developed countries in this area, while incorporating the needs of developing countries, to devise a comprehensive program in space engineering at the master's level. The needs of developing countries as regards space technology education may broadly be put into two categories: to raise their knowledge of space technology, which requires hard work and teamwork skills, and to transfer and domesticate space technology while minimizing the costs and maximizing its effectiveness. The requirements of such a space education system, which include research facilities, courses, and student projects, are then defined using a model drawn from the space education systems of universities in North America and Europe, modified to include the above-mentioned needs. Three design concepts have been considered and synthesized through functional analysis. The first is Modular and Detail Study, which helps students specialize in a particular area of space technology. The second is referred to as Integrated and Interdisciplinary Study, which focuses on understanding and development of space systems. Finally, the third concept, which has been chosen for the purpose of this study, is a combination of the other two, categorizing the required curriculum into seven modules, setting aside space applications. This helps students not only to specialize in one of these modules but also to get hands-on experience in a real space project through participation in summer group

  1. A Database Approach to Distributed State Space Generation

    Blom, Stefan; Lisser, Bert; van de Pol, Jan Cornelis; Weber, M.


    We study distributed state space generation on a cluster of workstations. It is explained why state space partitioning by a global hash function is problematic when states contain variables from unbounded domains, such as lists or other recursive datatypes. Our solution is to introduce a database
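    The global hash partitioning that the abstract identifies as problematic can be sketched in a few lines. The state representation below is a stand-in nested tuple, not the tool's actual term encoding:

```python
# Minimal sketch of global hash partitioning of a state space across workers:
# the owning worker of a state is hash(state) mod N. For states built from
# recursive datatypes (lists, terms), every worker must hash the whole,
# possibly large, term to route each successor state it generates.
N_WORKERS = 4

def owner(state, n_workers=N_WORKERS):
    """Assign a state (here a nested tuple standing in for a term) to a worker."""
    return hash(state) % n_workers

state = ("send", ("msg", 1, ("queue", (2, 3))))
w = owner(state)
# Successor states generated on any worker are forwarded to owner(successor),
# so cross-worker traffic grows with state size and fan-out.
```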

  2. Geometric approach to evolution problems in metric spaces

    Stojković, Igor


    This PhD thesis contains four chapters in which research material is presented. The second chapter presents the extension of the product formulas for semigroups induced by convex functionals from the classical Hilbert space setting to the setting of general CAT(0) spaces. In the third chapter, the

  3. Evaluating public space pedestrian accessibility: a GIS approach

    Morar, T.; Bertolini, L.; Radoslav, R.


    Public spaces are sources of quality of life in neighborhoods. Seeking to help professionals and municipalities assess how well a public space can be used by the community it serves, this paper presents a GIS-based methodology for evaluating its pedestrian accessibility. The Romanian city of

  4. A Database Approach to Distributed State Space Generation

    Blom, Stefan; Lisser, Bert; van de Pol, Jan Cornelis; Weber, M.; Cerna, I.; Haverkort, Boudewijn R.H.M.


    We study distributed state space generation on a cluster of workstations. It is explained why state space partitioning by a global hash function is problematic when states contain variables from unbounded domains, such as lists or other recursive datatypes. Our solution is to introduce a database

  5. Groups, matrices, and vector spaces a group theoretic approach to linear algebra

    Carrell, James B


    This unique text provides a geometric approach to group theory and linear algebra, bringing to light the interesting ways in which these subjects interact. Requiring few prerequisites beyond understanding the notion of a proof, the text aims to give students a strong foundation in both geometry and algebra. Starting with preliminaries (relations, elementary combinatorics, and induction), the book then proceeds to the core topics: the elements of the theory of groups and fields (Lagrange's Theorem, cosets, the complex numbers and the prime fields), matrix theory and matrix groups, determinants, vector spaces, linear mappings, eigentheory and diagonalization, Jordan decomposition and normal form, normal matrices, and quadratic forms. The final two chapters consist of a more intensive look at group theory, emphasizing orbit stabilizer methods, and an introduction to linear algebraic groups, which enriches the notion of a matrix group. Applications involving symmetry groups, determinants, linear coding theory ...

  6. The Resurrection of Malthus: space as the final escape from the law of diminishing returns

    Sommers, J.; Beldavs, V.


    If there is to be a self-sustaining space economy, which is the goal of the International Lunar Decade, then it is a subject of economic analysis. The immediate challenge of space economics is to demonstrate conceptually how a space economy could emerge and work where markets do not exist and few human agents may be involved; indeed, human agents may transact with either human agents or robotic agents, and robotic agents may transact with other robotic agents.

  7. Space Station Freedom - Approaching the critical design phase

    Kohrs, Richard H.; Huckins, Earle, III


    The status and future development of Space Station Freedom are discussed. To date, detailed design drawings are being produced to manufacture SSF hardware. A critical design review (CDR) for the man-tended capability configuration is planned for 1993 under the SSF program. The main objective of the CDR is to enable the program to make a full commitment to proceed to manufacture parts and assemblies. NASA recently signed a contract with the Russian space company NPO Energia to evaluate potential applications of various Russian space hardware to on-going NASA programs.

  8. Space-Hotel Early Bird - An Educational and Public Outreach Approach

    Amekrane, R.; Holze, C.


    In April 2001 the German Aerospace Society (DGLR e.V.), in cooperation with the Technical University of Darmstadt, Germany, initiated an interdisciplinary student contest for the summer term 2001, under the patronage of Mr. Joerg Feustel-Buechl, Director of Manned Spaceflight and Microgravity, European Space Agency (ESA). It was directed at graduate architecture students, who had to conceive and design a space hotel to specific technical, economical, and social requirements. The space hotel, to be developed for low Earth orbit, had to accommodate 220 guests. It was of utmost importance that this contest become an integral part of the students' tuition and that professors from the different academic and industrial institutions supported the project idea. During the summer term 2001, about fifty students occupied themselves with the topic "design of an innovative space hotel". The overall challenge was to create rooms used in a microgravity environment, which meant overcoming existing definitions and finding a new definition of living space. Because none of the students was able to experience such a room under microgravity, they had to rely on their imagination. The students moreover attended a number of lectures on different technical subjects focusing on space and went on several space-related excursions. Having specialists in the field of space in charge as volunteers ensured that the designs had a realistic chance of being realized. Within the summer term, seventeen major designs developed from conceptual status to highly sophisticated concepts and later to respective models. A competition combined with a public exhibition, which took place within the Annual German Aeronautics and Astronautics Congress, and intense media relations finalized this project.
The project idea of "Early Bird - Visions of a Space Hotel" which was developed within six month is a remarkable example, how

  9. Fractal electrodynamics via non-integer dimensional space approach

    Tarasov, Vasily E.


    Using the recently suggested vector calculus for non-integer dimensional space, we consider electrodynamics problems in the isotropic case. This calculus allows us to describe fractal media in the framework of continuum models with non-integer dimensional space. We consider electric and magnetic fields of fractal media with charges and currents in the framework of continuum models with non-integer dimensional spaces. Applications of the fractal Gauss's law, the fractal Ampère's circuital law, the fractal Poisson equation for the electric potential, and an equation for the fractal stream of charges are suggested. Lorentz invariance and the speed of light in fractal electrodynamics are discussed. An expression for the effective refractive index of non-integer dimensional space is suggested.
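    For orientation, one representative formula of this kind (our notation and normalization, not necessarily the paper's): in an isotropic region of non-integer dimension D, the measure of a sphere of radius r scales as r^(D-1), so a fractal Gauss's law yields a point-charge field

```latex
% Sphere "surface" measure in non-integer dimension D (our notation):
S_D(r) = \frac{2\pi^{D/2}}{\Gamma(D/2)}\, r^{D-1}.
% The fractal Gauss's law for a point charge Q then gives
\oint \mathbf{E}\cdot d\mathbf{A} = \frac{Q}{\varepsilon_0}
\;\Longrightarrow\;
E(r) = \frac{\Gamma(D/2)}{2\pi^{D/2}\,\varepsilon_0}\,\frac{Q}{r^{D-1}},
```

    which reduces to the familiar Coulomb field Q/(4πε₀r²) at D = 3.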

  10. The XML approach to implementing space link extension service management

    Tai, W.; Welz, G. A.; Theis, G.; Yamada, T.


    A feasibility study has been conducted at JPL, ESOC, and ISAS to assess the possible applications of the eXtensible Mark-up Language (XML) capabilities to the implementation of the CCSDS Space Link Extension (SLE) Service Management function.

  11. Interpolation of final geometry and result fields in process parameter space

    Misiun, Grzegorz Stefan; Wang, Chao; Geijselaers, Hubertus J.M.; van den Boogaard, Antonius H.; Saanouni, K.


    Different routes to produce a product in a bulk forming process can be described by a limited set of process parameters. The parameters determine the final geometry as well as the distribution of state variables in the final shape. Ring rolling has been simulated using different parameter settings.

  12. The canonical Lagrangian approach to three-space general relativity

    Shyam, Vasudev; Venkatesh, Madhavan


    We study the action for the three-space formalism of general relativity, better known as the Barbour-Foster-Ó Murchadha action, which is a square-root Baierlein-Sharp-Wheeler action. In particular, we explore the (pre)symplectic structure by pulling it back via a Legendre map to the tangent bundle of the configuration space of this action. With it we attain the canonical Lagrangian vector field which generates the gauge transformations (3-diffeomorphisms) and the true physical evolution of the system. This vector field encapsulates all the dynamics of the system. We also discuss briefly the observables and perennials for this theory. We then present a symplectic reduction of the constrained phase space.

  13. Lie-Hamilton systems on curved spaces: a geometrical approach

    Herranz, Francisco J.; de Lucas, Javier; Tobolski, Mariusz


    A Lie-Hamilton system is a nonautonomous system of first-order ordinary differential equations describing the integral curves of a t-dependent vector field taking values in a finite-dimensional Lie algebra, a Vessiot-Guldberg Lie algebra, of Hamiltonian vector fields relative to a Poisson structure. Its general solution can be written as an autonomous function, the superposition rule, of a generic finite family of particular solutions and a set of constants. We pioneer the study of Lie-Hamilton systems on Riemannian spaces (sphere, Euclidean and hyperbolic plane), pseudo-Riemannian spaces (anti-de Sitter, de Sitter, and Minkowski spacetimes) as well as on semi-Riemannian spaces (Newtonian spacetimes). Their corresponding constants of motion and superposition rules are obtained explicitly in a geometric way. This work extends the (graded) contraction of Lie algebras to a contraction procedure for Lie algebras of vector fields, Hamiltonian functions, and related symplectic structures, invariants, and superposition rules.

  14. A multilevel control approach for a modular structured space platform

    Chichester, F. D.; Borelli, M. T.


    A three-axis mathematical representation of a modularly assembled space platform consisting of interconnected discrete masses, including a deployable truss module, was derived for digital computer simulation. The platform attitude control system was developed to provide multilevel control utilizing the Gauss-Seidel second-level formulation along with an extended form of linear quadratic regulator techniques. The objectives of the multilevel control are to decouple the space platform's spatial axes and to accommodate the modification of the platform's configuration for each of the decoupled axes.

  15. Weaponizing the Final Frontier: The United States and the New Space Race


    prepare to defend these systems from attack. The next logical step is the development and execution of this philosophy to secure national interests... A fourth argument impacting the weaponization of space is the question of morality. In the article "Moral and Ethical Decisions Regarding Space Warfare", Col (now General) John Hyten and Dr. Robert Uy describe the moral and ethical considerations to evaluate as the United States shapes

  16. Space, the final frontier: A critical review of recent experiments performed in microgravity.

    Vandenbrink, Joshua P; Kiss, John Z


    Space biology provides an opportunity to study plant physiology and development in a unique microgravity environment. Recent space studies with plants have provided interesting insights into plant biology, including discovering that plants can grow seed-to-seed in microgravity, as well as identifying novel responses to light. However, spaceflight experiments are not without their challenges, including limited space, limited access, and stressors such as lack of convection and cosmic radiation. Therefore, it is important to design experiments in a way to maximize the scientific return from research conducted on orbiting platforms such as the International Space Station. Here, we provide a critical review of recent spaceflight experiments and suggest ways in which future experiments can be designed to improve the value and applicability of the results generated. These potential improvements include: utilizing in-flight controls to delineate microgravity versus other spaceflight effects, increasing scientific return via next-generation sequencing technologies, and utilizing multiple genotypes to ensure results are not unique to one genetic background. Space experiments have given us new insights into plant biology. However, to move forward, special care should be given to maximize science return in understanding both microgravity itself as well as the combinatorial effects of living in space. Copyright © 2015. Published by Elsevier Ireland Ltd.

  17. Effect of Repeated/Spaced Formative Assessments on Medical School Final Exam Performance

    Edward K. Chang


    Discussion: Performance on weekly formative assessments was predictive of final exam scores. Struggling medical students will benefit from extra cumulative practice exams while students who are excelling do not need extra practice.

  18. The group approach to AdS space propagators

    Leonhardt, Thorsten; Manvelyan, Ruben; Ruehl, Werner


    We show that AdS two-point functions can be obtained by connecting two points in the interior of AdS space with one point on its boundary by a dual pair of Dobrev's boundary-to-bulk intertwiners and integrating over the boundary point

  19. Hybrid Enhanced Epidermal SpaceSuit Design Approaches

    Jessup, Joseph M.

    A space suit that does not rely on gas pressurization is a multi-faceted problem that requires major stability controls to be incorporated during design and construction. The Hybrid Epidermal Enhancement (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available biomedical technologies to address the shortcomings of conventional gas-pressure suits and the impracticalities of mechanical counter-pressure (MCP) suits. The prototype HEE space suit explored integumentary homeostasis, thermal control, and mobility using advanced biomedical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional biomimic of the human epidermal layer, working in attunement with the wearer rather than as a separate system. In addressing human physiological requirements for the design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and human subjects.

  20. An approach to developing user interfaces for space systems

    Shackelford, Keith; McKinney, Karen


    Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral model lifecycle, however, had not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.

  1. Real-space renormalization group approach to driven diffusive systems

    Hanney, T [SUPA and School of Physics, University of Edinburgh, Mayfield Road, Edinburgh, EH9 3JZ (United Kingdom); Stinchcombe, R B [Theoretical Physics, 1 Keble Road, Oxford, OX1 3NP (United Kingdom)


    We introduce a real-space renormalization group procedure for driven diffusive systems which predicts both steady state and dynamic properties. We apply the method to the boundary driven asymmetric simple exclusion process and recover exact results for the steady state phase diagram, as well as the crossovers in the relaxation dynamics for each phase.
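    For readers unfamiliar with the underlying model, a minimal Monte Carlo sketch of the boundary-driven TASEP (the asymmetric simple exclusion process referred to above) may help; all rates, lattice sizes, and sampling choices below are our own illustrative assumptions, not parameters from the paper. In the maximal-current phase (injection and ejection rates both above 1/2), the bulk density should settle near 1/2.

```python
import random

# Illustrative Monte Carlo sketch of the boundary-driven totally asymmetric
# simple exclusion process (TASEP). Rates, lattice size, and sampling
# windows are our assumptions, not parameters from the paper.

def tasep_bulk_density(n_sites=100, alpha=0.75, beta=0.75,
                       sweeps=5000, seed=1):
    """Estimate the steady-state bulk density of an open-boundary TASEP.

    Particles are injected at the left boundary at rate alpha, hop to the
    right into empty sites, and are ejected at the right boundary at rate
    beta (random-sequential updates).
    """
    random.seed(seed)
    lattice = [0] * n_sites
    density_acc, samples = 0.0, 0
    for sweep in range(sweeps):
        for _ in range(n_sites + 1):
            i = random.randrange(n_sites + 1)
            if i == 0:                               # injection attempt
                if lattice[0] == 0 and random.random() < alpha:
                    lattice[0] = 1
            elif i == n_sites:                       # ejection attempt
                if lattice[-1] == 1 and random.random() < beta:
                    lattice[-1] = 0
            elif lattice[i - 1] == 1 and lattice[i] == 0:
                lattice[i - 1], lattice[i] = 0, 1    # bulk hop to the right
        if sweep > sweeps // 2:                      # sample after relaxation
            mid = lattice[2 * n_sites // 5: 3 * n_sites // 5]
            density_acc += sum(mid) / len(mid)
            samples += 1
    return density_acc / samples
```

    For alpha = beta = 0.75 this sits in the maximal-current phase, so the returned bulk density should lie close to 0.5.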


  3. Quantitative approach to measuring the cerebrospinal fluid space with CT

    Zeumer, H.; Hacke, W.; Hartwich, P.


    A method for measuring the subarachnoid space using an independent CT evaluation unit is described. Normal values have been calculated for patients according to age, and three examples are presented demonstrating reversible decrease of brain volume in patients suffering from anorexia nervosa and chronic alcoholism.

  4. Long-Term Memory: A State-Space Approach

    Kiss, George R.


    Some salient concepts derived from the information sciences and currently used in theories of human memory are critically reviewed. The application of automata theory is proposed as a new approach in this field. The approach is illustrated by applying it to verbal memory. (Author)

  5. A Conceptual Approach for Optimising Bus Stop Spacing

    Johar, Amita; Jain, S. S.; Garg, P. k.


    An efficient public transportation system is essential for any country. The growth, development, and shape of urban areas are largely due to the availability of good transportation (Shah et al. in Inst Town Plan India J 5(3):50-59, 1). In developing countries like India, travel by local bus in a city is very common. Accidents, congestion, pollution, and the appropriate location of bus stops are major problems arising in metropolitan cities. Among all the metropolitan cities in India, Delhi has the highest growth in population and vehicles. It is therefore important to adopt efficient and effective ways to improve mobility in metropolitan cities in order to overcome these problems and reduce the number of private vehicles on the road. The primary objective of this paper is to present a methodology for developing a model for optimum bus stop spacing (OBSS). It describes the evaluation of an existing urban bus route, data collection, development of a model for optimizing the urban bus route, and application of the model. In this work, the bus passenger generalized cost method is used to optimize the spacing between bus stops. The applicability of the model was evaluated in a first phase using data from an urban bus route of the Delhi Transport Corporation (DTC) in an Excel sheet; later, an implementation in C++ is proposed. The developed model is expected to be useful to transport planners for the rational design of bus stop spacing to save travel time and generalized operating cost. The analysis found that the optimal spacing between bus stops comes out between 250 and 500 m. The proposed spacing also takes care that stops do not fall too close to metro/rail stations, flyover entries or exits, or traffic signals.
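    As an illustration of the generalized-cost idea (not the paper's actual model), a brief sketch: the cost per passenger trades walking time, which grows with stop spacing, against in-vehicle time, which shrinks as fewer stops are passed. All parameter values and the cost form below are invented for illustration.

```python
# Hypothetical sketch of the generalized-cost idea behind optimum bus stop
# spacing (OBSS). All parameter values and the cost form are illustrative
# assumptions, not taken from the paper.

def generalized_cost(spacing_m,
                     walk_speed_ms=1.2,      # average walking speed, m/s
                     value_of_time=1.0,      # cost units per second
                     stop_delay_s=20.0,      # dwell + decel/accel per stop
                     trip_length_m=5000.0,   # average in-vehicle trip length
                     bus_speed_ms=8.0):      # cruise speed between stops
    # Average walk to a stop is ~1/4 of the spacing, at each end of the trip.
    walk_time = (spacing_m / 4.0) * 2 / walk_speed_ms
    # In-vehicle time: cruise time plus a delay at each intermediate stop.
    stops_passed = trip_length_m / spacing_m
    ride_time = trip_length_m / bus_speed_ms + stops_passed * stop_delay_s
    return value_of_time * (walk_time + ride_time)

# Scan candidate spacings and keep the cheapest.
candidates = range(100, 1001, 50)
best = min(candidates, key=generalized_cost)
```

    Minimizing over a grid of candidate spacings picks out an optimum in the few-hundred-metre range, consistent in spirit with the 250-500 m result reported above.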

  6. Implementation of an Open-Scenario, Long-Term Space Debris Simulation Approach

    Nelson, Bron; Yang Yang, Fan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Henze, Chris; Karacalioglu, Arif Goktug; O'Toole, Conor; Swenson, Jason; Stupl, Jan


    This paper provides a status update on the implementation of a flexible, long-term space debris simulation approach. The motivation is to build a tool that can assess the long-term impact of various options for debris-remediation, including the LightForce space debris collision avoidance concept that diverts objects using photon pressure [9]. State-of-the-art simulation approaches that assess the long-term development of the debris environment use either completely statistical approaches, or they rely on large time steps on the order of several days if they simulate the positions of single objects over time. They cannot be easily adapted to investigate the impact of specific collision avoidance schemes or de-orbit schemes, because the efficiency of a collision avoidance maneuver can depend on various input parameters, including ground station positions and orbital and physical parameters of the objects involved in close encounters (conjunctions). Furthermore, maneuvers take place on timescales much smaller than days. For example, LightForce only changes the orbit of a certain object (aiming to reduce the probability of collision), but it does not remove entire objects or groups of objects. In the same sense, it is also not straightforward to compare specific de-orbit methods in regard to potential collision risks during a de-orbit maneuver. To gain flexibility in assessing interactions with objects, we implement a simulation that includes every tracked space object in Low Earth Orbit (LEO) and propagates all objects with high precision and variable time-steps as small as one second. It allows the assessment of the (potential) impact of physical or orbital changes to any object. The final goal is to employ a Monte Carlo approach to assess the debris evolution during the simulation time-frame of 100 years and to compare a baseline scenario to debris remediation scenarios or other scenarios of interest. To populate the initial simulation, we use the entire space
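    To make the variable-time-step idea concrete, a toy sketch (planar circular orbits only; all names, rates, and thresholds are our assumptions, not the tool's actual design): screen a pair of objects on a coarse time grid, then re-propagate any window where the separation drops below a screening distance using one-second steps.

```python
import math

# Illustrative sketch of variable-time-step conjunction screening, in the
# spirit of the approach described above. Orbits are simplified to planar
# circular motion; names and thresholds are our assumptions.

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def position(radius_m, phase_rad, t_s):
    """Planar circular-orbit position at time t."""
    n = math.sqrt(MU / radius_m ** 3)        # mean motion, rad/s
    a = phase_rad + n * t_s
    return (radius_m * math.cos(a), radius_m * math.sin(a))

def min_separation(obj_a, obj_b, t_end_s, coarse_dt=60.0, fine_dt=1.0,
                   screen_km=50.0):
    """Coarse scan; re-scan windows near close approaches with 1 s steps."""
    best = float("inf")
    t = 0.0
    while t <= t_end_s:
        ax, ay = position(*obj_a, t)
        bx, by = position(*obj_b, t)
        d = math.hypot(ax - bx, ay - by)
        if d < screen_km * 1000.0:           # close pair: refine this window
            tt = max(0.0, t - coarse_dt)
            while tt <= t + coarse_dt:
                ax, ay = position(*obj_a, tt)
                bx, by = position(*obj_b, tt)
                best = min(best, math.hypot(ax - bx, ay - by))
                tt += fine_dt
        best = min(best, d)
        t += coarse_dt
    return best
```

    Two objects on the same circular orbit half a revolution apart stay a constant two radii apart, while a small phase offset produces a close pair that triggers the fine re-scan.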

  7. A Continuum Mechanical Approach to Geodesics in Shape Space


    mean curvature flow equation. Calc. Var., 3:253–271, 1995. [30] Siddharth Manay, Daniel Cremers, Byung-Woo Hong, Anthony J. Yezzi, and Stefano Soatto... P. W. Michor and D. Mumford. Riemannian geometries on spaces of plane curves. J. Eur. Math. Soc., 8:1–48, 2006. [33] Peter W. Michor, David... Cremers. Shape matching by variational computation of geodesics on a manifold. In Pattern Recognition, LNCS 4174, pages 142–151, 2006. [38] P

  8. Analytical Approach to Space- and Time-Fractional Burgers Equations

    Yıldırım, Ahmet; Mohyud-Din, Syed Tauseef


    A scheme is developed to study the numerical solution of the space- and time-fractional Burgers equations under initial conditions by the homotopy analysis method. The fractional derivatives are considered in the Caputo sense. The solutions are given in the form of series with easily computable terms. Numerical solutions are calculated for the fractional Burgers equation to show the nature of the solution as the fractional derivative parameter is changed.
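    For context, the Caputo derivative referred to here has the standard definition (notation ours), and a time-fractional Burgers equation of the type studied can be written as:

```latex
% Caputo fractional derivative of order 0 < \alpha \le 1 (standard definition):
{}^{C}\!D_t^{\alpha} u(x,t)
  = \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-\tau)^{-\alpha}
    \frac{\partial u(x,\tau)}{\partial \tau}\, d\tau .
% A time-fractional Burgers equation of this type (\nu a viscosity):
{}^{C}\!D_t^{\alpha} u + u\, u_x = \nu\, u_{xx},
\qquad u(x,0) = u_0(x).
```

    At α = 1 the Caputo derivative reduces to the ordinary time derivative and the classical Burgers equation is recovered.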

  9. 13th Workshop on Radiation Monitoring for the International Space Station - Final Program


    The Workshop on Radiation Monitoring for the International Space Station (WRMISS) has been held annually since 1996. The major purpose of WRMISS is to provide a forum for discussion of technical issues concerning radiation dosimetry aboard the International Space Station. This includes discussion of new results, improved instrumentation, detector calibration, and radiation environment and transport models. The goal of WRMISS is to enhance international efforts to provide the best information on the space radiation environment in low-Earth orbit and on the exposure of astronauts and cosmonauts in order to optimize the radiation safety of the ISS crew. During the 13th Annual WRMISS, held at the Institute of Nuclear Physics (Krakow, Poland) on 8-10 September 2008, participants presented 47 lectures.

  10. Shutdown and degradation: Space computers for nuclear application, verification of radiation hardness. Final report

    Eichhorn, E.; Gerber, V.; Schreyer, P.


    (1) Employment of those radiation-hard electronics which are already known in military and space applications. (2) The experience in spaceflight shall be used to investigate nuclear technology areas, for example, by using space electronics to prove the range of applications in nuclear radiation environments. (3) Reproduction of a computer developed for telecommunication satellites; proof of radiation hardness by radiation tests. (4) At 328 krad (Si), first failure of radiation-tolerant devices with 100 krad (Si) hardness guaranteed. (5) Using radiation-hard devices of the same type, applications at doses greater than 1 Mrad (Si) can be expected. Electronic systems are applicable for radiation categories D, C, and the lower part of B for manipulators, vehicles, and underwater robotics.

  11. Second space Christmas for ESA: Huygens to begin its final journey to Titan/ Media activities.


    the morning of 25 December at about 05:08 CET. Since the Cassini orbiter will have to achieve precise pointing for the release, there will be no real-time telemetry available until it turns back its main antenna toward Earth and beams the recorded data of the release. It will take over an hour (67 min) for the signals to reach us on Earth. The final data confirming the separation will be available later on Christmas Day. After release, Huygens will move away from Cassini at a speed of about 35 cm per second and, to keep on track, will spin on its axis, making about 7 revolutions a minute. Huygens will not communicate with Cassini for the whole period until after deployment of the main parachute following entry into Titan’s atmosphere. On 28 December Cassini will then manoeuvre off collision course to resume its mission and prepare itself to receive Huygens data, which it will record for later playback to Earth. Huygens will remain dormant until a few hours before its arrival at Titan on 14 January. The entry into the atmosphere is set for 11:15 CET. Huygens is planned to complete its descent in about two hours and 15 minutes, beaming back its science data to the Cassini orbiter for replay to Earth later in the afternoon. If Huygens, which is designed as an atmospheric probe rather than a lander, survives touchdown on the surface, it could deliver up to 2 hours of bonus data before the link with Cassini is lost. Direct radio signals from Huygens will reach Earth after 67 minutes of interplanetary travel at the speed of light. An experiment has been set up by radio scientists that will use an array of radio telescopes around the Pacific to attempt to detect a faint tone from Huygens. If successful, early detection is not expected before around 11:30 CET. The European Space Agency owns and manages the Huygens probe and is in charge of operations of the probe from its control centre in Darmstadt, Germany. NASA's Jet Propulsion Laboratory in Pasadena, California

  12. Stable isotopes to trace food web stressors: Is space the final frontier?

    To support community decision-making, we need to evaluate sources of stress and impact at a variety of spatial scales, whether local or watershed-based. Increasingly, we are using stable isotope-based approaches to determine those scales of impact, and using these approaches in v...

  13. Space Acquisitions: Challenges Facing DOD as it Changes Approaches to Space Acquisitions


    alternatives to support decisions about the future of space programs, there are gaps in cost and other data needed to weigh the pros and cons of changes to...preliminary work suggests there are gaps in cost and other data needed to weigh the pros and cons of changes to space systems. Second, most changes...

  14. Development of an international safeguards approach to the final disposal of spent fuel in geological repositories

    Murphey, W.M.; Moran, B.W.; Fattah, A.


    The International Atomic Energy Agency (IAEA) is currently pursuing development of an international safeguards approach for the final disposal of spent fuel in geological repositories through consultants meetings and through the Program for Development of Safeguards for Final Disposal of Spent Fuel in Geological Repositories (SAGOR). The consultants meetings provide policy guidance to IAEA; SAGOR recommends effective approaches that can be efficiently implemented by IAEA. The SAGOR program, which is a collaboration of eight Member State Support Programs (MSSPs), was initiated in July 1994 and has identified 15 activities in each of three areas (i.e. conditioning facilities, active repositories, and closed repositories) that must be performed to ensure an efficient, yet effective safeguards approach. Two consultants meetings have been held: the first in May 1991 and the last in November 1995. For nuclear materials emplaced in a geological repository, the safeguards objectives were defined to be (1) to detect the diversion of spent fuel, whether concealed or unconcealed, from the repository and (2) to detect undeclared activities of safeguards concern (e.g., tunneling, underground reprocessing, or substitution in containers)

  15. Learning the Task Management Space of an Aircraft Approach Model

    Krall, Joseph; Menzies, Tim; Davies, Misty


    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  16. Coset Space Dimensional Reduction approach to the Standard Model

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.


    We present a unified theory in ten dimensions based on the gauge group E8, which is dimensionally reduced to the Standard Model SU(3)c × SU(2)L × U(1), which breaks further spontaneously to SU(3)c × U(1)em. The model gives similar predictions for sin²θW and proton decay as the minimal SU(5) GUT, while a natural choice of the coset space radii predicts light Higgs masses à la Coleman-Weinberg.

  17. Technical approach to finalizing sensible soil cleanup levels at the Fernald Environmental Management Project

    Carr, D.; Hertel, B.; Jewett, M.; Janke, R.; Conner, B.


    The remedial strategy for addressing contaminated environmental media was recently finalized for the US Department of Energy's (DOE) Fernald Environmental Management Project (FEMP) following almost 10 years of detailed technical analysis. The FEMP represents one of the first major nuclear facilities to successfully complete the Remedial Investigation/Feasibility Study (RI/FS) phase of the environmental restoration process. A critical element of this success was the establishment of sensible cleanup levels for contaminated soil and groundwater both on and off the FEMP property. These cleanup levels were derived based upon a strict application of Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) regulations and guidance, coupled with positive input from the regulatory agencies and the local community regarding projected future land uses for the site. The approach for establishing the cleanup levels was based upon a Feasibility Study (FS) strategy that examined a bounding range of viable future land uses for the site. Within each land use, the cost and technical implications of a range of health-protective cleanup levels for the environmental media were analyzed. Technical considerations driving these cleanup levels included: direct exposure routes to viable human receptors; cross-media impacts to air, surface water, and groundwater; technical practicality of attaining the levels; volume of affected media; impact to sensitive environmental receptors or ecosystems; and cost. This paper will discuss the technical approach used to support the finalization of the cleanup levels for the site. The final cleanup levels provide the last remaining significant piece of the puzzle of establishing a final site-wide remedial strategy for the FEMP, and position the facility for the expedient completion of site-wide remedial activities.

  18. NASA Research Announcement Phase 2 Final Report for the Development of a Power Assisted Space Suit Glove

    Lingo, Robert; Cadogan, Dave; Sanner, Rob; Sorenson, Beth


    The main goal of this program was to develop an unobtrusive power-assisted EVA glove metacarpophalangeal (MCP) joint that could provide the crew member with as close to nude body performance as possible, and to demonstrate the technology feasibility of power-assisted space suit components in general. The MCP joint was selected due to its being representative of other space suit joints, such as the shoulder, hip, and carpometacarpal joint, that would also greatly benefit from this technology. In order to meet this objective, a development team of highly skilled and experienced personnel was assembled. The team consisted of two main entities. The first was comprised of ILC's experienced EVA space suit glove designers, who had the responsibility of designing and fabricating a low-torque MCP joint which would be compatible with power-assisted technology. The second part of the team consisted of space robotics experts from the University of Maryland's Space Systems Laboratory. This team took on the responsibility of designing and building the robotics aspects of the power-assist system. Both parties addressed final system integration responsibilities.

  19. Solar space and water heating system at Stanford University Central Food Services Building. Final report


    This active hydronic domestic hot water and space heating system uses 840 ft² of single-glazed, liquid, flat-plate collectors and 1550-gal heat storage tanks. The following are discussed: energy conservation, design philosophy, operation, acceptance testing, performance data, collector selection, bidding, costs, economics, problems, and recommendations. An operation and maintenance manual and as-built drawings are included in appendices. (MHR)

  20. A feasibility study of space-charge neutralized ion induction linacs: Final report

    Slutz, S.A.; Primm, P.; Renk, T.; Johnson, D.J.


    Applications for high current (> 1 kA) ion beams are increasing. They include hardening of material surfaces, transmutation of radioactive waste, cancer treatment, and possibly driving fusion reactions to create energy. The space-charge of ions limits the current that can be accelerated in a conventional ion linear accelerator (linac). Furthermore, the accelerating electric field must be kept low enough to avoid the generation and acceleration of counter-streaming electrons. These limitations have resulted in ion accelerator designs that employ long beam lines and would be expensive to build. Space-charge neutralization and magnetic insulation of the acceleration gaps could substantially reduce these two limitations, but at the expense of increasing the complexity of the beam physics. We present theory and experiments to determine the degree of charge neutralization that can be achieved in various environments found in ion accelerators. Our results suggest that, for high current applications, space-charge neutralization could be used to improve on conventional ion accelerator technology. There are two basic magnetic field geometries that can be used to insulate the accelerating gaps, a radial field or a cusp field. We present studies related to both of these geometries. We also present numerical simulations of a "multicusp" accelerator that would deliver potassium ions at 400 MeV with a total beam power of approximately 40 TW. Such an accelerator could be used to drive fusion.

  1. Space radiation studies. Final report, 22 July 1983-30 June 1989


    Two Active Radiation Dosimeters (ARD's) flown on Spacelab 1, performed without fault and were returned to Space Science Laboratory, MSFC for recalibration. During the flight, performance was monitored at the Huntsville Operations Center (HOSC). Despite some problems with the Shuttle data system handling the verification flight instrumentation (VFI), it was established that the ARD's were operating normally. Postflight calibrations of both units determined that sensitivities were essentially unchanged from preflight values. Flight tapes were received for approx. 60 percent of the flight and it appears that this is the total available. The data was analyzed in collaboration with Space Science Laboratory, MSFC. Also, the Nuclear Radiation Monitor (NRM) was assembled and tested at MSFC. Support was rendered in the areas of materials control and parts were supplied for the supplementary heaters, dome gas-venting device and photomultiplier tube housing. Performance characteristics of some flight-space photomultipliers were measured. The NRM was flown on a balloon-borne test flight and subsequently performed without fault on Spacelab-2. This data was analyzed and published

  2. High-efficiency pump for space helium transfer. Final Technical Report

    Hasenbein, R.; Izenson, M.G.; Swift, W.L.; Sixsmith, H.


    A centrifugal pump was developed for the efficient and reliable transfer of liquid helium in space. The pump can be used to refill cryostats on orbiting satellites which use liquid helium for refrigeration at extremely low temperatures. The pump meets the head and flow requirements of on-orbit helium transfer: a flow rate of 800 L/hr at a head of 128 J/kg. The overall pump efficiency at the design point is 0.45. The design head and flow requirements are met with zero net positive suction head, which is the condition in an orbiting helium supply Dewar. The mass transfer efficiency calculated for a space transfer operation is 0.99. Steel ball bearings are used with gas fiber-reinforced teflon retainers to provide solid lubrication. These bearings have demonstrated the longest life in liquid helium endurance tests under simulated pumping conditions. Technology developed in the project also has application for liquid helium circulation in terrestrial facilities and for transfer of cryogenic rocket propellants in space

  3. Approach to design space from retrospective quality data.

    Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon


    Nowadays, manufacturing is governed by the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. This work addresses the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of a previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Statgraphics 5.0 software was applied, and the data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine a DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; the practicality of this study is nevertheless considerable, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.

  4. Space and Time as Relations: The Theoretical Approach of Leibniz

    Basil Evangelidis


    The epistemological rupture of Copernicus, the laws of planetary motions of Kepler, the comprehensive physical observations of Galileo and Huygens, the conception of relativity, and the physical theory of Newton were components of an extremely fertile and influential cognitive environment that prompted the restless Leibniz to shape an innovative theory of space and time. This theory expressed some of the concerns and intuitions of the scientific community of the seventeenth century, in particular the scientific group of the Academy of Sciences of Paris, but remained relatively unknown until the twentieth century. After Einstein, however, the relational theory of Leibniz gained wider respect and fame. The aim of this article is to explain how Leibniz foresaw relativity, through his critique of contemporary mechanistic philosophy.

  5. Advanced free space optics (FSO) a systems approach

    Majumdar, Arun K


    This book provides a comprehensive, unified tutorial covering the most recent advances in the technology of free-space optics (FSO). It is an all-inclusive source of information on the fundamentals of FSO as well as up-to-date information on the state of the art in technologies available today. The text is intended for graduate students and will also be useful for research scientists and engineers with an interest in the field. FSO communication is a practical solution for creating a three-dimensional global broadband communications grid, offering bandwidths far beyond what is possible in the radio frequency (RF) range. However, atmospheric turbulence and scattering impose perennial limitations on the availability and reliability of FSO links. From a systems point of view, this groundbreaking book provides a thorough understanding of channel behavior, which can be used to design and evaluate optimum transmission techniques that operate under realistic atmospheric conditions. Topics addressed...

  6. Accessibility of green space in urban areas: an examination of various approaches to measure it

    Zhang, Xin


    In the present research, we attempt to improve the methods used for measuring accessibility of green spaces by combining two components of accessibility-distance and demand relative to supply. Three modified approaches (Joseph and Bantock gravity model measure, the two-step floating catchment area measure and a measure based on kernel densities) will be applied for measuring accessibility to green spaces. We select parks and public open spaces (metropolitan open land) of south London as a cas...

  7. A state space approach for the eigenvalue problem of marine risers

    Alfosail, Feras; Nayfeh, Ali H.; Younis, Mohammad I.


    A numerical state-space approach is proposed to examine the natural frequencies and critical buckling limits of marine risers. A large axial tension in the riser model causes numerical limitations. These limitations are overcome by using
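Although the abstract is truncated, the general shape of a state-space eigenvalue formulation can be sketched for a generic tensioned beam. The sketch below (all parameter values are illustrative placeholders, not values from the paper) discretizes a pinned-pinned beam under axial tension with finite differences, casts the equation of motion in first-order state-space form, and reads the natural frequencies off the eigenvalues:

```python
import numpy as np

# Sketch (not the paper's method): natural frequencies of a tensioned,
# pinned-pinned beam, EI*w'''' - T*w'' = -m*w_tt, via finite differences.
EI, T, m, L = 1.0, 2.0, 1.0, 1.0   # bending stiffness, tension, mass/length, length
N = 200                             # interior grid points
h = L / (N + 1)

# Second- and fourth-difference operators on interior points.
I = np.eye(N)
D2 = (np.diag(np.full(N - 1, 1.0), -1) - 2 * I
      + np.diag(np.full(N - 1, 1.0), 1)) / h**2
D4 = D2 @ D2                        # valid for pinned ends (w = w'' = 0)

K = EI * D4 - T * D2                # stiffness operator (positive definite)
# First-order (state-space) form: d/dt [w, v] = [[0, I], [-K/m, 0]] [w, v]
A = np.block([[np.zeros((N, N)), I], [-K / m, np.zeros((N, N))]])
# Eigenvalues come in conjugate pairs +/- i*omega; keep one per pair.
freqs = np.sort(np.abs(np.linalg.eigvals(A).imag))[::2]   # rad/s

# Analytic check: omega_n^2 = (EI*k^4 + T*k^2)/m with k = n*pi/L
k1 = np.pi / L
omega1_exact = np.sqrt((EI * k1**4 + T * k1**2) / m)
print(freqs[0], omega1_exact)
```

The lowest computed frequency agrees with the closed-form value to well under a percent at this grid resolution; a buckling analysis would instead track where the smallest eigenvalue of K crosses zero as T is varied.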

  8. Proper Motions of Dwarf Spheroidal Galaxies from Hubble Space Telescope Imaging. V. Final Measurement for Fornax

    Piatek, Slawomir; Pryor, Carlton; Bristow, Paul; Olszewski, Edward W.; Harris, Hugh C.; Mateo, Mario; Minniti, Dante; Tinney, Christopher G.


    The measured proper motion of Fornax, expressed in the equatorial coordinate system, is (μ_α, μ_δ) = (47.6 ± 4.6, -36.0 ± 4.1) mas century⁻¹. This proper motion is a weighted mean of four independent measurements for three distinct fields. Each measurement uses a quasi-stellar object as a reference point. Removing the contribution of the motion of the Sun and of the local standard of rest to the measured proper motion produces a Galactic rest-frame proper motion of (μ_α^Grf, μ_δ^Grf) = (24.4 ± 4.6, -14.3 ± 4.1) mas century⁻¹. The implied space velocity with respect to the Galactic center has a radial component of V_r = -31.8 ± 1.7 km s⁻¹ and a tangential component of V_t = 196 ± 29 km s⁻¹. Integrating the motion of Fornax in a realistic potential for the Milky Way produces orbital elements. The perigalacticon and apogalacticon are 118 (66, 137) and 152 (144, 242) kpc, respectively, where the values in parentheses represent the 95% confidence intervals derived from Monte Carlo experiments. The eccentricity of the orbit is 0.13 (0.11, 0.38), and the orbital period is 3.2 (2.5, 4.6) Gyr. The orbit is retrograde and inclined by 101° (94°, 107°) to the Galactic plane. Fornax could be a member of a proposed "stream" of galaxies and globular clusters; however, the membership of another proposed galaxy in the stream, Sculptor, has previously been ruled out. Fornax is in the Kroupa-Theis-Boily plane, which contains 11 of the Galactic satellite galaxies, but its orbit will take it out of that plane. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.
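As a rough consistency check (not from the paper), the Galactic-rest-frame proper motion quoted above can be converted to a tangential velocity with the standard relation V_t [km/s] = 4.74 μ [arcsec/yr] d [pc], using an assumed Fornax distance of about 138 kpc, a value not given in the abstract:

```python
import math

# Consistency check: rest-frame proper motion (from the abstract) plus an
# ASSUMED distance of ~138 kpc should roughly reproduce V_t ~ 196 km/s.
mu_alpha, mu_delta = 24.4, -14.3                            # mas/century
mu_total_mas_yr = math.hypot(mu_alpha, mu_delta) / 100.0    # mas/yr
distance_pc = 138_000.0                                     # assumed, pc

# V_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]
v_t = 4.74 * (mu_total_mas_yr / 1000.0) * distance_pc
print(f"V_t = {v_t:.0f} km/s")
```

Under this assumed distance the result is about 185 km/s, within one sigma of the quoted 196 ± 29 km s⁻¹.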

  9. Simulation of space radiation effects on polyimide film materials for high temperature applications. Final report

    Fogdall, L.B.; Cannaday, S.S.


    Space environment effects on candidate materials for the solar sail film are determined. Polymers, including metallized polyimides that might be suitable solar radiation receivers, were exposed to combined proton and solar electromagnetic radiation. Each test sample was weighted to simulate the tension on the polymer when it is stretched into near-planar shape while receiving solar radiation. Exposure rates up to 16 times that expected in Earth orbit were employed to simulate near-sun solar sailing conditions. Sample appearance, elongation, and shrinkage were monitored and documented in situ. Thermosetting polyimides showed less degradation or visual change in appearance than thermoplastics.

  10. A Belief-Space Approach to Integrated Intelligence - Research Area 10.3: Intelligent Networks


    A Belief-Space Approach to Integrated Intelligence - Research Area 10.3: Intelligent Networks. Institution: Massachusetts Institute of Technology (MIT). Students presented progress and received feedback from the research group, and wrote papers on their research and submitted them to leading conferences.

  11. Development of a coal fired pulse combustor for residential space heating. Phase I, Final report



    This report presents the results of the first phase of a program for the development of a coal-fired residential combustion system. This phase consisted of the design, fabrication, testing, and evaluation of an advanced pulse combustor sized for residential space heating requirements. The objective was to develop an advanced pulse coal combustor at the approximately 100,000 Btu/hr scale that can be integrated into a packaged space heating system for small residential applications. The strategy for the development effort included the scale-down of the feasibility unit from 1-2 MMBtu/hr to 100,000 Btu/hr to establish a baseline for isolating the effects of scale-down and new chamber configurations separately. Initial focus at the residential scale was concentrated on methods of fuel injection and atomization in a bare metal unit. This was followed by incorporating changes to the advanced chamber designs and testing of refractory-lined units. Multi-fuel capability for firing oil or gas as a secondary fuel was also established. Upon completion of the configuration and component testing, an optimum configuration would be selected for integrated testing of the pulse combustor unit. The strategy also defined the use of Dry Ultrafine Coal (DUC) for Phases 1 and 2 of the development program, with CWM firing to be a product improvement activity for a later phase of the program.

  12. A Cost Effective System Design Approach for Critical Space Systems

    Abbott, Larry Wayne; Cox, Gary; Nguyen, Hai


    NASA-JSC required an avionics platform capable of serving a wide range of applications in a cost-effective manner. In part, making the avionics platform cost effective means adhering to open standards and supporting the integration of COTS products with custom products. Inherently, operation in space requires low power, mass, and volume while retaining high performance, reconfigurability, scalability, and upgradability. The Universal Mini-Controller project is based on a modified PC/104-Plus architecture while maintaining full compatibility with standard COTS PC/104 products. The architecture consists of a library of building-block modules, which can be mixed and matched to meet a specific application. A set of NASA-developed core building blocks - a processor card, an analog input/output card, and a Mil-Std-1553 card - has been constructed to meet critical functions and unique interfaces. The design of the processor card is based on the PowerPC architecture. This architecture provides an excellent balance between power consumption and performance, and has an upgrade path to the forthcoming radiation-hardened PowerPC processor. The processor card, which makes extensive use of surface mount technology, has a 166 MHz PowerPC 603e processor, 32 Mbytes of error-detected-and-corrected RAM, 8 Mbytes of Flash, and 1 Mbyte of EPROM on a single PC/104-Plus card. Similar densities have been achieved with the quad-channel Mil-Std-1553 card and the analog input/output cards. The power management built into the processor and its peripheral chip allows the power and performance of the system to be adjusted to meet the requirements of the application, adding another dimension to the flexibility of the Universal Mini-Controller. Unique mechanical packaging allows the Universal Mini-Controller to accommodate standard COTS and custom oversized PC/104-Plus cards. This mechanical packaging also provides thermal management via conductive cooling of COTS boards, which are typically

  13. Gravity Probe B: final results of a space experiment to test general relativity.

    Everitt, C W F; DeBra, D B; Parkinson, B W; Turneaure, J P; Conklin, J W; Heifetz, M I; Keiser, G M; Silbergleit, A S; Holmes, T; Kolodziejczak, J; Al-Meshari, M; Mester, J C; Muhlfelder, B; Solomonik, V G; Stahl, K; Worden, P W; Bencze, W; Buchman, S; Clarke, B; Al-Jadaan, A; Al-Jibreen, H; Li, J; Lipa, J A; Lockhart, J M; Al-Suwaidan, B; Taber, M; Wang, S


    Gravity Probe B, launched 20 April 2004, is a space experiment testing two fundamental predictions of Einstein's theory of general relativity (GR), the geodetic and frame-dragging effects, by means of cryogenic gyroscopes in Earth orbit. Data collection started 28 August 2004 and ended 14 August 2005. Analysis of the data from all four gyroscopes results in a geodetic drift rate of -6601.8 ± 18.3 mas/yr and a frame-dragging drift rate of -37.2 ± 7.2 mas/yr, to be compared with the GR predictions of -6606.1 mas/yr and -39.2 mas/yr, respectively ("mas" is milliarcsecond; 1 mas = 4.848×10⁻⁹ rad).
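Since the abstract gives both the measured drift rates and the GR predictions, the level of agreement can be checked with simple arithmetic; this sketch uses only numbers quoted in the abstract, including the 1 mas = 4.848×10⁻⁹ rad conversion:

```python
# Comparing Gravity Probe B results with GR predictions, expressed in
# units of the quoted 1-sigma uncertainties (all numbers from the abstract).
MAS_TO_RAD = 4.848e-9   # 1 milliarcsecond in radians

geodetic_meas, geodetic_err, geodetic_gr = -6601.8, 18.3, -6606.1   # mas/yr
framedrag_meas, framedrag_err, framedrag_gr = -37.2, 7.2, -39.2     # mas/yr

geodetic_sigma = abs(geodetic_meas - geodetic_gr) / geodetic_err
framedrag_sigma = abs(framedrag_meas - framedrag_gr) / framedrag_err

print(f"geodetic: {geodetic_sigma:.2f} sigma from GR; "
      f"rate = {geodetic_meas * MAS_TO_RAD:.3e} rad/yr")
print(f"frame-dragging: {framedrag_sigma:.2f} sigma from GR")
```

Both effects agree with the GR predictions to well within one standard deviation.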

  14. NIAC Phase I Study Final Report on Large Ultra-Lightweight Photonic Muscle Space Structures

    Ritter, Joe


    The research goal is to develop new tools to support NASA's mission of understanding the cosmos by developing cost-effective solutions that yield a leap in performance and science data. 'Maikalani' in Hawaiian translates to "knowledge we gain from the cosmos." Missions like Hubble have fundamentally changed humanity's view of the cosmos. Last year's Nobel Prize in physics was a result of astronomical discoveries. $9B-class, JWST-size (6.5 meter diameter) space telescopes, when launched, are anticipated to rewrite our knowledge of physics. Here we report on a neoteric meta-material telescope mirror technology designed to enable a factor of 100 or more reduction in areal density and a factor of 100 reduction in telescope production and launch costs, as well as other advantages; a leap to enable missions to image the cosmos in unprecedented detail, with the associated gain in knowledge. Whether terahertz, visible, or X-ray, reflectors used for high quality electromagnetic imaging require shape accuracy (surface figure) far better than 1 wavelength (lambda) of the incident photons, more typically lambda/10 or better. Imaging visible light therefore requires mirror surfaces that approximate a desired curve (e.g. a sphere or paraboloid) with smooth shape deviation of less than approximately 1/1000 the diameter of a human hair. This requires either thick high-modulus material like glass or metal, or actuators to control mirror shape. During Phase I our team studied a novel solution to this systems-level mass/shape tradespace requirement, both to advance the innovative space technology concept and to help NASA and other agencies meet current operational and future mission requirements. Extreme and revolutionary NASA imaging missions such as the Terrestrial Planet Imager (TPI) require lightweight mirrors with minimum diameters of 20 to 40 meters. For reference, NASA's great achievement, the Hubble Space Telescope, is only 2.4 meters in diameter.
What is required is a

  15. Free-piston Stirling engine conceptual design and technologies for space power, Phase 1. Final Report

    Penswick, L.B.; Beale, W.T.; Wood, J.G.


    As part of the SP-100 program, a phase 1 effort to design a free-piston Stirling engine (FPSE) for a space dynamic power conversion system was completed. SP-100 is a combined DOD/DOE/NASA program to develop nuclear power for space. This work was completed in the initial phases of the SP-100 program prior to the power conversion concept selection for the Ground Engineering System (GES). Stirling engine technology development as a growth option for SP-100 is continuing after this phase 1 effort. Following a review of various engine concepts, a single-cylinder engine with a linear alternator was selected for the remainder of the study. The relationships of specific mass and efficiency versus temperature ratio were determined for a power output of 25 kWe. This parametric study was done for a temperature ratio range of 1.5 to 2.0 and for hot-end temperatures of 875 K and 1075 K. A conceptual design of a 1080 K FPSE with a linear alternator producing 25 kWe output was completed. This was a single-cylinder engine designed for a 62,000 hour life and a temperature ratio of 2.0. The heat transport systems were pumped liquid-metal loops on both the hot and cold ends. These specifications were selected to match the SP-100 power system designs that were being evaluated at that time. The hot end of the engine used both refractory and superalloy materials; the hot-end pressure vessel featured an insulated design that allowed use of the superalloy material. The design was supported by the hardware demonstration of two of the component concepts - the hydrodynamic gas bearing for the displacer and the dynamic balance system. The hydrodynamic gas bearing was demonstrated on a test rig. The dynamic balance system was tested on the 1 kW RE-1000 engine at NASA Lewis

  16. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth


    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks.

  17. Spatial Polygamy and Contextual Exposures (SPACEs): Promoting Activity Space Approaches in Research on Place and Health

    Matthews, Stephen A.; Yang, Tse-Chuan


    Exposure science has developed rapidly and there is an increasing call for greater precision in the measurement of individual exposures across space and time. Social science interest in an individual’s environmental exposure, broadly conceived, has arguably been quite limited conceptually and methodologically. Indeed, we appear to lag behind our exposure science colleagues in our theories, data, and methods. In this paper we discuss a framework based on the concept of spatial polygamy to demonstrate the need to collect new forms of data on human spatial behavior and contextual exposures across time and space. Adopting new data and methods will be essential if we want to better understand social inequality in terms of exposure to health risks and access to health resources. We discuss the opportunities and challenges, focusing on the potential offered by human mobility and specifically the utilization of activity space concepts and data. A goal of the paper is to spatialize social and health science concepts and research practice vis-à-vis the complexity of exposure. The paper concludes with some recommendations for future research focusing on theoretical and conceptual development, promoting research on new types of places and human movement, the dynamic nature of contexts, and on training. “When we elect, wittingly or unwittingly, to work within a level … we tend to discern or construct – whichever emphasis you prefer – only those kinds of systems whose elements are confined to that level.” Otis Dudley Duncan (1961, p. 141). “…despite the new ranges created by improved transportation, local government units have tended to remain medieval in size.” Torsten Hägerstrand (1970, p. 18). “A detective investigating a crime needs both tools and understanding. If he has no fingerprint powder, he will fail to find fingerprints on most surfaces. If he does not understand where the criminal is likely to have put his fingers, he will not

  18. Quantifying space, understanding minds: A visual summary approach

    Mark Simpson


    This paper presents an illustrated, validated taxonomy of research that compares spatial measures to human behavior. Spatial measures quantify the spatial characteristics of environments, such as the centrality of intersections in a street network or the accessibility of a room in a building from all the other rooms. While spatial measures have been of interest to spatial sciences, they are also of importance in the behavioral sciences for use in modeling human behavior. A high correlation between values for spatial measures and specific behaviors can provide insights into an environment's legibility, and contribute to a deeper understanding of human spatial cognition. Research in this area takes place in several domains, which makes a full understanding of existing literature difficult. To address this challenge, we adopt a visual summary approach. Literature is analyzed, and recurring topics are identified and validated with independent inter-rater agreement tasks in order to create a robust taxonomy for spatial measures and human behavior. The taxonomy is then illustrated with a visual representation that allows for at-a-glance visual access to the content of individual research papers in a corpus. A public web interface has been created that allows interested researchers to add to the database and create visual summaries for their research papers using our taxonomy.

  19. A behavioral approach to shared mapping of peripersonal space between oneself and others.

    Teramoto, Wataru


    Recent physiological studies have shown that some visuotactile brain areas respond to others' peripersonal spaces (PPS) as they would to their own. This study investigates this PPS remapping phenomenon in terms of human behavior. Participants placed their left hands on a tabletop screen where visual stimuli were projected. A vibrotactile stimulator was attached to the tip of their index finger. While a white disk approached or receded from the hand in the participant's near or far space, the participant was instructed to quickly detect a target (vibrotactile stimulation, a change in the moving disk's color, or both). When performing this task alone, participants exhibited shorter detection times when the disk approached the hand in their near space. In contrast, when performing the task with a partner across the table, participants exhibited shorter detection times both when the disk approached their own hand in their near space and when it approached the partner's hand in the partner's near space but the participants' far space. This phenomenon was also observed when the body parts from which the visual stimuli approached/receded differed between the participant and partner. These results suggest that humans can share PPS representations and/or body-derived attention/arousal mechanisms with others.

  20. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    Costa, Marco; A. Manuela Gonçalves


    This work discusses statistical approaches that combine multivariate statistical techniques and time series analysis in order to describe and model spatial patterns and temporal evolution, based on hydrological series of water quality variables recorded in time and space. These approaches are illustrated with a data set collected in the River Ave hydrological basin, located in the Northwest region of Portugal.

  1. Direct utilization of geothermal energy for space and water heating at Marlin, Texas. Final report

    Conover, M.F.; Green, T.F.; Keeney, R.C.; Ellis, P.F. II; Davis, R.J.; Wallace, R.C.; Blood, F.B.


    The Torbett-Hutchings-Smith Memorial Hospital geothermal heating project, which is one of nineteen direct-use geothermal projects funded principally by DOE, is documented. The five-year project encompassed a broad range of technical, institutional, and economic activities including: resource and environmental assessments; well drilling and completion; system design, construction, and monitoring; economic analyses; public awareness programs; materials testing; and environmental monitoring. Some of the project conclusions are that: (1) the 155°F Central Texas geothermal resource can support additional geothermal development; (2) private-sector economic incentives currently exist, especially for profit-making organizations, to develop and use this geothermal resource; (3) potential uses for this geothermal resource include water and space heating, poultry dressing, natural cheese making, fruit and vegetable dehydrating, soft-drink bottling, synthetic-rubber manufacturing, and furniture manufacturing; (4) high maintenance costs arising from the geofluid's scaling and corrosion tendencies can be avoided through proper analysis and design; (5) a production system which uses a variable-frequency drive system to control production rate is an attractive means of conserving parasitic pumping power, controlling production rate to match heating demand, conserving the geothermal resource, and minimizing environmental impacts.

  2. Final Environmental Impact Statement (EIS) for the Space Nuclear Thermal Propulsion (SNTP) program


    A program has been proposed to develop the technology and demonstrate the feasibility of a high-temperature particle bed reactor (PBR) propulsion system to be used to power an advanced second stage nuclear rocket engine. The purpose of this Final Environmental Impact Statement (FEIS) is to assess the potential environmental impacts of component development and testing, construction of ground test facilities, and ground testing. Major issues and goals of the program include the achievement and control of predicted nuclear power levels; the development of materials that can withstand the extremely high operating temperatures and hydrogen flow environments; and the reliable control of cryogenic hydrogen and hot gaseous hydrogen propellant. The testing process is designed to minimize radiation exposure to the environment. Environmental impact and mitigation planning are included for the following areas of concern: (1) Population and economy; (2) Land use and infrastructure; (3) Noise; (4) Cultural resources; (5) Safety (non-nuclear); (6) Waste; (7) Topography; (8) Geology; (9) Seismic activity; (10) Water resources; (11) Meteorology/Air quality; (12) Biological resources; (13) Radiological normal operations; (14) Radiological accidents; (15) Soils; and (16) Wildlife habitats.

  3. Solar space heating for the visitors' center, Stephens College, Columbia, Missouri. Final report

    Henley, Marion


    This document is the final report of the solar energy system located at the Visitors' Center on the Stephens College Campus, Columbia, Missouri. The system is installed in a four-story, 15,000 square foot building designed to include the college's Admission Office, nine guest rooms for overnight lodging for official guests of the college, a two-story art gallery, and a Faculty Lounge. The solar energy system is an integral design of the building and utilizes 176 Honeywell/Lennox hydronic flat-plate collectors which use a 50% water-ethylene glycol solution and water-to-water heat exchanger. Solar heated water is stored in a 5000 gallon water storage tank located in the basement equipment room. A natural gas fired hot water boiler supplies hot water when the solar energy heat supply fails to meet the demand. The designed solar contribution is 71% of the heating load. The demonstration period for this project ends June 30, 1984.

  4. Final Report for 'Design calculations for high-space-charge beam-to-RF conversion'

    Smithe, David N.


    Accelerator facility upgrades, new accelerator applications, and future design efforts are leading to novel klystron and IOT device concepts, including multiple-beam and high-order-mode operation and new geometry configurations of old concepts. At the same time, a new simulation capability, based upon finite-difference 'cut-cell' boundaries, has emerged and is transforming the existing modeling and design capability with unparalleled realism, greater flexibility, and improved accuracy. This same new technology can also be brought to bear on a difficult-to-study aspect of the energy recovery linac (ERL), namely the accurate modeling of the exit beam and the design of the beam dump for optimum energy efficiency. We have developed new capability for design calculations and modeling of a broad class of devices which convert bunched-beam kinetic energy to RF energy, including RF sources such as klystrons, gyro-klystrons, IOTs, TWTs, and other devices in which space-charge effects are important. Recent advances in geometry representation now permit very accurate representation of the curved metallic surfaces common to RF sources, resulting in unprecedented simulation accuracy. In the Phase I work, we evaluated and demonstrated the capabilities of the new geometry representation technology as applied to modeling and design of output cavity components of klystrons, IOTs, and energy-recovery SRF cavities. We identified and prioritized which aspects of the design study process to pursue and improve in Phase II. The development and use of the new accurate geometry modeling technology on RF sources for DOE accelerators will help spark a new generation of modeling and design capability, free from many of the constraints and inaccuracies associated with the previous generation of 'stair-step' geometry modeling tools. This new capability is ultimately expected to impact all fields with high-power RF sources, including DOE fusion research, communications, radar and other

  5. Space Station Freedom - Configuration management approach to supporting concurrent engineering and total quality management. [for NASA Space Station Freedom Program

    Gavert, Raymond B.


    Some experiences of NASA configuration management in providing concurrent engineering support to the Space Station Freedom program for the achievement of life cycle benefits and total quality are discussed. Three change decision experiences involving tracing requirements and automated information systems of the electrical power system are described. The potential benefits of concurrent engineering and total quality management include improved operational effectiveness, reduced logistics and support requirements, prevention of schedule slippages, and life cycle cost savings. It is shown how configuration management can influence the benefits attained through disciplined approaches and innovations that compel consideration of all the technical elements of engineering and quality factors that apply to the program development, transition to operations and in operations. Configuration management experiences involving the Space Station program's tiered management structure, the work package contractors, international partners, and the participating NASA centers are discussed.

  6. A phase-space approach to atmospheric dynamics based on observational data. Theory and applications

    Wang Risheng.


    This thesis is an attempt to systematically develop a phase-space approach to atmospheric dynamics based on the theoretical achievements and application experience of nonlinear time-series analysis. In particular, it is concerned with the derivation of quantities describing the geometrical structure of the observed dynamics in phase space (dimension estimation) and with the examination of observed atmospheric fluctuations in the light of phase-space representation. The thesis is therefore composed of three major parts: a general survey of the theory of statistical approaches to dynamic systems, the methodology designed for the present study, and specific applications with respect to dimension estimation and to a phase-space analysis of the tropical stratospheric quasi-biennial oscillation. (orig./KW)

  7. Application of the Quality by Design Approach to the Freezing Step of Freeze-Drying: Building the Design Space.

    Arsiccio, Andrea; Pisano, Roberto


    The present work shows a rational method for the development of the freezing step of a freeze-drying cycle. The current approach to the selection of freezing conditions is still empirical and nonsystematic, thus resulting in poor robustness of control strategy. The final aim of this work is to fill this gap, describing a rational procedure, based on mathematical modeling, for properly choosing the freezing conditions. Mechanistic models are used for the prediction of temperature profiles during freezing and dimension of ice crystals being formed. Mathematical description of the drying phase of freeze-drying is also coupled with the results obtained by freezing models, thus providing a comprehensive characterization of the lyophilization process. In this framework, deep understanding of the phenomena involved is required, and according to the Quality by Design approach, this knowledge can be used to build the design space. The step-by-step procedure for building the design space for freezing is thus described, and examples of applications are provided. The calculated design space is validated upon experimental data, and we show that it allows easy control of the freezing process and fast selection of appropriate operating conditions. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  8. Symbols, spaces and materiality: a transmission-based approach to Aegean Bronze Age ritual.

    Briault, C.


    This thesis explores the transmission of ritual practices in the second millennium BC Aegean. In contrast to previous approaches, which often overlook gaps in the diachronic record, emphasising continuity in cult practice over very long timescales, it is argued here that through charting the spatial and temporal distributions of three broad material types (cult symbols, spaces and objects), it is possible to document the spread of cult practice over time and space, and, crucially, to monitor ...

  9. Final Report: 03-LW-005 Space-Time Secure Communications for Hostile Environments

    Candy, J V; Poggio, A J; Chambers, D H; Guidry, B L; Robbins, C L; Hertzog, C A; Dowla, F; Burke, G; Kane, R


    The development of communications for highly reverberative environments is a major concern for both the private and military sectors, whether the application is securing a stock order or tracking hostiles in a tunnel or cave. Such environments range from a hostile urban setting populated with a multitude of buildings and vehicles, to the simple complexity of the many sound sources common on a stock exchange floor, to military operations amid topographic features such as hills, valleys, and mountains, or even a maze of buried water pipes attempting to transmit information about chemical anomalies in the water system serving a city or town. These inherent obstructions cause transmitted signals to reflect, refract, and disperse in a multitude of directions, distorting both their shape and their arrival times at network receiver locations. Imagine troops on a mission attempting to communicate in underground caves consisting of a maze of chambers causing multiple echoes, with the platoon leader trying to issue timely commands to neutralize terrorists. This is the problem of transmitting information in a complex environment: waves are susceptible to the multiple paths and distortions created by whatever obstructions exist in the propagation medium. This is precisely the communications problem we solve, using the physics of wave propagation not only to mitigate the noxious effects created by the hostile medium, but also to utilize it in a constructive manner, enabling a huge benefit in communications. We employ time-reversal (T/R) communications to accomplish this task. This project is concerned with the development of secure communications techniques that can operate even in the most extreme conditions while maintaining a secure link between host and client stations. We developed an approach based on the concept of time-reversal (T/R) signal processing. In fact, the development of T/R communication
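
    The central idea of T/R communications — retransmitting a time-reversed version of the received signal so that the multipath channel acts as its own matched filter, refocusing energy at the source — can be illustrated in a few lines. This is a toy sketch with a synthetic random channel, not the project's actual implementation:

    ```python
    import numpy as np

    # time-reversal focusing in miniature: a pilot traverses a multipath
    # channel h; the receiver sends back the time-reversed waveform, which
    # the same channel then compresses into a sharp focus at the source.
    rng = np.random.default_rng(1)
    h = rng.standard_normal(64)      # hypothetical multipath impulse response
    probe = h[::-1]                  # time-reversed pilot retransmitted
    focused = np.convolve(probe, h)  # signal arriving back at the source
    peak = int(np.argmax(np.abs(focused)))
    # energy concentrates at the center tap, the zero-lag autocorrelation
    ```

    The focused peak equals the channel's total energy (the zero-lag autocorrelation), which is why the multipath "distortion" is turned to advantage rather than merely mitigated.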

  10. Solar chimney: A sustainable approach for ventilation and building space conditioning

    Lal, S.,


    The demand for residential and commercial buildings increases with a rapidly growing population, leading to vertical growth of buildings that requires proper ventilation and day-lighting. Natural air ventilation does not work effectively in conventional structures, so fans and air conditioners become mandatory for proper ventilation and space conditioning. Globally, the building sector is the largest consumer of energy, most of it spent on heating, ventilation and space conditioning. This load can be reduced by applying solar chimneys and integrated approaches in buildings for heating, ventilation and space conditioning, a sustainable approach for these applications. The authors review the concept, various methods of evaluation, modeling, the performance of solar chimney variables, applications and integrated approaches.

  11. The Faster, Better, Cheaper Approach to Space Missions: An Engineering Management Assessment

    Hamaker, Joe


    This paper describes, in viewgraph form, the faster, better, cheaper approach to space missions. The topics include: 1) What drives "Faster, Better, Cheaper"? 2) Why Space Programs are Costly; 3) Background; 4) Aerospace Project Management (Old Culture); 5) Aerospace Project Management (New Culture); 6) Scope of Analysis Limited to Engineering Management Culture; 7) Qualitative Analysis; 8) Some Basic Principles of the New Culture; 9) Cause and Effect; 10) "New Ways of Doing Business" Survey Results; 11) Quantitative Analysis; 12) Recent Space System Cost Trends; 13) Spacecraft Dry Weight Trend; 14) Complexity Factor Trends; 15) Cost Normalization; 16) Cost Normalization Algorithm; 17) Unnormalized Cost vs. Normalized Cost; and 18) Concluding Observations.

  12. Lateral skull base approaches in the management of benign parapharyngeal space tumors.

    Prasad, Sampath Chandra; Piccirillo, Enrico; Chovanec, Martin; La Melia, Claudio; De Donato, Giuseppe; Sanna, Mario


    To evaluate the role of lateral skull base approaches in the management of benign parapharyngeal space tumors and to propose an algorithm for their surgical approach. Retrospective study of patients with benign parapharyngeal space tumors. The clinical features, radiology and preoperative management of skull base neurovasculature, the surgical approaches and overall results were recorded. 46 patients presented with 48 tumors: 12 were prestyloid and 36 poststyloid. 19 (39.6%) tumors were paragangliomas, 15 (31.25%) were schwannomas and 11 (23%) were pleomorphic adenomas. Preoperative embolization was performed in 19 patients, stenting of the internal carotid artery in 4 and permanent balloon occlusion in 2. 19 tumors were approached by the transcervical approach, 13 by the transcervical-transparotid, 5 by the transcervical-transmastoid, and 6, 1 and 2 tumors by the infratemporal fossa approach types A, B and D, respectively. Total radical tumor removal was achieved in 46 (96%) of the cases. Lateral skull base approaches have an advantage over other approaches in the management of benign tumors of the parapharyngeal space because they provide excellent exposure with less morbidity. The use of the microscope combined with bipolar cautery reduces morbidity. Stenting of the internal carotid artery gives a chance for complete tumor removal with arterial preservation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. The third spatial dimension risk approach for individual risk and group risk in multiple use of space

    Suddle, Shahid; Ale, Ben


    Buildings above roads and railways are examples of multiple use of space. Safety is one of the critical issues for such projects. Risk analyses can be undertaken to investigate what safety measures are required to realise these projects, and the results can be compared to risk acceptance criteria where these are applicable. In The Netherlands, there are explicit criteria for the acceptability of individual risk and societal risk. Traditionally, calculations of individual risk result in contours of equal risk on a map and are thus considered in two-dimensional space only. However, when different functions are layered, the third spatial dimension, height, becomes an important parameter: the various activities and structures above and below each other impose mutual risks. There are no explicit norms or policies on how to deal with the individual or group risk approach in the third dimension. This paper proposes an approach for these problems and gives some examples. Finally, the third-dimension risk approach is applied in a case study of Bos en Lommer, Amsterdam

  14. An Autonomous Sensor Tasking Approach for Large Scale Space Object Cataloging

    Linares, R.; Furfaro, R.

    The field of Space Situational Awareness (SSA) has progressed over the last few decades with new sensors coming online, the development of new approaches for making observations, and new algorithms for processing them. Although there has been success in the development of new approaches, a missing piece is the translation of SSA goals into sensor and resource allocation, otherwise known as the Sensor Management Problem (SMP). This work solves the SMP using an artificial intelligence approach called Deep Reinforcement Learning (DRL). Stable methods for training DRL approaches based on neural networks exist, but most are not suitable for high-dimensional systems. The Asynchronous Advantage Actor-Critic (A3C) method is a recently developed and effective approach for high-dimensional systems, and this work leverages these results and applies the approach to decision making in SSA. The decision space for SSA problems can be high dimensional, even for the tasking of a single telescope: since the number of SOs in space is relatively high, each sensor has a large number of possible actions at a given time. Therefore, efficient DRL approaches are required when solving the SMP for SSA. This work develops an A3C-based DRL method for SSA sensor tasking. One of the key benefits of DRL approaches is the ability to handle high-dimensional data; DRL methods have, for example, been applied to image processing for autonomous cars, where a 256x256 RGB image constitutes 196,608 input values (256*256*3), and deep learning approaches routinely take such images as inputs. Applied to the whole catalog, the DRL approach therefore offers the ability to solve this high-dimensional problem. This work has the potential to, for the first time, solve the non-myopic sensor tasking problem for the whole SO catalog (over 22,000 objects), providing a truly revolutionary result.
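
    The tasking step itself — turning per-object scores into a single pointing decision — can be sketched as a stochastic policy. This is an illustrative sketch only, not the authors' A3C implementation; the score semantics and function names are hypothetical (a trained actor network would supply the scores):

    ```python
    import numpy as np

    def softmax(x):
        """Numerically stable softmax over a score vector."""
        z = np.asarray(x, dtype=float)
        e = np.exp(z - z.max())
        return e / e.sum()

    def task_sensor(priority_scores, temperature=1.0, rng=None):
        """Sample one space object to observe from per-object priority
        scores (e.g. covariance growth, time since last observation)."""
        rng = rng if rng is not None else np.random.default_rng(0)
        probs = softmax(np.asarray(priority_scores, dtype=float) / temperature)
        action = int(rng.choice(len(probs), p=probs))
        return action, probs
    ```

    Sampling (rather than always taking the argmax) keeps some exploration in the tasking schedule, which matters when object priorities are themselves uncertain.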

  15. Urban Multisensory Laboratory, AN Approach to Model Urban Space Human Perception

    González, T.; Sol, D.; Saenz, J.; Clavijo, D.; García, H.


    An urban sensory lab (USL, or LUS by its Spanish acronym) is a new, avant-garde approach to studying and analyzing a city. The construction of this approach allows the development of new methodologies to identify the emotional response of public space users. The laboratory combines qualitative analysis proposed by urbanists with quantitative measures managed by data analysis applications. USL is a new approach that goes beyond the borders of urban knowledge. The design thinking strategy allows us to implement methods to understand the results provided by our technique. In this first approach, the interpretation is made by hand; our goal, however, is to combine design thinking and machine learning in order to analyze the qualitative and quantitative data automatically. The results are now being used by students in the Urbanism and Architecture courses to gain a better understanding of public spaces in Puebla, Mexico, and their interaction with people.

  16. Comparing Laser Interferometry and Atom Interferometry Approaches to Space-Based Gravitational-Wave Measurement

    Baker, John; Thorpe, Ira


    Thoroughly studied classic space-based gravitational-wave mission concepts such as the Laser Interferometer Space Antenna (LISA) are based on laser-interferometry techniques. Ongoing developments in atom-interferometry techniques have spurred recently proposed alternative mission concepts. These different approaches can be understood on a common footing. We present a comparative analysis of how each type of instrument responds to some of the noise sources that may limit gravitational-wave mission concepts. Sensitivity to laser frequency instability is essentially the same for either approach. Spacecraft acceleration reference stability sensitivities differ, allowing smaller spacecraft separations in the atom-interferometry approach, but acceleration noise requirements are nonetheless similar. Each approach has distinct additional measurement noise issues.

  17. State-space approach for evaluating the soil-plant-atmosphere system

    Timm, L.C.; Reichardt, K.; Cassaro, F.A.M.; Tominaga, T.T.; Bacchi, O.O.S.; Oliveira, J.C.M.; Dourado-Neto, D.


    Using as examples one sugarcane and one forage oat experiment, both carried out in the State of Sao Paulo, Brazil, this chapter presents recent state-space approaches used to evaluate the relation between soil and plant properties. A contrast is made between classical statistical methodologies, which do not take into account the sampling position coordinates, and the more recently used methodologies, which include the position coordinates and allow a better interpretation of the field-sampled data. Classical concepts are first introduced, followed by spatially referenced methodologies like the autocorrelation function, the cross-correlation function, and the state-space approach. Two variations of the state-space approach are given: one emphasizes the evolution of the state system, while the other, based on the Bayesian formulation, emphasizes the evolution of the estimated observations. It is concluded that these state-space analyses using dynamic regression models improve data analyses and are therefore recommended for analyzing time and space data series related to the performance of a given soil-plant-atmosphere system. (author)
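
    The autocorrelation and cross-correlation functions mentioned above can be estimated directly from a sampled transect. A minimal sketch, assuming equally spaced samples and a non-negative lag (generic estimators, not the chapter's own code):

    ```python
    import numpy as np

    def autocorr(x, lag):
        """Sample autocorrelation of series x at a non-negative lag."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        return float(np.dot(x[:n - lag], x[lag:]) / np.dot(x, x))

    def crosscorr(x, y, lag):
        """Sample cross-correlation between series x and y at lag >= 0."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        y = np.asarray(y, dtype=float) - np.mean(y)
        n = len(x)
        denom = np.sqrt(np.dot(x, x) * np.dot(y, y))
        return float(np.dot(x[:n - lag], y[lag:]) / denom)
    ```

    Plotting these against the lag distance is what reveals the spatial dependence that classical (position-free) statistics ignores.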

  18. Approaching control for tethered space robot based on disturbance observer using super twisting law

    Hu, Yongxin; Huang, Panfeng; Meng, Zhongjie; Wang, Dongke; Lu, Yingbo


    Approaching control is a key mission for the tethered space robot (TSR) performing the task of removing space debris, but uncertainties in the TSR, such as changes in model parameters, have an important effect on the approaching mission. Considering the space tether and the attitude of the gripper, the dynamic model of the TSR is derived using the Lagrange method. A disturbance observer is then designed to estimate the uncertainty based on the super-twisting (STW) control method. Using the disturbance observer, a controller is designed, and its performance is compared with that of a dynamic-inverse controller; the proposed controller performs better. Numerical simulation validates the feasibility of the proposed controller for position and attitude tracking of the TSR.

  19. An Effective Approach Control Scheme for the Tethered Space Robot System

    Zhongjie Meng


    The tethered space robot system (TSR), which is composed of a platform, a gripper and a space tether, has great potential in future space missions. Given the relative motion among the platform, tether, gripper and target, an integrated approach model is derived. Then, a novel coordinated approach control scheme is presented, in which the tether tension, thrusters and the reaction wheel are all utilized. It comprises open-loop trajectory optimization, feedback trajectory control and attitude control. The numerical simulation results show that the rendezvous between the TSR and the target can be realized by the proposed coordinated control scheme, and that propellant consumption is efficiently reduced. Moreover, the control scheme performs well in the presence of initial-state perturbations, actuator characteristics and sensor errors.

  20. Activity markers and household space in Swahili urban contexts: An integrated geoarchaeological approach

    Wynne-Jones, Stephanie; Sulas, Federica

    , this paper draws from recent work at a Swahili urban site to illustrate the potential and challenges of an integrated geoarchaeological approach to the study of household space. The site of Songo Mnara (14th–16thc. AD) thrived as a Swahili stonetown off the coast of Tanzania. Here, our work has concentrated...

  1. Learning in Earth and Space Science: A Review of Conceptual Change Instructional Approaches

    Mills, Reece; Tomas, Louisa; Lewthwaite, Brian


    In response to calls for research into effective instruction in the Earth and space sciences, and to identify directions for future research, this systematic review of the literature explores research into instructional approaches designed to facilitate conceptual change. In total, 52 studies were identified and analyzed. Analysis focused on the…

  2. Concept of Draft International Standard for a Unified Approach to Space Program Quality Assurance

    Stryzhak, Y.; Vasilina, V.; Kurbatov, V.


    In the absence of a unified approach to guaranteed space project and product quality assurance, implementation of many international space programs has become a challenge. The globalization of the aerospace industry, and the participation of various international ventures with diverse quality assurance requirements in big international space programs, urgently requires the generation of unified international standards in this field. To ensure successful fulfillment of space missions, aerospace companies should design and process reliable and safe products with properties complying with, or bettering, the user's (or customer's) requirements. The quality of products designed or processed by subcontractors (or other suppliers) should also comply with the main user's (customer's) requirements. Implementation of this involved set of unified requirements will be made possible by creating and approving a system (series) of international standards under the generic title Space Product Quality Assurance, based on a system consensus principle.
Conceptual features of the baseline standard in this system (series) should comprise: - Procedures for ISO 9000, CEN and ECSS requirements adaptation and introduction into space product creation, design, manufacture, testing and operation; - Procedures for quality assurance at initial (design) phases of space programs, with a decision on the end product made based on the principle of independence; - Procedures to arrange incoming inspection of products delivered by subcontractors (including testing, audit of supplier's procedures, review of supplier's documentation), and space product certification; - Procedures to identify materials and primary products applied; - Procedures for quality system audit at the component part, primary product and materials supplier facilities; - Unified procedures to form a list of basic performances to be under configuration management; - Unified procedures to form a list of critical space product components, and unified

  3. An integrated mission approach to the space exploration initiative will ensure success

    Coomes, E.P.; Dagle, J.E.; Bamberger, J.A.; Noffsinger, K.E.


    The direction of the American space program, as defined by President Bush and the National Commission on Space, is to expand human presence into the solar system. Landing an American on Mars by the 50th anniversary of the Apollo 11 lunar landing is the goal. This challenge has produced a level of excitement among young Americans not seen for nearly three decades. The exploration and settlement of the space frontier will occupy the creative thoughts and energies of generations of Americans well into the next century. The return of Americans to the moon and beyond must be viewed as a national effort with strong public support if it is to become a reality. Key to making this an actuality is the mission approach selected. Developing a permanent presence in space requires a continual stepping outward from Earth in a logical, progressive manner. If we seriously plan to go and to stay, then not only must we plan what we are to do and how we are to do it, we must address the logistic support infrastructure that will allow us to stay there once we arrive. A fully integrated approach to mission planning is needed if the Space Exploration Initiative (SEI) is to be successful. Only in this way can a permanent human presence in space be sustained. An integrated infrastructure approach would reduce the number of new systems and technologies requiring development. The resultant horizontal commonality of systems and hardware would reduce the direct economic impact of SEI, while an early return on investment through technology spin-offs would be an economic benefit, greatly enhancing our international technical competitiveness. If the exploration, development, and colonization of space is to be affordable and acceptable, careful consideration must be given to such things as ''return on investment'' and ''commercial product potential'' of the technologies developed

  4. State space model extraction of thermohydraulic systems – Part I: A linear graph approach

    Uren, K.R.; Schoor, G. van


    Thermohydraulic simulation codes are increasingly making use of graphical design interfaces. The user can quickly and easily design a thermohydraulic system by placing symbols on the screen resembling system components, which can then be connected to form a system representation. Such system models may then be used to obtain detailed simulations of the physical system. Usually, however, such simulation models are too complex and not ideal for control system design, so a need exists for automated techniques to extract lumped-parameter models useful for control system design. The goal of this first paper, in a two-part series, is to propose a method that utilises a graphical representation of a thermohydraulic system, and a lumped-parameter modelling approach, to extract state space models. In this methodology each physical domain of the thermohydraulic system is represented by a linear graph. These linear graphs capture the interaction between all components within and across energy domains – hydraulic, thermal and mechanical – and are analysed using a graph-theoretic approach to derive reduced-order state space models. These models capture the dominant dynamics of the thermohydraulic system and are ideal for control system design purposes. The proposed state space model extraction method is demonstrated by considering a U-tube system. A non-linear state space model is extracted representing both the hydraulic and thermal domain dynamics of the system. The simulated state space model is compared with a Flownex® model of the U-tube. Flownex® is a validated systems thermal-fluid simulation software package. - Highlights: • A state space model extraction methodology based on graph-theoretic concepts. • An energy-based approach to consider multi-domain systems in a common framework. • Allow extraction of transparent (white-box) state space models automatically. • Reduced order models containing only independent state
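
    Once extracted, a state space model of this kind is simply a set of matrices that can be simulated directly. A minimal generic sketch (the matrices below are arbitrary illustrative values, not the U-tube model from the paper):

    ```python
    import numpy as np

    # continuous-time state-space form: dx/dt = A x + B u,  y = C x
    A = np.array([[-0.5, 0.2],
                  [0.1, -0.3]])   # state dynamics (stable, two states)
    B = np.array([[1.0], [0.0]])  # input enters through the first state
    C = np.array([[0.0, 1.0]])    # output is the second state

    def simulate(x0, u, dt, steps):
        """Forward-Euler integration of the state-space model under a
        constant input u; returns the output trajectory."""
        x = np.array(x0, dtype=float)
        ys = []
        for _ in range(steps):
            x = x + dt * (A @ x + B @ np.atleast_1d(u))
            ys.append(float(C @ x))
        return ys
    ```

    A reduced-order model like this, with only a handful of states, is what makes classical control design (pole placement, LQR) tractable compared with the full simulation code.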

  5. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.


    The State Space modelling approach has recently been proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study in which the effects of spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.

  6. Swamp Works: A New Approach to Develop Space Mining and Resource Extraction Technologies at the National Aeronautics Space Administration (NASA) Kennedy Space Center (KSC)

    Mueller, R. P.; Sibille, L.; Leucht, K.; Smith, J. D.; Townsend, I. I.; Nick, A. J.; Schuler, J. M.


    environment and methodology, with associated laboratories, that uses lean development methods and creativity-enhancing processes to invent and develop new solutions for space exploration. This paper will discuss the Swamp Works approach to developing space mining and resource extraction systems and the vision of space development it serves. The ultimate goal of the Swamp Works is to expand human civilization into the solar system through the use of local resources. By mining and using local resources in situ, it is conceivable that one day the logistics supply train from Earth can be eliminated, enabling the Earth independence of a space-based community.

  7. Final Report from The University of Texas at Austin for DEGAS: Dynamic Global Address Space programming environments

    Erez, Mattan


    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges: - Programmability: a rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. - Scalability: hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). - Performance portability: just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). - Energy efficiency: communication-optimal code generation to optimize energy efficiency by reducing data movement. - Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms. - Interoperability: runtime and language interoperability with MPI and OpenMP to encourage broad adoption.

  8. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M


    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remain unchanged, through new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, which means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, allowing flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation step of a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated into a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole-cell vaccine. The model-development approach described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
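
    A DoE of the kind described starts from a factorial grid over the critical process parameters. A minimal sketch — the parameter names and levels below are hypothetical, not those of the actual cultivation study:

    ```python
    from itertools import product

    def full_factorial(factors):
        """Enumerate all runs of a full-factorial design.
        factors: dict mapping factor name -> list of levels."""
        names = list(factors)
        return [dict(zip(names, combo)) for combo in product(*factors.values())]

    # hypothetical critical process parameters for a cultivation step
    runs = full_factorial({"temperature_C": [33, 35, 37],
                           "pH": [6.8, 7.0, 7.2]})
    # 3 x 3 = 9 runs spanning the candidate design space
    ```

    Executing the runs and checking which still meet product specifications is what delimits the design space; fractional or response-surface designs reduce the run count when the grid grows.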

  9. A state space approach for the eigenvalue problem of marine risers

    Alfosail, Feras


    A numerical state-space approach is proposed to examine the natural frequencies and critical buckling limits of marine risers. A large axial tension in the riser model causes numerical limitations. These limitations are overcome by using the modified Gram–Schmidt orthonormalization process as an intermediate step during the numerical integration process with the fourth-order Runge–Kutta scheme. The obtained results are validated against those obtained with other numerical methods, such as the finite-element, Galerkin, and power-series methods, and are found to be in good agreement. The state-space approach is shown to be computationally more efficient than the other methods. Also, we investigate the effect of a high applied tension, a high apparent weight, and higher-order modes on the accuracy of the numerical scheme. We demonstrate that, by applying the orthonormalization process, the stability and convergence of the approach are significantly improved.
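
    The modified Gram–Schmidt step used between integration substeps can be sketched as follows. This is a generic implementation of the orthonormalization itself, not the authors' riser code:

    ```python
    import numpy as np

    def modified_gram_schmidt(V):
        """Orthonormalize the columns of V using modified Gram-Schmidt.
        Re-orthonormalizing the solution vectors like this keeps them
        from collapsing onto the dominant growth direction as the large
        axial tension amplifies numerical errors during integration."""
        Q = np.array(V, dtype=float)
        n = Q.shape[1]
        for j in range(n):
            Q[:, j] /= np.linalg.norm(Q[:, j])
            for k in range(j + 1, n):
                Q[:, k] -= np.dot(Q[:, j], Q[:, k]) * Q[:, j]
        return Q
    ```

    Modified (column-by-column) Gram–Schmidt is preferred over the classical variant here because it is markedly more stable in floating point, which is the whole point of inserting it into the integration loop.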

  10. A Declarative Design Approach to Modeling Traditional and Non-Traditional Space Systems

    Hoag, Lucy M.

    The space system design process is known to be laborious, complex, and computationally demanding. It is highly multi-disciplinary, involving several interdependent subsystems that must be both highly optimized and reliable due to the high cost of launch. Satellites must also be capable of operating in harsh and unpredictable environments, so integrating high-fidelity analysis is important. To address each of these concerns, a holistic design approach is necessary. However, while the sophistication of space systems has evolved significantly in the last 60 years, improvements in the design process have been comparatively stagnant. Space systems continue to be designed using a procedural, subsystem-by-subsystem approach. This method is inadequate since it generally requires extensive iteration and limited or heuristic-based search, which can be slow, labor-intensive, and inaccurate. The use of a declarative design approach can potentially address these inadequacies. In the declarative programming style, the focus of a problem is placed on what the objective is, and not necessarily how it should be achieved. In the context of design, this entails knowledge expressed as a declaration of statements that are true about the desired artifact instead of explicit instructions on how to implement it. A well-known technique is through constraint-based reasoning, where a design problem is represented as a network of rules and constraints that are reasoned across by a solver to dynamically discover the optimal candidate(s). This enables implicit instantiation of the tradespace and allows for automatic generation of all feasible design candidates. As such, this approach also appears to be well-suited to modeling adaptable space systems, which generally have large tradespaces and possess configurations that are not well-known a priori. This research applied a declarative design approach to holistic satellite design and to tradespace exploration for adaptable space systems. The
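
    In miniature, constraint-based reasoning over a tradespace amounts to declaring what must be true of the artifact and letting a solver (here, brute-force filtering) discover the feasible candidates. All variables, levels, and constraint values below are hypothetical, chosen only to illustrate the declarative style:

    ```python
    from itertools import product

    # hypothetical design variables and their candidate levels
    options = {
        "solar_array_m2": [1.0, 2.0, 4.0],
        "battery_Wh": [200, 400, 800],
        "payload_W": [50, 100],
    }

    def is_feasible(design):
        """Declarative statements about the artifact, not build steps."""
        generated_W = 100.0 * design["solar_array_m2"]  # assumed 100 W/m^2
        power_ok = generated_W >= design["payload_W"] + 40.0   # 40 W bus margin
        energy_ok = design["battery_Wh"] >= 4.0 * design["payload_W"]  # eclipse
        return power_ok and energy_ok

    designs = [dict(zip(options, combo)) for combo in product(*options.values())]
    candidates = [d for d in designs if is_feasible(d)]
    ```

    A real constraint solver prunes the tradespace instead of enumerating it, but the programming model is the same: the designer states what holds, not how to iterate subsystem by subsystem.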

  11. Collaborative Approaches in Developing Environmental and Safety Management Systems for Commercial Space Transportation

    Zee, Stacey; Murray, D.


    The Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST) licenses and permits U.S. commercial space launch and reentry activities, and licenses the operation of non-federal launch and reentry sites. AST's mission is to ensure the protection of the public, property, and the national security and foreign policy interests of the United States during commercial space transportation activities, and to encourage, facilitate, and promote U.S. commercial space transportation. AST faces the unique challenge of ensuring the protection of public health and safety while facilitating and promoting U.S. commercial space transportation. AST has developed an Environmental Management System (EMS) and a Safety Management System (SMS) to help meet its mission. Although the EMS and SMS were developed independently, the systems share similar elements. Both follow a Plan-Do-Check-Act model in identifying potential environmental aspects or public safety hazards, assessing significance in terms of severity and likelihood of occurrence, developing approaches to reduce risk, and verifying that the risk is reduced. This paper describes the similarities between AST's EMS and SMS elements and how AST is building a collaborative approach in environmental and safety management to reduce impacts to the environment and risks to the public.

  12. A real-space stochastic density matrix approach for density functional electronic structure.

    Beck, Thomas L


    The recent development of real-space grid methods has led to more efficient, accurate, and adaptable approaches for large-scale electrostatics and density functional electronic structure modeling. With the incorporation of multiscale techniques, linear-scaling real-space solvers are possible for density functional problems if localized orbitals are used to represent the Kohn-Sham energy functional. These methods still suffer from high computational and storage overheads, however, due to extensive matrix operations related to the underlying wave function grid representation. In this paper, an alternative stochastic method is outlined that aims to solve directly for the one-electron density matrix in real space. In order to illustrate aspects of the method, model calculations are performed for simple one-dimensional problems that display some features of the more general problem, such as spatial nodes in the density matrix. This orbital-free approach may prove helpful considering a future involving increasingly parallel computing architectures. Its primary advantage is the near-locality of the random walks, allowing for simultaneous updates of the density matrix in different regions of space partitioned across the processors. In addition, it allows for testing and enforcement of the particle number and idempotency constraints through stabilization of a Feynman-Kac functional integral as opposed to the extensive matrix operations in traditional approaches.

  13. About One Approach to Determine the Weights of the State Space Method

    I. K. Romanova


    The article studies methods of determining weight coefficients, also called criteria importance coefficients, in multiobjective optimization (MOO). These coefficients are assumed to indicate the degree to which each individual criterion influences the final selection (the final or summary assessment): the larger the coefficient, the greater the contribution of the corresponding criterion. Modern decision-support information systems employ a number of methods for determining the relative importance of criteria, among them the utility method, the weighted power mean, the weighted median, matching of clustered rankings, and pairwise comparison of importance. However, the availability of different techniques for calculating weights does not eliminate the main problem of multicriteria optimization, namely the inconsistency of individual criteria. The foundation for solving multicriteria problems is the fundamental principle of multicriteria selection, the Edgeworth-Pareto principle. Despite the large number of methods for determining weights, the task remains relevant, not only because of the subjectivity of evaluations but also for mathematical reasons. It is now recognized that, for example, the popular linear convolution of individual criteria is essentially a heuristic approach, and applying it may fail to produce the best final choice; Karlin's lemma reflects the limits of the method's applicability. The aim of this work is to propose a method for calculating the weights in the optimization of a dynamic system whose quality is determined by a criterion of a special type, namely an integral quadratic quality criterion.
    The main development relates to the state-space method, which the literature also calls analytical design of optimal controllers. Despite the
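    The claim that linear convolution of criteria can miss the best choice can be made concrete with a three-alternative toy example (all data invented): on a non-convex Pareto front, a Pareto-optimal alternative may never be selected by any weighted sum.

```python
# Three alternatives scored on two criteria to be minimized (invented data).
alternatives = {"A": (0.0, 1.0), "B": (1.0, 0.0), "C": (0.6, 0.6)}

def dominates(p, q):
    """p Pareto-dominates q: no worse in every criterion, better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

pareto = {n for n, p in alternatives.items()
          if not any(dominates(q, p) for m, q in alternatives.items() if m != n)}
print(pareto)  # all three alternatives are Pareto-optimal

# Sweep the weight vector (w, 1 - w) over a fine grid and record which
# alternative the weighted sum ("linear convolution") selects.
winners = set()
for i in range(101):
    w = i / 100
    winners.add(min(alternatives,
                    key=lambda n: w * alternatives[n][0] + (1 - w) * alternatives[n][1]))
print(winners)  # "C" is Pareto-optimal yet never chosen by any weight vector
```

    This is exactly the heuristic failure mode noted in the abstract: the scalarization can only reach points on the convex hull of the front.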

  14. Residential and commercial space heating and cooling with possible greenhouse operation; Baca Grande development, San Luis Valley, Colorado. Final report

    Goering, S.W.; Garing, K.L.; Coury, G.E.; Fritzler, E.A.


    A feasibility study was performed to evaluate the potential of multipurpose applications of moderate-temperature geothermal waters in the vicinity of the Baca Grande community development in the San Luis Valley, Colorado. The project resource assessment, based on a thorough review of existing data, indicates that a substantial resource likely exists in the Baca Grande region capable of supporting residential and light industrial activity. Engineering designs were developed for geothermal district heating systems for space heating and domestic hot water heating for residences, including a mobile home park, an existing motel, a greenhouse complex, and other small commercial uses such as aquaculture. In addition, a thorough institutional analysis of the study area was performed to highlight factors which might pose barriers to the ultimate commercial development of the resource. Finally, an environmental evaluation of the possible impacts of the proposed action was also performed. The feasibility evaluation indicates the economics of the residential areas are dependent on the continued rate of housing construction. If essentially complete development could occur over a 30-year period, the economics are favorable as compared to existing alternatives. For the commercial area, the economics are good as compared to existing conventional energy sources. This is especially true as related to proposed greenhouse operations. The institutional and environmental analyses indicate that no significant barriers to development are apparent.

  15. Space-time trajectories of wind power generation: Parameterized precision matrices under a Gaussian copula approach

    Tastu, Julija; Pinson, Pierre; Madsen, Henrik


    Emphasis is placed on generating space-time trajectories of wind power generation, consisting of paths sampled from high-dimensional joint predictive densities, describing wind power generation at a number of contiguous locations and successive lead times. A modelling approach taking advantage … -correlations. Estimation is performed in a maximum likelihood framework. Based on a test case application in Denmark, with spatial dependencies over 15 areas and temporal ones for 43 hourly lead times (hence, for a dimension of n = 645), it is shown that accounting for space-time effects is crucial for generating skilful …
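    The trajectory-sampling idea can be sketched with a minimal Gaussian copula: draw correlated latent Gaussians, map them to uniforms, then push them through the marginal inverse CDFs. The dimensions, the separable exponential correlation (a stand-in for the paper's parameterized precision matrices), and the toy power margin below are all invented for illustration.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

# Hypothetical setup: 3 sites x 4 lead times -> joint dimension n = 12.
n_sites, n_lead = 3, 4
n = n_sites * n_lead

# Separable exponential correlation in space and time (illustrative stand-in
# for a parameterized space-time precision matrix).
rho_s, rho_t = 0.8, 0.9
S = rho_s ** np.abs(np.subtract.outer(np.arange(n_sites), np.arange(n_sites)))
T = rho_t ** np.abs(np.subtract.outer(np.arange(n_lead), np.arange(n_lead)))
Sigma = np.kron(S, T)                      # n x n space-time correlation matrix

# Gaussian copula sampling: correlated latent Gaussians -> uniforms via the
# standard normal CDF -> marginal inverse CDF (here a toy sqrt margin).
L = np.linalg.cholesky(Sigma)
z = L @ rng.standard_normal(n)
u = np.array([0.5 * (1.0 + erf(x / np.sqrt(2.0))) for x in z])  # Phi(z)
trajectory = u ** 0.5                      # inverse CDF of F(x) = x^2 on [0, 1]
print(trajectory.reshape(n_sites, n_lead))
```

    One such draw is a single space-time trajectory; repeating the draw yields an ensemble of paths consistent with both the margins and the space-time dependence.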

  16. Innovative Approaches to Space-Based Manufacturing and Rapid Prototyping of Composite Materials

    Hill, Charles S.


    The ability to deploy large habitable structures, construct, and service exploration vehicles in low earth orbit will be an enabling capability for continued human exploration of the solar system. It is evident that advanced manufacturing methods to fabricate replacement parts and re-utilize launch vehicle structural mass by converting it to different uses will be necessary to minimize costs and allow flexibility to remote crews engaged in space travel. Recent conceptual developments and the combination of inter-related approaches to low-cost manufacturing of composite materials and structures are described in context leading to the possibility of on-orbit and space-based manufacturing.


    Yurij F. Telnov


    This article sets out principles for the semantic structuring of an information and educational space of knowledge objects and scientific and educational services, using methods of ontological engineering. The novelty of the proposed approach lies in interfacing a content ontology with an ontology of scientific and educational services, which allows effective composition of services and knowledge objects according to models of professional competences and the requirements of learners. Applying these methods of semantic structuring enables educational institutions to make integrated use of diverse, distributed scientific and educational content for research, methodological development, and training.

  18. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    Bose, Benjamin; Koyama, Kazuya, E-mail:, E-mail: [Institute of Cosmology and Gravitation, University of Portsmouth, Burnaby Road, Portsmouth, Hampshire, PO1 3FX (United Kingdom)


    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and a Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model, which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with ≤ 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h ≤ s ≤ 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high-quality data.
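    The Fourier transform from a power spectrum to the real-space correlation function used above can be sketched for the monopole, with a toy P(k) rather than the RegPT 1-loop spectrum of the paper: ξ(s) = (1/2π²) ∫ dk k² P(k) j₀(ks), with j₀(x) = sin(x)/x.

```python
import numpy as np

# Toy power spectrum with a turnover (illustrative only; units h/Mpc and
# (Mpc/h)^3 are nominal).
k = np.linspace(1e-4, 10.0, 20000)            # wavenumber grid
P = 1e4 * k / (1.0 + (k / 0.1) ** 3)

def xi(s):
    """Monopole correlation function at separation s (Mpc/h), trapezoid rule."""
    x = k * s
    f = k ** 2 * P * np.sin(x) / x            # integrand k^2 P(k) j0(ks)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(k)) / (2.0 * np.pi ** 2))

for s in (50.0, 100.0, 180.0):                # separations quoted in the abstract
    print(s, xi(s))
```

    In practice the oscillatory j₀ kernel is handled with FFTLog-style algorithms rather than direct quadrature, but the sketch shows the structure of the transform.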


  20. Astronaut Ross Approaches Assembly Concept for Construction of Erectable Space Structure (ACCESS)


    The crew assigned to the STS-61B mission included Bryan D. O'Connor, pilot; Brewster H. Shaw, commander; Charles D. Walker, payload specialist; mission specialists Jerry L. Ross, Mary L. Cleave, and Sherwood C. Spring; and Rodolfo Neri Vela, payload specialist. Launched aboard the Space Shuttle Atlantis November 28, 1985 at 7:29:00 pm (EST), the STS-61B mission's primary payload included three communications satellites: MORELOS-B (Mexico); AUSSAT-2 (Australia); and SATCOM KU-2 (RCA Americom). Two experiments were conducted to test assembling erectable structures in space: EASE (Experimental Assembly of Structures in Extravehicular Activity), and ACCESS (Assembly Concept for Construction of Erectable Space Structure). In a joint venture between NASA/Langley Research Center in Hampton, Virginia, and the Marshall Space Flight Center (MSFC), EASE and ACCESS were developed and demonstrated at MSFC's Neutral Buoyancy Simulator (NBS). In this STS-61B onboard photo, astronaut Ross, perched on the Manipulator Foot Restraint (MFR), approaches the erected ACCESS. The primary objective of these experiments was to test the structural assembly concepts for suitability as the framework for larger space structures and to identify ways to improve the productivity of space construction.

  1. Final Report for Harvesting a New Wind Crop: Innovative Economic Approaches for Rural America

    Susan Innis; Randy Udall; Project Officer - Keith Bennett


    Final Report for ''Harvesting a New Wind Crop: Innovative Economic Approaches for Rural America'': This project, ''Harvesting a New Wind Crop'', helped stimulate wind development by rural electric cooperatives and municipal utilities in Colorado. To date most of the wind power development in the United States has been driven by large investor-owned utilities serving major metropolitan areas. To meet the 5% by 2020 goal of the Wind Powering America program the 2,000 municipal and 900 rural electric cooperatives in the country must get involved in wind power development. Public power typically serves rural and suburban areas and can play a role in revitalizing communities by tapping into the economic development potential of wind power. One barrier to the involvement of public power in wind development has been the perception that wind power is more expensive than other generation sources. This project focused on two ways to reduce the costs of wind power to make it more attractive to public power entities. The first way was to develop a revenue stream from the sale of green tags. By selling green tags to entities that voluntarily support wind power, rural coops and munis can effectively reduce their cost of wind power. Western Resource Advocates (WRA) and the Community Office for Resource Efficiency (CORE) worked with Lamar Light and Power and Arkansas River Power Authority to develop a strategy to use green tags to help finance their wind project. These utilities are now selling their green tags to Community Energy, Inc., an independent for-profit marketer who in turn sells the tags to consumers around Colorado. The Lamar tags allow the University of Colorado-Boulder, the City of Boulder, NREL and other businesses to support wind power development and make the claim that they are ''wind-powered''. This urban-rural partnership is an important development for the state of Colorado's rural communities

  2. An Approach to Developing Independent Learning and Non-Technical Skills Amongst Final Year Mining Engineering Students

    Knobbs, C. G.; Grayson, D. J.


    There is mounting evidence to show that engineers need more than technical skills to succeed in industry. This paper describes a curriculum innovation in which so-called "soft" skills, specifically inter-personal and intra-personal skills, were integrated into a final year mining engineering course. The instructional approach was…

  3. Comparison of distal soft-tissue procedures combined with a distal chevron osteotomy for moderate to severe hallux valgus: first web-space versus transarticular approach.

    Park, Yu-Bok; Lee, Keun-Bae; Kim, Sung-Kyu; Seon, Jong-Keun; Lee, Jun-Young


    There are two surgical approaches for distal soft-tissue procedures for the correction of hallux valgus-the dorsal first web-space approach, and the medial transarticular approach. The purpose of this study was to compare the outcomes achieved after use of either of these approaches combined with a distal chevron osteotomy in patients with moderate to severe hallux valgus. One hundred and twenty-two female patients (122 feet) who underwent a distal chevron osteotomy as part of a distal soft-tissue procedure for the treatment of symptomatic unilateral moderate to severe hallux valgus constituted the study cohort. The 122 feet were randomly divided into two groups: namely, a dorsal first web-space approach (group D; sixty feet) and a medial transarticular approach (group M; sixty-two feet). The clinical and radiographic results of the two groups were compared at a mean follow-up time of thirty-eight months. The American Orthopaedic Foot & Ankle Society (AOFAS) hindfoot scale hallux metatarsophalangeal-interphalangeal scores improved from a mean and standard deviation of 55.5 ± 12.8 points preoperatively to 93.5 ± 6.3 points at the final follow-up in group D and from 54.9 ± 12.6 points preoperatively to 93.6 ± 6.2 points at the final follow-up in group M. The mean hallux valgus angle in groups D and M was reduced from 32.2° ± 6.3° and 33.1° ± 8.4° preoperatively to 10.5° ± 5.5° and 9.9° ± 5.5°, respectively, at the time of final follow-up. The mean first intermetatarsal angle in groups D and M was reduced from 15.0° ± 2.8° and 15.3° ± 2.7° preoperatively to 6.5° ± 2.2° and 6.3° ± 2.4°, respectively, at the final follow-up. The clinical and radiographic outcomes were not significantly different between the two groups. The final clinical and radiographic outcomes between the two approaches for distal soft-tissue procedures were comparable and equally successful. Accordingly, the results of this study suggest that the medial transarticular

  4. Sustainable Approach for Landfill Management at Final Processing Site Cikundul in Sukabumi City, Indonesia

    Sri Darwati


    The main problem of landfill management in Indonesia is the difficulty in getting a location for Final Processing Sites (FPS) due to limited land and high land prices. Besides, about 95% of existing landfills are uncontrolled dumping sites, which could potentially lead to water, soil and air pollution. Based on data from the Ministry of Environment (2010), The Act of the Republic of Indonesia Number 18 Year 2008 Concerning Solid Waste Management, prohibits open dumping at final processing sit...

  5. Approaches in the determination of plant nutrient uptake and distribution in space flight conditions

    Heyenga, A. G.; Forsman, A.; Stodieck, L. S.; Hoehn, A.; Kliss, M.


    The effective growth and development of vascular plants rely on the adequate availability of water and nutrients. Inefficiencies in the initial absorption, transport, or distribution of these elements impinge on plant structure and metabolic integrity. The potential effect of space flight and microgravity conditions on the efficiency of these processes is unclear. Limitations in the available quantity of space-grown plant material and the sensitivity of routine analytical techniques have made an evaluation of these processes impractical. However, the recent introduction of new plant cultivating methodologies supporting the application of radionuclide elements and subsequent autoradiography techniques provides a highly sensitive investigative approach amenable to space flight studies. Experiments involving the use of gel-based 'nutrient packs' and the radionuclides calcium-45 and iron-59 were conducted on the Shuttle mission STS-94. Uptake rates of the radionuclides between ground and flight plant material appeared comparable.

  6. Researching on Hawking Effect in a Kerr Space Time via Open Quantum System Approach

    Liu, Wen-Biao; Liu, Xian-Ming


    It has been proposed that Hawking radiation from a Schwarzschild or a de Sitter spacetime can be understood as the manifestation of thermalization phenomena in the framework of an open quantum system. Through examining the time evolution of a detector interacting with vacuum massless scalar fields, it is found that the detector would spontaneously excite with a probability the same as that of thermal radiation at the Hawking temperature. Following these proposals, the Hawking effect in a Kerr space-time is investigated in the framework of open quantum systems. It is shown that the Hawking effect of the Kerr space-time can also be understood as the manifestation of thermalization phenomena via the open quantum system approach. Furthermore, it is found that near-horizon local conformal symmetry plays the key role in the quantum effect of the Kerr space-time.

  7. Mentoring SFRM: A New Approach to International Space Station Flight Control Training

    Huning, Therese; Barshi, Immanuel; Schmidt, Lacey


    The Mission Operations Directorate (MOD) of the Johnson Space Center is responsible for providing continuous operations support for the International Space Station (ISS). Operations support requires flight controllers who are skilled in team performance as well as the technical operations of the ISS. Space Flight Resource Management (SFRM), a NASA-adapted variant of Crew Resource Management (CRM), is the competency model used in the MOD. ISS flight controller certification has evolved to include a balanced focus on development of SFRM and technical expertise. The latest challenge the MOD faces is how to certify an ISS flight controller (Operator) to a basic level of effectiveness in 1 year. SFRM training uses a two-pronged approach to expediting operator certification: 1) embed SFRM skills training into all Operator technical training, and 2) use senior flight controllers as mentors. This paper focuses on how the MOD uses senior flight controllers as mentors to train SFRM skills.

  8. Zeta-function regularization approach to finite temperature effects in Kaluza-Klein space-times

    Bytsenko, A.A.; Vanzo, L.; Zerbini, S.


    In the framework of the heat-kernel approach to zeta-function regularization, the one-loop effective potential at finite temperature for scalar and spinor fields on a Kaluza-Klein space-time of the form M^p × M_c^n, where M^p is p-dimensional Minkowski space-time, is evaluated. In particular, when the compact manifold is M_c^n = H^n/Γ, the Selberg trace formula associated with a discrete torsion-free group Γ of the n-dimensional Lobachevsky space H^n is used. An explicit representation for the thermodynamic potential valid for arbitrary temperature is found. As a result, a complete high-temperature expansion is presented and the roles of zero modes and topological contributions are discussed.

  9. An integrated mission approach to the space exploration initiative will ensure success

    Coomes, Edmund P.; Dagle, Jefferey E.; Bamberger, Judith A.; Noffsinger, Kent E.


    The direction of the American space program, as defined by President Bush and the National Commission on Space, is to expand human presence into the solar system. Landing an American on Mars by the 50th anniversary of the Apollo 11 lunar landing is the goal. This challenge has produced a level of excitement among young Americans not seen for nearly three decades. The exploration and settlement of the space frontier will occupy the creative thoughts and energies of generations of Americans well into the next century. The return of Americans to the moon and beyond must be viewed as a national effort with strong public support if it is to become a reality. Key to making this an actuality is the mission approach selected. Developing a permanent presence in space requires a continual stepping outward from Earth in a logical, progressive manner. If we seriously plan to go and to stay, then not only must we plan what we are to do and how we are to do it, but we must also address the logistic support infrastructure that will allow us to stay there once we arrive. A fully integrated approach to mission planning is needed if the Space Exploration Initiative (SEI) is to be successful. Only in this way can a permanent human presence in space be sustained. An integrated infrastructure approach would reduce the number of new systems and technologies requiring development. The resultant horizontal commonality of systems and hardware would reduce the direct economic impact of SEI, while an early return on investment through technology spin-offs would be an economic benefit by greatly enhancing our international technical competitiveness. If the exploration, development, and colonization of space is to be affordable and acceptable, careful consideration must be given to such things as "return on investment" and "commercial product potential" of the technologies developed.
    This integrated approach will win the Congressional support needed to secure the financial backing necessary to assure

  10. A Principled Approach to the Specification of System Architectures for Space Missions

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad


    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
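    The kind of machine-checkable architecture specification the abstract argues for can be illustrated with a minimal component model: typed ports and verified connections, in place of informal design documents. The names, port types, and components below are hypothetical, and the paper itself works in SysML rather than code.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Port:
    name: str
    dtype: str        # interface type, e.g. "telemetry" or "power"

@dataclass
class Component:
    name: str
    outputs: list = field(default_factory=list)
    inputs: list = field(default_factory=list)

def connect(src: Component, out_port: str, dst: Component, in_port: str):
    """Create a connection only if the port interface types agree."""
    o = next(p for p in src.outputs if p.name == out_port)
    i = next(p for p in dst.inputs if p.name == in_port)
    if o.dtype != i.dtype:
        raise TypeError(f"{src.name}.{out_port} ({o.dtype}) != "
                        f"{dst.name}.{in_port} ({i.dtype})")
    return (src.name, out_port, dst.name, in_port)

# Two hypothetical subsystems: electrical power and command & data handling.
eps = Component("EPS", outputs=[Port("bus_28v", "power")])
cdh = Component("CDH", inputs=[Port("pwr_in", "power")],
                outputs=[Port("tlm", "telemetry")])
link = connect(eps, "bus_28v", cdh, "pwr_in")
print(link)
```

    Because every connection is an object the tooling can inspect, consistency checks and traceability across abstraction levels become automatic rather than a documentation discipline.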

  11. The balance space approach to multicriteria decision making—involving the decision maker

    Ehrgott, M.


    The balance space approach (introduced by Galperin in 1990) provides a new view on multicriteria optimization. Looking at deviations from global optimality of the different objectives, balance points and balance numbers are defined when either different or equal deviations for each objective are allowed. Apportioned balance numbers allow the specification of proportions among the deviations. Through this concept the decision maker can be involved in the decision process. In this paper we prov...

  12. Approaching the new reality. [changes in NASA space programs due to US economy

    Diaz, Al V.


    The focus on more frequent access to space through smaller, less costly missions, and on NASA's role as a source of technological advance within the U.S. economy is discussed. The Pluto fast flyby mission is examined as an illustration of this approach. Testbeds are to be developed to survive individual programs, becoming permanent facilities, to allow for technological upgrades on an ongoing basis.

  13. Forecasting the Global Mean Sea Level, a Continuous-Time State-Space Approach

    Boldrini, Lorenzo

    In this paper we propose a continuous-time, Gaussian, linear, state-space system to model the relation between the global mean sea level (GMSL) and the global mean temperature (GMT), with the aim of making long-term projections for the GMSL. We provide a justification for the model specification based … and the temperature reconstruction from Hansen et al. (2010). We compare the forecasting performance of the proposed specification to the procedures developed in Rahmstorf (2007b) and Vermeer and Rahmstorf (2009). Finally, we compute projections for the sea-level rise conditional on the 21st century SRES temperature …
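    A discretized version of such a linear Gaussian state-space model can be filtered with a standard Kalman recursion. The sketch below uses made-up coefficients in the spirit of the Rahmstorf-type relation dH/dt = a(T − T0), not the paper's estimated model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: sensitivity a (mm/yr per K), baseline T0 (K),
# annual time step.
a, T0, dt = 3.4, -0.5, 1.0
T = 0.02 * np.arange(100) - 0.3          # toy GMT anomaly path (K)

# Simulate a latent sea level H and noisy observations y of it.
H = np.zeros(100)
for t in range(1, 100):
    H[t] = H[t - 1] + a * (T[t - 1] - T0) * dt + rng.normal(0, 0.5)
y = H + rng.normal(0, 2.0, size=100)

# Scalar Kalman filter: the temperature-driven drift enters the prediction
# step as a known control input.
h_est, P = 0.0, 10.0                     # initial state mean and variance
Q, R = 0.25, 4.0                         # process and measurement noise variances
estimates = []
for t in range(100):
    if t > 0:                            # predict
        h_est += a * (T[t - 1] - T0) * dt
        P += Q
    K = P / (P + R)                      # update with observation y[t]
    h_est += K * (y[t] - h_est)
    P *= (1 - K)
    estimates.append(h_est)

rmse = np.sqrt(np.mean((np.array(estimates) - H) ** 2))
print(round(rmse, 2))  # filtered error is well below the measurement noise
```

    Long-term projections then follow by iterating the prediction step alone, driven by a temperature scenario instead of observations.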

  14. A Programmatic and Engineering Approach to the Development of a Nuclear Thermal Rocket for Space Exploration

    Bordelon, Wayne J., Jr.; Ballard, Rick O.; Gerrish, Harold P., Jr.


    With the announcement of the Vision for Space Exploration on January 14, 2004, there has been a renewed interest in nuclear thermal propulsion. Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions; however, the cost to develop a nuclear thermal rocket engine system is uncertain. Key to determining the engine development cost will be the engine requirements, the technology used in the development and the development approach. The engine requirements and technology selection have not been defined and are awaiting definition of the Mars architecture and vehicle definitions. The paper discusses an engine development approach in light of top-level strategic questions and considerations for nuclear thermal propulsion and provides a suggested approach based on work conducted at the NASA Marshall Space Flight Center to support planning and requirements for the Prometheus Power and Propulsion Office. This work is intended to help support the development of a comprehensive strategy for nuclear thermal propulsion, to help reduce the uncertainty in the development cost estimate, and to help assess the potential value of and need for nuclear thermal propulsion for a human Mars mission.

  15. Review of NASA approach to space radiation risk assessments for Mars exploration.

    Cucinotta, Francis A


    Long duration space missions present unique radiation protection challenges due to the complexity of the space radiation environment, which includes high charge and energy particles and other highly ionizing radiation such as neutrons. Based on a recommendation by the National Council on Radiation Protection and Measurements, a 3% lifetime risk of exposure-induced death for cancer has been used as a basis for risk limitation by the National Aeronautics and Space Administration (NASA) for low-Earth orbit missions. NASA has developed a risk-based approach to radiation exposure limits that accounts for individual factors (age, gender, and smoking history) and assesses the uncertainties in risk estimates. New radiation quality factors with associated probability distribution functions to represent the quality factor's uncertainty have been developed based on track structure models and recent radiobiology data for high charge and energy particles. The current radiation dose limits are reviewed for spaceflight and the various qualitative and quantitative uncertainties that impact the risk of exposure-induced death estimates using the NASA Space Cancer Risk (NSCR) model. NSCR estimates of the number of "safe days" in deep space to be within exposure limits and risk estimates for a Mars exploration mission are described.
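    The "safe days" idea can be illustrated with a toy Monte Carlo that propagates an uncertain quality factor through a dose-to-risk chain and finds the mission length that keeps the risk of exposure-induced death (REID) under the 3% limit at an upper confidence level. Every number below is illustrative; none are NSCR model values.

```python
import numpy as np

rng = np.random.default_rng(2)

dose_rate_mgy_day = 0.5     # assumed absorbed dose rate in deep space (mGy/day)
risk_per_sv = 0.05          # assumed REID per sievert
# Uncertain quality factor represented by a probability distribution,
# mirroring the paper's treatment of quality-factor uncertainty.
Q = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=20000)

def reid_percentile(days, q=0.95):
    """Upper-percentile REID after `days`: dose (mGy -> Sv) x quality x risk."""
    dose_sv = dose_rate_mgy_day * days * Q / 1000.0
    return float(np.quantile(dose_sv * risk_per_sv, q))

# "Safe days": the longest stay whose 95th-percentile REID is under 3%.
safe_days = 0
while reid_percentile(safe_days + 1) < 0.03:
    safe_days += 1
print(safe_days)
```

    The structure mirrors the abstract's point: limits are set against the upper tail of the risk distribution, so reducing the quality-factor uncertainty directly lengthens the allowable mission.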

  16. Innovative Approaches to Development and Ground Testing of Advanced Bimodal Space Power and Propulsion Systems

    Hill, T.; Noble, C.; Martinell, J.; Borowski, S.


    The last major development effort for nuclear power and propulsion systems ended in 1993. Currently, there is not an initiative at either the National Aeronautical and Space Administration (NASA) or the U.S. Department of Energy (DOE) that requires the development of new nuclear power and propulsion systems. Studies continue to show nuclear technology as a strong technical candidate to lead the way toward human exploration of adjacent planets or provide power for deep space missions, particularly a 15,000 lbf bimodal nuclear system with 115 kW power capability. The development of nuclear technology for space applications would require technology development in some areas and a major flight qualification program. The last major ground test facility considered for nuclear propulsion qualification was the U.S. Air Force/DOE Space Nuclear Thermal Propulsion Project. Seven years have passed since that effort, and the questions remain the same: how to qualify nuclear power and propulsion systems for future space flight. It can be reasonably assumed that much of the nuclear testing required to qualify a nuclear system for space application will be performed at DOE facilities, as demonstrated by the Nuclear Rocket Engine Reactor Experiment (NERVA) and Space Nuclear Thermal Propulsion (SNTP) programs. The nuclear infrastructure to support testing in this country is aging and getting smaller, though facilities still exist to support many of the technology development needs. By renewing efforts, an innovative approach to qualifying these systems through the use of existing facilities either in the U.S. (DOE's Advanced Test Reactor, High Flux Irradiation Facility, and the Contained Test Facility) or overseas should be possible.

  19. Study of Modern Approach to Build the Functional Models of Managerial and Engineering Systems in Training Specialists for Space Industry

    N. V. Arhipova


    The SM8 Chair at Bauman Moscow State Technical University (BMSTU) trains specialists majoring not only in design and manufacture, but also in operation and maintenance of space ground-based infrastructure. The learning courses in design, production, and operation of components of missile and space technology give much prominence to modeling. The same attention should be given to the modeling of the managerial and engineering systems with which both an expert and a team leader deal. It is important to choose the modeling tools for such systems and to learn how to apply them. The study of a modern approach to functional modeling of managerial and engineering systems is held in the format of a business game in a laboratory class. The structured analysis and design technique IDEF0 is considered as the means of modeling. The article stresses the advantages of the IDEF0 approach, namely: a comprehensible graphical language, applicability to managerial and engineering systems of all types and all levels of hierarchy, popularity, version control means, and teamwork tools. Moreover, IDEF0 allows us to illustrate such notions as point of view, system boundaries, structure, control, and feedback as applied to managerial and engineering systems. The article offers a modified procedure to create an IDEF0 model in the context of a training session. It also suggests a step-by-step procedure for the instruction session, for student self-study toward course credit, and for the defense of the work (final test). The approach under consideration can be applied to other training courses; the article supports this with the positive experience of its application.

  20. Simulation of the space debris environment in LEO using a simplified approach

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico


    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach, so in different scenarios different objects are fragmented, each contributing to a different realization of the space debris environment. The results of the single Monte-Carlo runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, in general the average of all MC runs together with its standard deviation is used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS) a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources. The natural decay and post-mission disposal measures are the only sink mechanisms. This method reduces the computational costs tremendously. In order to achieve this benefit a few simplifications have been applied. The approach of the model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are also divided into diameter bins. As an extension to a previously presented model, the eccentricity has additionally been taken into account with 67 eccentricity bins. While the set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper, parameters have been derived so that the model is able to reflect the results of the numerical MC simulations.
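A source-sink model of this kind can be sketched in a few lines. The following is a hedged, minimal illustration of the structure described above (altitude shells, intact bodies vs. fragments, explicit Euler integration); all coefficients are illustrative placeholders, not values from the IRAS model.

```python
# Toy source-sink debris model per altitude shell, integrated with the
# explicit Euler method. Launches are a source of intact bodies; collisions
# are a sink for intact bodies and a source of fragments; atmospheric decay
# is the sink for both. All rate constants below are assumptions.
import numpy as np

N_SHELLS = 5                                 # LEO partitioned into shells
YEARS, DT = 100, 0.1                         # horizon and Euler step (years)

launches = np.full(N_SHELLS, 80.0)           # intact objects launched per year
decay = np.linspace(0.2, 0.01, N_SHELLS)     # decay rate (1/yr), lower at altitude
coll_rate = 1e-8                             # collision rate coefficient
frags_per_coll = 100.0                       # fragments produced per collision

intact = np.full(N_SHELLS, 1000.0)
frags = np.full(N_SHELLS, 5000.0)

for _ in range(int(YEARS / DT)):
    collisions = coll_rate * intact * (intact + frags)
    d_intact = launches - decay * intact - collisions
    d_frags = frags_per_coll * collisions - decay * frags
    intact += DT * d_intact                  # explicit Euler step
    frags += DT * d_frags

print(intact.round(0), frags.round(0))
```

In the actual model each population would further be split into diameter and eccentricity bins, with coupling terms between shells; the skeleton above only shows why this is orders of magnitude cheaper than propagating thousands of individual objects.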

  1. Valuing Community Benefits of Final Ecosystem Goods and Services: Human Health and Ethnographic Approaches

    This report provides a summary of three of our research projects: 1) an evaluation of the quality of scientific evidence associating green spaces with health benefits, along with ensuing research in San Juan, Puerto Rico; 2) a Health Impact Assessment of a Long Island sewering pi...

  2. A GOCE-only global gravity field model by the space-wise approach

    Migliaccio, Frederica; Reguzzoni, Mirko; Gatti, Andrea


    The global gravity field model computed by the space-wise approach is one of three official solutions delivered by ESA from the analysis of the GOCE data. The model consists of a set of spherical harmonic coefficients and the corresponding error covariance matrix. The main idea behind this approach […] the orbit to reduce the noise variance and correlation before gridding the data. In the first release of the space-wise approach, based on a period of about two months, some prior information coming from existing gravity field models entered into the solution, especially at low degrees and low orders […] degrees; the second is an internally computed GOCE-only prior model to be used in place of the official quick-look model, thus removing the dependency on EIGEN5C, especially in the polar gaps. Once the procedure to obtain a GOCE-only solution has been outlined, a new global gravity field model has been […]

  3. Thyroid Cartilage Window Approach to Extract a Foreign Body after Migration into the Paraglottic Space

    Sheikha Alkhudher


    We report a case of fish bone impaction in the paraglottic space, which caused palsy of the left vocal cord. The patient was a 45-year-old man who presented with throat pain and hoarseness of voice of approximately one week's duration. The diagnosis was made after careful history taking and confirmed by computed tomography, as the fish bone was not visible endoscopically under local or general anaesthesia. The patient underwent a thyroid cartilage window approach, and the fish bone was retrieved. His symptoms improved significantly, and he did not require tracheostomy. Other reports describe the removal of such foreign bodies by techniques such as laryngofissure and the posterolateral approach. Our case is different in that we used a modification of the thyroplasty type 1 technique, as it has fewer reported complications than the other approaches published in the literature.

  4. Final-state interactions and superscaling in the semi-relativistic approach to quasielastic electron and neutrino scattering

    Amaro, J. E.; Barbaro, M. B.; Caballero, J. A.; Donnelly, T. W.; Udias, J. M.


    The semi-relativistic approach to electron and neutrino quasielastic scattering from nuclei is extended to include final-state interactions. Starting with the usual nonrelativistic continuum shell model, the problem is relativized by using the semi-relativistic expansion of the current in powers of the initial nucleon momentum and relativistic kinematics. Two different approaches are considered for the final-state interactions: the Smith-Wambach 2p-2h damping model and the Dirac-equation-based potential extracted from a relativistic mean field plus the Darwin factor. Using the latter, the scaling properties of (e,e') and (ν_μ, μ^-) cross sections for intermediate momentum transfers are investigated.

  5. Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach

    Kumral, Mustafa; Ozer, Umit


    Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance, reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. Given multiple simulations, the dispersion variances of blocks can be thought of as reflecting technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration in such a way as to minimize grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a minmax problem, which focuses on finding the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of the blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) approach to solve the problem. The technique has two spaces: feasible drill hole configurations, over which interpolation variance is minimized, and drill hole simulations, over which interpolation variance is maximized. The two spaces interact to find a minmax solution.
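The minmax structure can be illustrated with a deliberately tiny sketch. This is an assumption-laden toy, not the paper's method: the deposit is a 1-D strip of blocks, the inner "maximization" space is replaced by a surrogate in which a block's interpolation variance grows with the distance to its nearest drill hole, and the GA is reduced to a simple mutate-and-select loop over drill configurations under a fixed budget.

```python
# Minmax drill placement: minimize, over budget-limited configurations,
# the worst-case (maximum) surrogate interpolation variance over blocks.
import random

random.seed(1)
BLOCKS = [float(i) for i in range(20)]    # block centres along a 1-D strip
BUDGET = 4                                # number of drill holes allowed

def worst_variance(holes):
    # inner space: the block farthest from any hole has the worst variance
    return max(min(abs(b - h) for h in holes) for b in BLOCKS)

def random_config():
    return sorted(random.sample(range(20), BUDGET))

# outer space: evolve drill-hole configurations
pop = [random_config() for _ in range(30)]
for _ in range(40):
    pop.sort(key=worst_variance)          # best worst-case first
    parents = pop[:10]
    children = []
    for p in parents:
        c = p[:]
        c[random.randrange(BUDGET)] = random.randrange(20)   # mutation
        c = sorted(set(c))
        children.append(c if len(c) == BUDGET else random_config())
    pop = parents + children + [random_config() for _ in range(10)]

best = min(pop, key=worst_variance)
print(best, worst_variance(best))
```

The real algorithm replaces the distance surrogate with kriging-based interpolation variance evaluated against simulations, which is exactly why standard solvers fail and a population-based search is used.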

  6. Approaches to learning for the ANZCA Final Examination and validation of the revised Study Process Questionnaire in specialist medical training.

    Weller, J M; Henning, M; Civil, N; Lavery, L; Boyd, M J; Jolly, B


    When evaluating assessments, the impact on learning is often overlooked. Approaches to learning can be deep, surface and strategic. To provide insights into exam quality, we investigated the learning approaches taken by trainees preparing for the Australian and New Zealand College of Anaesthetists (ANZCA) Final Exam. The revised two-factor Study Process Questionnaire (R-SPQ-2F) was modified and validated for this context and was administered to ANZCA advanced trainees. Additional questions were asked about perceived value for anaesthetic practice, study time and approaches to learning for each exam component. Overall, 236 of 690 trainees responded (34%). Responses indicated both deep and surface approaches to learning with a clear preponderance of deep approaches. The anaesthetic viva was valued most highly and the multiple choice question component the least. Despite this, respondents spent the most time studying for the multiple choice questions. The traditionally low short answer questions pass rate could not be explained by limited study time, perceived lack of value or study approaches. Written responses suggested that preparation for multiple choice questions was characterised by a surface approach, with rote memorisation of past questions. Minimal reference was made to the ANZCA syllabus as a guide for learning. These findings indicate that, although trainees found the exam generally relevant to practice and adopted predominantly deep learning approaches, there was considerable variation between the four components. These results provide data with which to review the existing ANZCA Final Exam and comparative data for future studies of the revisions to the ANZCA curriculum and exam process.

  7. Wave-filter-based approach for generation of a quiet space in a rectangular cavity

    Iwamoto, Hiroyuki; Tanaka, Nobuo; Sanada, Akira


    This paper is concerned with the generation of a quiet space in a rectangular cavity using active wave control methodology. It is the purpose of this paper to present the wave filtering method for a rectangular cavity using multiple microphones and its application to an adaptive feedforward control system. Firstly, the transfer matrix method is introduced for describing the wave dynamics of the sound field, and then feedforward control laws for eliminating transmitted waves are derived. Furthermore, some numerical simulations are conducted that show the best possible result of active wave control. This is followed by the derivation of the wave filtering equations, which indicate the structure of the wave filter. It is clarified that the wave filter consists of three portions: a modal group filter, a rearrangement filter and a wave decomposition filter. Next, from a numerical point of view, the accuracy of the wave decomposition filter, which is expressed as a function of frequency, is investigated using condition numbers. Finally, an experiment on the adaptive feedforward control system using the wave filter is carried out, demonstrating that a quiet space is generated in the target space by the proposed method.
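A minimal numerical illustration of why condition numbers matter for a wave decomposition filter: with two microphones, the pressures p(x) = A e^{-jkx} + B e^{+jkx} are decomposed into forward (A) and backward (B) travelling plane waves, and the decomposition matrix degenerates when the spacing approaches a multiple of half a wavelength. The geometry and speed of sound below are placeholder values, not the paper's rig.

```python
# Condition number of a two-microphone plane-wave decomposition matrix
# as a function of frequency.
import numpy as np

c0 = 343.0                     # speed of sound (m/s), assumed
x = np.array([0.0, 0.05])      # microphone positions (m), 5 cm spacing

def decomposition_cond(freq_hz):
    k = 2 * np.pi * freq_hz / c0
    # rows: microphones; columns: forward and backward travelling waves
    M = np.exp(np.outer(x, [-1j * k, 1j * k]))
    return np.linalg.cond(M)

print(decomposition_cond(500.0))    # well conditioned
print(decomposition_cond(3430.0))   # k*d = pi: near-singular decomposition
```

At 3430 Hz the spacing equals half a wavelength, both columns become parallel, and any measurement noise is amplified enormously; this is the frequency-dependent accuracy the paper quantifies.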

  8. Analytical approach to chromatic correction in the final focus system of circular colliders

    Yunhai Cai


    A conventional final focus system in particle accelerators is systematically analyzed. We find simple relations between the parameters of the two focus modules in the final telescope. Using the relations, we derive the chromatic Courant-Snyder parameters for the telescope. The parameters scale approximately as (L^{*}/β_{y}^{*})δ, where L^{*} is the distance from the interaction point to the first quadrupole, β_{y}^{*} the vertical beta function at the interaction point, and δ the relative momentum deviation. Most importantly, we show how to compensate its chromaticity order by order in δ by a traditional correction module flanked by an asymmetric pair of harmonic multipoles. The method enables a circular Higgs collider with 2% momentum aperture and illuminates a path forward to 4% in the future.
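The quoted scaling can be written out as a formula. In this sketch W_y stands for the chromatic amplitude of the vertical Courant-Snyder parameters; that notation is an assumption about the authors' convention, not taken from the abstract.

```latex
% leading-order scaling of the chromatic perturbation in the final telescope
W_y \;\sim\; \frac{L^{*}}{\beta_{y}^{*}}\,\delta
% so halving \beta_y^* at fixed L^* roughly doubles the chromaticity that the
% correction module must cancel order by order in \delta
```

This makes the design tension explicit: a stronger final focus (smaller β_y^*) or a longer stand-off L^* directly enlarges the chromatic perturbation the correction module has to absorb.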

  9. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao


    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website:
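The core loop of VISSA as described above (weighted binary sampling, model population analysis, iterative shrinkage of the variable space) can be sketched compactly. This is a hedged toy: the quality measure is a simple holdout RMSE on synthetic linear data, not the paper's NIR calibration models, and all sizes and thresholds are illustrative.

```python
# Weighted binary matrix sampling (WBMS): draw sub-models with per-variable
# inclusion weights, keep the best sub-models, and set the new weights to the
# inclusion frequency among them, so the variable space shrinks iteratively.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_vars = 60, 12
X = rng.normal(size=(n_samples, n_vars))
beta_true = np.zeros(n_vars)
beta_true[:4] = [3.0, -2.0, 1.5, 1.0]        # only 4 informative variables
y = X @ beta_true + 0.1 * rng.normal(size=n_samples)

Xtr, Xte, ytr, yte = X[:40], X[40:], y[:40], y[40:]

def holdout_rmse(cols):
    if not cols.any():
        return np.inf
    coef, *_ = np.linalg.lstsq(Xtr[:, cols], ytr, rcond=None)
    return float(np.sqrt(np.mean((yte - Xte[:, cols] @ coef) ** 2)))

w = np.full(n_vars, 0.5)                     # initial inclusion weights
for _ in range(15):                          # iterative space shrinkage
    B = rng.random((200, n_vars)) < w        # weighted binary sampling matrix
    scores = np.array([holdout_rmse(row) for row in B])
    best = B[np.argsort(scores)[:20]]        # model population analysis: top 10%
    w = best.mean(axis=0)                    # new weights = inclusion frequency

selected = np.where(w > 0.5)[0]
print(selected)
```

The two rules from the abstract are visible here: the sampled subspace shrinks as weights saturate toward 0 or 1, and each new space is accepted via the performance of its model population rather than a single model.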

  10. Tools in the orbit space approach to the study of invariant functions: rational parametrization of strata

    Sartori, G; Valente, G


    Functions which are equivariant or invariant under the transformations of a compact linear group G acting in a Euclidean space R^n can profitably be studied as functions defined in the orbit space of the group. The orbit space is the union of a finite set of strata, which are semialgebraic manifolds formed by the G-orbits with the same orbit-type. In this paper, we provide a simple recipe to obtain rational parametrizations of the strata. Our results can be easily exploited, in many physical contexts where the study of equivariant or invariant functions is important, for instance in the determination of patterns of spontaneous symmetry breaking, in the analysis of phase spaces and structural phase transitions (Landau theory), in equivariant bifurcation theory, in crystal field theory and in most areas where use is made of symmetry-adapted functions. A physically significant example of utilization of the recipe is given, related to spontaneous polarization in chiral biaxial liquid crystals, where the advantages with respect to previous heuristic approaches are shown.

  12. An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency

    Phillips, Dewanne Marie

    Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, systems engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation identifies knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, and various threats, defects, and vulnerabilities that impact space systems, drawing on hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software

  13. Robust control of uncertain dynamic systems a linear state space approach

    Yedavalli, Rama K


    This textbook aims to provide a clear understanding of the various tools of analysis and design for robust stability and performance of uncertain dynamic systems. In model-based control design and analysis, mathematical models can never completely represent the "real world" system that is being modeled, and thus it is imperative to incorporate and accommodate a level of uncertainty into the models. This book directly addresses these issues from a deterministic uncertainty viewpoint and focuses on the interval parameter characterization of uncertain systems. Various tools of analysis and design are presented in a consolidated manner. This volume fills a current gap in published works by explicitly addressing the subject of control of dynamic systems within a linear state space framework, namely using a time-domain, matrix-theory-based approach. This book also: Presents and formulates the robustness problem in a linear state space model framework; Illustrates various systems-level methodologies with examples and...

  14. A New Approach to Space Situational Awareness using Small Ground-Based Telescopes

    Anheier, Norman C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Cliff S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This report discusses a new SSA approach evaluated by Pacific Northwest National Laboratory (PNNL) that may lead to highly scalable, small telescope observing stations designed to help manage the growing space surveillance burden. Using the methods and observing tools described in this report, the team was able to acquire and track very faint satellites (near Pluto’s apparent brightness). Photometric data was collected and used to correlate object orbital position as a function of atomic clock-derived time. Object apparent brightness was estimated by image analysis and nearby star calibration. The measurement performance was only limited by weather conditions, object brightness, and the sky glow at the observation site. In the future, these new SSA technologies and techniques may be utilized to protect satellite assets, detect and monitor orbiting debris fields, and support Outer Space Treaty monitoring and transparency.

  15. Semiclassical moment of inertia shell-structure within the phase-space approach

    Gorpinchenko, D V; Magner, A G; Bartel, J; Blocki, J P


    The moment of inertia for nuclear collective rotations is derived within a semiclassical approach based on the cranking model and the Strutinsky shell-correction method by using the non-perturbative periodic-orbit theory in the phase-space variables. This moment of inertia for adiabatic (statistical-equilibrium) rotations can be approximated by the generalized rigid-body moment of inertia accounting for the shell corrections of the particle density. A semiclassical phase-space trace formula allows us to express the shell components of the moment of inertia quite accurately in terms of the free-energy shell corrections for integrable and partially chaotic Fermi systems, which is in good agreement with the corresponding quantum calculations.

  16. A semiclassical approach to many-body interference in Fock-space

    Engl, Thomas


    Many-body systems draw ever more physicists' attention. Such an increase of interest often comes along with the development of new theoretical methods. In this thesis, a non-perturbative semiclassical approach is developed which allows many-body interference effects to be studied analytically, both in bosonic and in fermionic Fock space, and which is expected to be applicable to many research areas in physics, ranging from quantum optics and ultracold atoms to solid state theory and maybe even high energy physics. After the derivation of the semiclassical approximation, which is valid in the limit of a large total number of particles, first applications manifesting the presence of many-body interference effects are shown. Some of them are confirmed numerically, thus verifying the semiclassical predictions. Among these results are coherent back- and forward-scattering in bosonic and fermionic Fock space, as well as a many-body spin echo, to name only the two most important.

  17. Parameter retrieval of chiral metamaterials based on the state-space approach.

    Zarifi, Davoud; Soleimani, Mohammad; Abdolali, Ali


    This paper deals with the introduction of an approach for the electromagnetic characterization of homogeneous chiral layers. The proposed method is based on the state-space approach and properties of a 4×4 state transition matrix. First, the forward problem analysis through the state-space method is reviewed, and properties of the state transition matrix of a chiral layer are presented and proved as two theorems. The formulation of the proposed electromagnetic characterization method is then presented. In this method, scattering data for a linearly polarized plane wave incident normally on a homogeneous chiral slab are combined with properties of the state transition matrix to provide a powerful characterization method. The main difference with respect to other well-established retrieval procedures based on the use of the scattering parameters lies in the direct computation of the transfer matrix of the slab, as opposed to the conventional calculation of the propagation constant and impedance of the modes supported by the medium. The proposed approach avoids the nonlinearity of the problem but requires enough equations to fulfill the task, which are obtained from properties of the state transition matrix. To demonstrate the applicability and validity of the method, the constitutive parameters of two well-known dispersive chiral metamaterial structures at microwave frequencies are retrieved. The results show that the proposed method is robust and reliable.
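The state-transition-matrix machinery the abstract builds on can be illustrated generically: the tangential fields inside a layer obey dψ/dz = Aψ, so the layer's 4×4 state transition matrix is M(d) = exp(Ad). The sketch below uses a random traceless A (the actual constitutive form of A for a chiral slab is not reproduced here, so this is an assumption) to show two properties such retrievals exploit: sub-layers cascade by matrix multiplication, and a traceless A yields a unimodular transition matrix.

```python
# State transition matrix M(d) = exp(A d) of a layered state-space model,
# computed by a scaled Taylor series with repeated squaring (NumPy only).
import numpy as np

def expm(A, terms=25, squarings=8):
    """Matrix exponential via scaled Taylor series and repeated squaring."""
    A = A / (2 ** squarings)
    M = np.eye(A.shape[0], dtype=complex)
    T = np.eye(A.shape[0], dtype=complex)
    for n in range(1, terms):
        T = T @ A / n        # accumulate A^n / n!
        M = M + T
    for _ in range(squarings):
        M = M @ M
    return M

rng = np.random.default_rng(3)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A -= (np.trace(A) / 4) * np.eye(4)      # traceless A  ->  det M(d) = 1

M1, M2, M12 = expm(0.3 * A), expm(0.5 * A), expm(0.8 * A)
print(np.allclose(M2 @ M1, M12))        # cascading sub-layers of one medium
print(abs(np.linalg.det(M12)))          # unimodular transition matrix
```

Computing the transfer matrix directly, as these properties allow, is what lets the retrieval sidestep the usual nonlinear extraction of propagation constants and impedances.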

  18. Space Applications and Global Information Infrastructure: a Global Approach against Epidemics

    Bastos, C. R.


    Brazilian space expenditures place the country in the low-middle rank among space-faring nations. In this regard, international partnerships have opened doors for the country to take part in a wider range of projects than would be possible if carried out on its own. Within the above framework, this paper addresses a concept in which countries join efforts in pursuit of common objectives and needs in the field of health, countries whose similarities tend to make them face the same types of health problems. Exactly for this reason, such countries can get together and share the costs, risks and ultimately the benefits of their joint efforts. Infectious diseases are mankind's leading causes of death, and their agents travel around the world by the action of their vectors: insects, birds, winds, infected individuals, and others. The ways in which the Global Information Infrastructure and space applications can help in the detection, identification, tracking and fighting of migratory diseases are then discussed. A concept for an international cooperative initiative is presented, addressing its composition, its implementation, the international coordination requirements, the financial and funding issues related to its implementation and sustainability, and the roles to be played by such an organization. The funding issue deserves closer attention, since many good ideas are killed by financial problems in their early implementation stages. Finally, a conclusion draws the audience's attention to the potential advantages of space-based assets in covering large portions of the Earth, and consequently their suitability for global initiatives for the benefit of mankind.

  19. Modified parity space averaging approaches for online cross-calibration of redundant sensors in nuclear reactors

    Moath Kassim


    To maintain the safety and reliability of reactors, redundant sensors are usually used to measure critical variables and estimate their averaged time dependency. Nonhealthy sensors can badly influence the estimation of the process variable. Since online condition monitoring was introduced, the online cross-calibration method has been widely used to detect any anomaly in sensor readings within the redundant group. The cross-calibration method has four main averaging techniques: simple averaging, band averaging, weighted averaging, and parity space averaging (PSA). PSA weighs redundant signals based on their error bounds and their band consistency. Using the consistency weighting factor (C), PSA assigns more weight to consistent signals that share bands, based on how many bands they share, and gives inconsistent signals very low weight. In this article, three approaches are introduced to improve the PSA technique: the first adds another consistency factor, called trend consistency (TC), to account for the preservation of any characteristic edge that reflects the behavior of the equipment/component measured by the process parameter; the second replaces the error bound/accuracy-based weighting factor (Wa) with a weighting factor based on the Euclidean distance (Wd); and the third applies Wd, TC, and C all together. Cold neutron source data sets from four redundant hydrogen pressure transmitters of a research reactor were used for validation and verification. Results showed that the second and third modified approaches lead to reasonable improvement of the PSA technique. All approaches implemented in this study share the capability to (1) identify and isolate a drifted sensor that should undergo calibration, (2) identify a faulty sensor due to a long and continuous range of missing data, and (3) identify a healthy sensor.
Keywords: Nuclear Reactors
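A minimal sketch of the weighting idea, assuming a single shared error band per time step and using distance to the median as the Euclidean-distance factor; the function name and the exact weight definitions are illustrative, not the article's formulation:

```python
import numpy as np

def psa_average(x, band):
    """Simplified parity-space-style weighted average (illustrative only).

    x    : redundant sensor readings at one time step
    band : error bound assumed identical for every sensor
    C counts how many other sensors fall inside a sensor's band (band
    consistency); Wd penalizes readings far from the ensemble median
    (the Euclidean-distance weight of the second modification).
    """
    x = np.asarray(x, dtype=float)
    # Band-consistency factor: shared-band count per sensor (self excluded)
    C = np.array([np.sum(np.abs(x - xi) <= band) - 1 for xi in x], dtype=float)
    # Distance-based factor replacing the accuracy-based Wa
    Wd = 1.0 / (1.0 + np.abs(x - np.median(x)))
    w = C * Wd
    if w.sum() == 0.0:      # no consistent sensors: fall back to a simple mean
        return float(x.mean())
    return float(np.sum(w * x) / np.sum(w))

# A drifted fourth transmitter receives essentially zero weight:
readings = [10.02, 10.00, 9.99, 12.5]
est = psa_average(readings, band=0.05)
```

The drifted reading of 12.5 shares no band with the healthy group, so its consistency factor is zero and the estimate stays near 10.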

  20. Severe accident approach - final report. Evaluation of design measures for severe accident prevention and consequence mitigation

    Tentner, A.M.; Parma, E.; Wei, T.; Wigeland, R.


    An important goal of the US DOE reactor development program is to conceptualize advanced safety design features for a demonstration Sodium Fast Reactor (SFR). The treatment of severe accidents is one of the key safety issues in the design approach for advanced SFR systems. It is necessary to develop an in-depth understanding of the risk of severe accidents for the SFR so that appropriate risk management measures can be implemented early in the design process. This report presents the results of a review of the SFR features and phenomena that directly influence the sequence of events during a postulated severe accident. The report identifies the safety features used or proposed for various SFR designs in the US and worldwide for the prevention and/or mitigation of Core Disruptive Accidents (CDA). The report provides an overview of the current SFR safety approaches and the role of severe accidents. Mutual understanding of these design features and safety approaches is necessary for future collaborations between the US and its international partners as part of the GEN IV program. The report also reviews the basis for an integrated safety approach to severe accidents for the SFR that reflects the safety design knowledge gained in the US during the Advanced Liquid Metal Reactor (ALMR) and Integral Fast Reactor (IFR) programs. This approach relies on inherent reactor and plant safety performance characteristics to provide additional safety margins. The goal of this approach is to prevent development of severe accident conditions, even in the event of initiators with safety system failures previously recognized to lead directly to reactor damage.

  1. Severe accident approach - final report. Evaluation of design measures for severe accident prevention and consequence mitigation.

    Tentner, A. M.; Parma, E.; Wei, T.; Wigeland, R.; Nuclear Engineering Division; SNL; INL


    An important goal of the US DOE reactor development program is to conceptualize advanced safety design features for a demonstration Sodium Fast Reactor (SFR). The treatment of severe accidents is one of the key safety issues in the design approach for advanced SFR systems. It is necessary to develop an in-depth understanding of the risk of severe accidents for the SFR so that appropriate risk management measures can be implemented early in the design process. This report presents the results of a review of the SFR features and phenomena that directly influence the sequence of events during a postulated severe accident. The report identifies the safety features used or proposed for various SFR designs in the US and worldwide for the prevention and/or mitigation of Core Disruptive Accidents (CDA). The report provides an overview of the current SFR safety approaches and the role of severe accidents. Mutual understanding of these design features and safety approaches is necessary for future collaborations between the US and its international partners as part of the GEN IV program. The report also reviews the basis for an integrated safety approach to severe accidents for the SFR that reflects the safety design knowledge gained in the US during the Advanced Liquid Metal Reactor (ALMR) and Integral Fast Reactor (IFR) programs. This approach relies on inherent reactor and plant safety performance characteristics to provide additional safety margins. The goal of this approach is to prevent development of severe accident conditions, even in the event of initiators with safety system failures previously recognized to lead directly to reactor damage.

  2. Investigation of tt̄ in the full hadronic final state at CDF with a neural network approach

    Sidoti, A; Busetto, G; Castro, A; Dusini, S; Lazzizzera, I; Wyss, J


    In this work we present the results of a neural network (NN) approach to the measurement of the tt̄ production cross section and top mass in the all-hadronic channel, analyzing data collected with the Collider Detector at Fermilab (CDF) experiment. We have used a hardware implementation of a feedforward neural network, TOTEM, the product of a collaboration of INFN (Istituto Nazionale Fisica Nucleare), IRST (Istituto per la Ricerca Scientifica e Tecnologica), and the University of Trento, Italy. Particular attention has been paid to the evaluation of the systematics specifically related to the NN approach. The results are consistent with those obtained at CDF by conventional data selection techniques. (38 refs)

  3. 300 Area dangerous waste tank management system: Compliance plan approach. Final report


    In its December 5, 1989 letter to DOE Richland Operations (DOE-RL), the Washington State Department of Ecology requested that DOE-RL prepare "a plan evaluating alternatives for storage and/or treatment of hazardous waste in the 300 Area...". This document, prepared in response to that letter, presents the proposed approach to bringing the 300 Area into compliance with the federal Resource Conservation and Recovery Act and Washington State's Chapter 173-303 WAC, Dangerous Waste Regulations. It also contains 10 appendices developed as bases for preparing the compliance plan approach. It refers to the Radioactive Liquid Waste System facilities and to the radioactive mixed waste.

  4. A potential theory approach to an algorithm of conceptual space partitioning

    Roman Urban


    This paper proposes a new classification algorithm for the partitioning of a conceptual space. The algorithms used until now have mostly been based on the theory of Voronoi diagrams. This paper proposes an approach based on potential theory, with the criteria for measuring similarities between objects in the conceptual space based on the Newtonian potential function. The notion of a fuzzy prototype, which generalizes the previous definition of a prototype, is introduced. Furthermore, the necessary conditions that a natural concept must meet are discussed. Instead of convexity, as proposed by Gärdenfors, the notion of geodesically convex sets is used: if a concept corresponds to a set which is geodesically convex, it is a natural concept. This definition applies, for example, if the conceptual space is a Euclidean space. As a by-product of the construction of the algorithm, an extension of the conceptual space to d-dimensional Riemannian manifolds is obtained.
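The classification idea can be sketched as follows; this is a hypothetical minimal implementation in which a fuzzy prototype is a set of weighted points and an object is assigned to the prototype exerting the largest Newtonian-type (1/r) potential at its location. The function names and example coordinates are invented for illustration:

```python
import numpy as np

def potential(point, prototype, eps=1e-9):
    """Potential exerted at `point` by a fuzzy prototype.

    A fuzzy prototype is a list of (location, mass) pairs; the potential
    is sum(m / distance), a Newtonian-type kernel.  Sketch of the
    article's idea, not the authors' exact formulation.
    """
    p = np.asarray(point, dtype=float)
    return sum(m / (np.linalg.norm(p - np.asarray(q)) + eps)
               for q, m in prototype)

def classify(point, prototypes):
    """Assign `point` to the prototype generating the largest potential."""
    scores = {name: potential(point, proto) for name, proto in prototypes.items()}
    return max(scores, key=scores.get)

protos = {
    "red":  [((0.0, 0.0), 1.0), ((0.2, 0.1), 0.5)],
    "blue": [((5.0, 5.0), 1.0)],
}
label = classify((0.5, 0.5), protos)   # point lies near the "red" cluster
```

Unlike a Voronoi cell boundary, the decision surface here depends on the masses of the prototype points, which is what makes the prototype "fuzzy".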

  5. ESCORT enhanced diversity and space coding for underground metro and railway transmission - IST 1999-20006 - D2021 report - Final user needs report amendments



    The D2021 report has presented the final user needs definition of the RATP underground (France) and of the Bilbao underground (Spain). These two underground operators have merged into a single approach their final user needs in wireless communications, covering track-to-train communications (voice and data services), the requirements of ground staff (maintenance personnel, garage and station personnel, administration and management personnel), and also the future needs concerning passenger informatio...

  6. Orotracheal Intubation Using the Retromolar Space: A Reliable Alternative Intubation Approach to Prevent Dental Injury

    Linh T. Nguyen


    Despite recent advances in airway management, perianesthetic dental injury remains one of the most common anesthesia-related adverse events and causes of malpractice litigation against anesthesia providers. Recommended precautions for the prevention of dental damage may not always be effective because these techniques involve contact with, and pressure exerted on, vulnerable teeth. We describe a novel approach using the retromolar space to insert a flexible fiberscope for tracheal tube placement as a reliable method to achieve atraumatic tracheal intubation. Written consent for publication has been obtained from the patient.

  7. Learner-Centered Instruction (LCI): Volume 7. Evaluation of the LCI Approach. Final Report.

    Pieper, William J.; And Others

    An evaluation of the learner-centered instruction (LCI) approach to training was conducted by comparing the LCI F-111A weapons control systems mechanic/technician course with the conventional Air Force course for the same Air Force specialty code (AFSC) on the following dimensions: job performance of course graduates, man-hour and dollar costs of…

  8. BRST quantization of Yang-Mills theory: A purely Hamiltonian approach on Fock space

    Öttinger, Hans Christian


    We develop the basic ideas and equations for the BRST quantization of Yang-Mills theories in an explicit Hamiltonian approach, without any reference to the Lagrangian approach at any stage of the development. We present a new representation of ghost fields that combines desirable self-adjointness properties with canonical anticommutation relations for ghost creation and annihilation operators, thus enabling us to characterize the physical states on a well-defined Fock space. The Hamiltonian is constructed by piecing together simple BRST invariant operators to obtain a minimal invariant extension of the free theory. It is verified that the evolution equations implied by the resulting minimal Hamiltonian provide a quantum version of the classical Yang-Mills equations. The modifications and requirements for the inclusion of matter are discussed in detail.
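The algebraic backbone of such a construction can be summarized by the standard BRST relations (generic facts about the formalism, not the paper's specific operator representation):

```latex
\{\hat{c}_a, \hat{c}_b^{\dagger}\} = \delta_{ab}, \qquad
\{\hat{c}_a, \hat{c}_b\} = \{\hat{c}_a^{\dagger}, \hat{c}_b^{\dagger}\} = 0,
\qquad Q^2 = 0,
```

so that ghost creation and annihilation operators obey canonical anticommutation relations, and physical states are identified with (cohomology classes of) states annihilated by the nilpotent BRST charge, $Q\lvert\mathrm{phys}\rangle = 0$, inside the ghost-extended Fock space.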

  9. Researcher’s Academic Culture in the Educational Space of the University: Linguo-Axiological Approach

    Olena Semenog


    The article addresses the nature of the concepts "classic university", "cultural and educational space of the university", "research activity of the future professional", and "researcher's academic culture", and treats academic culture as the basis of research culture in a university. The concept of academic culture is complex: it encompasses the culture of the university in general; the values, traditions, norms, and rules of scientific research; the culture of scientific language; the culture of spirituality and morality; the culture of communication between science tutors and students; and the culture of the unique pedagogical action of the master and his social and moral responsibility for the results of study. Academic culture and one's own style are best formed from the standpoints of the personal-activity, competence-based, axiological, cultural, and acmeological approaches.

  10. The management approach to the NASA space station definition studies at the Manned Spacecraft Center

    Heberlig, J. C.


    The overall management approach to the NASA Phase B definition studies for space stations, which were initiated in September 1969 and completed in July 1972, is reviewed with particular emphasis placed on the management approach used by the Manned Spacecraft Center. The internal working organizations of the Manned Spacecraft Center and its prime contractor, North American Rockwell, are delineated along with the interfacing techniques used for the joint Government and industry study. Working interfaces with other NASA centers, industry, and Government agencies are briefly highlighted. The controlling documentation for the study (such as guidelines and constraints, bibliography, and key personnel) is reviewed. The historical background and content of the experiment program prepared for use in this Phase B study are outlined and management concepts that may be considered for future programs are proposed.

  11. A real-space renormalization approach to the Kubo-Greenwood formula in mirror Fibonacci systems

    Sanchez, Vicenta; Wang Chumin


    An exact real-space renormalization method is developed to address electronic transport in mirror Fibonacci chains at a macroscopic scale by means of the Kubo-Greenwood formula. The results show that the mirror symmetry induces a large number of transparent states in the dc conductivity spectra, contrary to the simple Fibonacci case. A length scaling analysis over ten orders of magnitude reveals the existence of critically localized states, and their ac conduction spectra show highly oscillating behaviour. For multidimensional quasiperiodic systems, a novel renormalization-plus-convolution method is proposed. This combined method has shown extremely high computational efficiency, being able to calculate the electrical conductance of a three-dimensional non-crystalline solid with 10^30 atoms. Finally, the dc and ac conductances of mirror Fibonacci nanowires are also investigated, where a quantized dc-conductance variation with the Fermi energy is found, as observed in gold nanowires.
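The structure under study can be sketched as follows; this builds a Fibonacci word by the usual concatenation rule and reflects it to obtain a mirror chain. The two-letter alphabet and the particular concatenation convention are assumptions of this sketch (conventions differ in the literature), and no transport calculation is attempted:

```python
def fibonacci_word(n):
    """n-th Fibonacci word over {'A', 'B'} via S_{n+1} = S_n + S_{n-1}.

    Word lengths follow the Fibonacci numbers: 1, 2, 3, 5, 8, 13, ...
    """
    a, b = "A", "AB"
    for _ in range(n):
        a, b = b, b + a
    return a

def mirror_chain(n):
    """A mirror Fibonacci chain: a Fibonacci segment followed by its
    reflection -- the symmetry said to induce transparent states."""
    s = fibonacci_word(n)
    return s + s[::-1]

chain = mirror_chain(8)   # palindromic by construction
```

The exact renormalization of the abstract works by decimating such self-similar segments rather than diagonalizing the full chain, which is what makes macroscopic lengths (and, with the convolution step, 10^30-atom solids) reachable.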

  12. Exploration of Plasma Jets Approach to High Energy Density Physics. Final report

    Chen, Chiping [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)


    High-energy-density laboratory plasma (HEDLP) physics is an emerging, important area of research in plasma physics, nuclear physics, astrophysics, and particle acceleration. The HEDLP regime occurs at extreme conditions that are often found naturally in space but not on Earth; it may be reached by colliding high-intensity plasmas such as high-energy-density plasma jets, plasmoids, or compact toroids from plasma guns. The physics of plasma jets is investigated in the context of high energy density laboratory plasma research. This report summarizes the results of a theoretical and computational investigation of a plasma jet undergoing adiabatic compression and adiabatic expansion. A root-mean-squared (rms) envelope theory of plasma jets is developed. Comparison between theory and experiment is made, and good agreement is found.

  13. Definition, development, and demonstration of analytical procedures for the structured assessment approach. Final report


    Analytical procedures were refined for the Structured Assessment Approach for assessing the material control and accounting systems at facilities that contain special nuclear material. Requirements were established for an efficient, feasible algorithm to be used in evaluating system performance measures that involve the probability of detection. Algorithm requirements to calculate the probability of detection for a given type of adversary and target set are described.

  14. The systems approach for applying artificial intelligence to space station automation (Invited Paper)

    Grose, Vernon L.


    The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.

  15. The Final Count Down: A Review of Three Decades of Flight Controller Training Methods for Space Shuttle Mission Operations

    Dittermore, Gary; Bertels, Christie


    Operations of human spaceflight systems is extremely complex; therefore, the training and certification of operations personnel is a critical piece of ensuring mission success. Mission Control Center (MCC-H), at the Lyndon B. Johnson Space Center in Houston, Texas, manages mission operations for the Space Shuttle Program, including the training and certification of the astronauts and flight control teams. An overview of a flight control team's makeup and responsibilities during a flight, and details on how those teams are trained and certified, reveals that while the training methodology for developing flight controllers has evolved significantly over the last thirty years, the core goals and competencies have remained the same. In addition, the facilities and tools used in the control center have evolved. Changes in methodology and tools have been driven by many factors, including lessons learned, technology, shuttle accidents, shifts in risk posture, and generational differences. Flight controllers share their experiences in training and operating the space shuttle. The primary training method throughout the program has been mission simulations of the orbit, ascent, and entry phases, to truly "train like you fly." A review of lessons learned from flight controller training suggests how they could be applied to future human spaceflight endeavors, including missions to the moon or to Mars. The lessons learned from operating the space shuttle for over thirty years will help the space industry build the next human transport space vehicle.

  16. Third International Scientific and Practical Conference «Space Travel is Approaching Reality» (Successful Event in Difficult Times)

    Matusevych Tetiana


    The article analyzes the presentations given by participants of the III International Scientific and Practical Conference «Space Travel – Approaching Reality», held on 6–7 November 2014 in Kharkiv, Ukraine.


    Simona Amankevičiūtė


    This article conceptualizes the image of women in the sexist advertisements of the 1950s and 60s and in current advertising discourse by combining the research traditions of cognitive linguistics and semiotic image analysis. The aim of the research is to evaluate how canonical positionings of women in the hyperreality of advertisements may slip into everyday discourse (stereotype space) and to present an interpretation of the creators' visual lexicon. It is presumed that the traditional approach, formed by feminist linguists, to sexist advertising as an expression of an androcentric worldview in culture may be considered too subjectively critical. This study complements an interpretation of women's social roles in advertising with cognitive-linguistic insights on the visualisation and positioning of the subject (the woman) in ad space. The article briefly overviews the feminist approach to women's place in public discourse and discusses the relevance of Goffman's gender studies to an investigation of women's images in advertising. Goffman's contribution to adapting cognitive frame theory to an investigation of visuals in advertising is also discussed. The analysed ads were divided into three groups by Goffman's classification, according to the concrete visuals used to represent women's bodies or parts thereof: dismemberment, commodification, and subordination ritual. The classified stereotypical images of women's bodies are discussed as visual metonymy, visual metaphor, and image schemas.

  18. Contaminant ingress into multizone buildings: An analytical state-space approach

    Parker, Simon


    The ingress of exterior contaminants into buildings is often assessed by treating the building interior as a single well-mixed space. Multizone modelling provides an alternative way of representing buildings that can estimate concentration time series in different internal locations. A state-space approach is adopted to represent the concentration dynamics within multizone buildings. Analysis based on this approach is used to demonstrate that the exposure in every interior location is limited to the exterior exposure in the absence of removal mechanisms. Estimates are also developed for the short term maximum concentration and exposure in a multizone building in response to a step-change in concentration. These have considerable potential for practical use. The analytical development is demonstrated using a simple two-zone building with an inner zone and a range of existing multizone models of residential buildings. Quantitative measures are provided of the standard deviation of concentration and exposure within a range of residential multizone buildings. Ratios of the maximum short term concentrations and exposures to single zone building estimates are also provided for the same buildings. © 2013 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.
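The state-space dynamics described here can be sketched for a minimal two-zone building (an outer zone exchanging air with the exterior and an inner zone exchanging only with the outer zone). The air-change rates and the explicit-Euler integration are illustrative choices, not the paper's models or values:

```python
import numpy as np

# Two-zone sketch of the state-space model dC/dt = A C + b * c_ext.
# Zone 1 exchanges air with the exterior at rate lam (1/h); zone 2
# exchanges only with zone 1 at rate mu (1/h).  Illustrative values.
lam, mu = 2.0, 0.5
A = np.array([[-(lam + mu), mu],
              [mu, -mu]])
b = np.array([lam, 0.0])

c_ext = 1.0                      # step change in exterior concentration
C = np.zeros(2)                  # interior concentrations [outer, inner]
dt, T = 0.001, 10.0              # time step and horizon, hours
for _ in range(int(T / dt)):
    C = C + dt * (A @ C + b * c_ext)
```

With no removal mechanisms the interior concentrations approach, but never exceed, the exterior level, and the inner zone always lags the outer one, which is the bounded-exposure behaviour the analysis establishes in general.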

  19. Resolving kinematic redundancy with constraints using the FSP (Full Space Parameterization) approach

    Pin, F.G.; Tulloch, F.A.


    A solution method is presented for the motion planning and control of kinematically redundant serial-link manipulators in the presence of motion constraints such as joint limits or obstacles. Given a trajectory for the end-effector, the approach utilizes the recently proposed Full Space Parameterization (FSP) method to generate a parameterized expression for the entire space of solutions of the unconstrained system. At each time step, a constrained optimization technique is then used to analytically find the specific joint motion solution that satisfies the desired task objective and all the constraints active during the time step. The method is applicable to systems operating in a priori known environments or in unknown environments with sensor-based obstacle detection. The derivation of the analytical solution is first presented for a general type of kinematic constraint and is then applied to the problem of motion planning for redundant manipulators with joint limits and obstacle avoidance. Sample results using planar and 3-D manipulators with various degrees of redundancy are presented to illustrate the efficiency and wide applicability of constrained motion planning using the FSP approach.
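For contrast, the kind of redundancy resolution being generalized can be sketched with the classical pseudoinverse-plus-null-space projection, a simpler alternative to the FSP parameterization described in the abstract (the function, gain, and toy Jacobian below are invented for illustration):

```python
import numpy as np

def redundant_step(J, xdot, q, q_center, k=1.0):
    """One velocity-level step for a redundant arm (illustrative).

    Classical pseudoinverse + null-space projection: the secondary term
    pushes joints toward q_center (e.g. away from joint limits) without
    disturbing the end-effector task velocity xdot.
    """
    J_pinv = np.linalg.pinv(J)
    N = np.eye(J.shape[1]) - J_pinv @ J        # null-space projector
    z = -k * (q - q_center)                    # gradient of a limit-avoidance cost
    return J_pinv @ xdot + N @ z

# 3-joint planar arm tracking a 2-D end-effector velocity (toy numbers)
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.7]])
qdot = redundant_step(J, np.array([0.1, 0.0]),
                      q=np.array([0.9, -0.2, 0.1]),
                      q_center=np.zeros(3))
```

Because the secondary motion lives in the Jacobian's null space, the task velocity is reproduced exactly; FSP differs in parameterizing the full solution space so that hard constraints can be imposed analytically rather than through a soft cost.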

  20. Mentoring SFRM: A New Approach to International Space Station Flight Controller Training

    Huning, Therese; Barshi, Immanuel; Schmidt, Lacey


    The Mission Operations Directorate (MOD) of the Johnson Space Center is responsible for providing continuous operations support for the International Space Station (ISS). Operations support requires flight controllers who are skilled in team performance as well as the technical operations of the ISS. Space Flight Resource Management (SFRM), a NASA-adapted variant of Crew Resource Management (CRM), is the competency model used in the MOD. ISS flight controller certification has evolved to include a balanced focus on development of SFRM and technical expertise. The latest challenge the MOD faces is how to certify an ISS flight controller (operator) to a basic level of effectiveness in one year. SFRM training uses a two-pronged approach to expediting operator certification: (1) imbed SFRM skills training into all operator technical training and (2) use senior flight controllers as mentors. This paper focuses on how the MOD uses senior flight controllers as mentors to train SFRM skills. Methods: A mentor works with an operator throughout the training flow. Inserted into the training flow are guided-discussion sessions and on-the-job observation opportunities focusing on specific SFRM skills, including: situational leadership, conflict management, stress management, cross-cultural awareness, self care and team care while on-console, communication, workload management, and situation awareness. The mentor and operator discuss the science and art behind the skills, cultural effects on skills applications, recognition of good and bad skills applications, recognition of how skills application changes subtly in different situations, and individual goals and techniques for improving skills. Discussion: This mentoring program provides an additional means of transferring SFRM knowledge compared to traditional CRM training programs. Our future endeavors in training SFRM skills (as well as those of other organizations) may benefit from adding team-performance-skills mentoring.

  1. Task-space separation principle: a force-field approach to motion planning for redundant manipulators.

    Tommasino, Paolo; Campolo, Domenico


    In this work, we address human-like motor planning in redundant manipulators. Specifically, we want to capture postural synergies such as Donders' law, experimentally observed in humans during kinematically redundant tasks, and infer a minimal set of parameters to implement similar postural synergies in a kinematic model. For the model itself, although the focus of this paper is to solve redundancy by implementing postural strategies derived from experimental data, we also want to ensure that such postural control strategies do not interfere with other possible forms of motion control (in the task space), i.e. solving the posture/movement problem. The redundancy problem is framed as a constrained optimization problem, traditionally solved via the method of Lagrange multipliers. The posture/movement problem can be tackled via the separation principle which, derived from experimental evidence, posits that the brain processes static torques (i.e. posture-dependent, such as gravitational torques) separately from dynamic torques (i.e. velocity-dependent). The separation principle has traditionally been applied at a joint torque level. Our main contribution is to apply the separation principle to Lagrange multipliers, which act as task-space force fields, leading to a task-space separation principle. In this way, we can separate postural control (implementing Donders' law) from various types of task-space movement planners. As an example, the proposed framework is applied to the (redundant) task of pointing with the human wrist. Nonlinear inverse optimization (NIO) is used to fit the model parameters and to capture motor strategies displayed by six human subjects during pointing tasks. The novelty of our NIO approach is that (i) the fitted motor strategy, rather than raw data, is used to filter and down-sample human behaviours; (ii) our framework is used to efficiently simulate model behaviour iteratively, until it converges towards the experimental human strategies.

  2. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science


    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
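The deterministic skeleton of this idea, iterating subdomain solves until interface values agree, can be sketched on a toy problem. This is a classical alternating-Schwarz sweep, not the project's probabilistic sampling scheme; the model problem u'' = 0 on [0, 1] with u(0) = 0, u(1) = 1 has the exact solution u = x, and a 1-D Laplace solve on a subdomain is simply linear interpolation between its boundary values:

```python
import numpy as np

# Split [0, 1] into two overlapping subdomains; solve each with its
# current (uncertain) interface boundary values and iterate until the
# neighboring solutions agree on the overlap.
x = np.linspace(0.0, 1.0, 41)
u = np.zeros_like(x); u[-1] = 1.0          # initial guess with BCs imposed
left = slice(0, 25)                        # indices 0..24  (x in [0, 0.6])
right = slice(16, 41)                      # indices 16..40 (x in [0.4, 1])

def subdomain_solve(u, s):
    """Exact 1-D Laplace solve on a subdomain: linear interpolation
    between the subdomain's current boundary values."""
    a, b = u[s][0], u[s][-1]
    u[s] = np.linspace(a, b, s.stop - s.start)

for _ in range(50):                        # interface values converge geometrically
    subdomain_solve(u, left)
    subdomain_solve(u, right)

err = np.max(np.abs(u - x))                # distance to the global solution
```

Each sweep contracts the interface error by a fixed factor set by the overlap width; the probabilistic reformulation replaces these deterministic interface values with sampled boundary conditions, buying fault tolerance at the same communication pattern.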

  3. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Don W.


    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  4. X-ray analysis of residual stress gradients in TiN coatings by a Laplace space approach and cross-sectional nanodiffraction: a critical comparison.

    Stefenelli, Mario; Todt, Juraj; Riedl, Angelika; Ecker, Werner; Müller, Thomas; Daniel, Rostislav; Burghammer, Manfred; Keckes, Jozef


    Novel scanning synchrotron cross-sectional nanobeam and conventional laboratory as well as synchrotron Laplace X-ray diffraction methods are used to characterize residual stresses in exemplary 11.5 µm-thick TiN coatings. Both real and Laplace space approaches reveal a homogeneous tensile stress state and a very pronounced compressive stress gradient in as-deposited and blasted coatings, respectively. The unique capabilities of the cross-sectional approach operating with a beam size of 100 nm in diameter allow the analysis of stress variation with sub-micrometre resolution at arbitrary depths and the correlation of the stress evolution with the local coating microstructure. Finally, advantages and disadvantages of both approaches are extensively discussed.

  5. Approach to Integrate Global-Sun Models of Magnetic Flux Emergence and Transport for Space Weather Studies

    Mansour, Nagi N.; Wray, Alan A.; Mehrotra, Piyush; Henney, Carl; Arge, Nick; Godinez, H.; Manchester, Ward; Koller, J.; Kosovichev, A.; Scherrer, P.


    The Sun lies at the center of space weather and is the source of its variability. The primary input to coronal and solar wind models is the activity of the magnetic field in the solar photosphere. Recent advancements in solar observations and numerical simulations provide a basis for developing physics-based models for the dynamics of the magnetic field from the deep convection zone of the Sun to the corona with the goal of providing robust near real-time boundary conditions at the base of space weather forecast models. The goal is to develop new strategic capabilities that enable characterization and prediction of the magnetic field structure and flow dynamics of the Sun by assimilating data from helioseismology and magnetic field observations into physics-based realistic magnetohydrodynamics (MHD) simulations. The integration of first-principle modeling of solar magnetism and flow dynamics with real-time observational data via advanced data assimilation methods is a new, transformative step in space weather research and prediction. This approach will substantially enhance an existing model of magnetic flux distribution and transport developed by the Air Force Research Lab. The development plan is to use the Space Weather Modeling Framework (SWMF) to develop Coupled Models for Emerging flux Simulations (CMES) that couples three existing models: (1) an MHD formulation with the anelastic approximation to simulate the deep convection zone (FSAM code), (2) an MHD formulation with full compressible Navier-Stokes equations and a detailed description of radiative transfer and thermodynamics to simulate near-surface convection and the photosphere (Stagger code), and (3) an MHD formulation with full, compressible Navier-Stokes equations and an approximate description of radiative transfer and heating to simulate the corona (Module in BATS-R-US). CMES will enable simulations of the emergence of magnetic structures from the deep convection zone to the corona. 
Finally, a plan

  6. Low-pressure approach to the formation and study of exciplex systems. Final report

    Sanzone, G.


    Under this contract, the following goals were set. (1) Development and construction of an experimental system for the study of the kinetics of excimers, and demonstration of the validity of the low-pressure approach to such studies. The apparatus was to consist of the following: (a) a cluster-molecular-beam source of van der Waals dimers and higher oligomers; (b) a modulated-beam mass spectrometer; (c) a low-energy electron beam for the production of excimers; (d) a vacuum-ultraviolet to visible detection and photon-counting system to monitor excimer emission; (e) a flash-excited tunable laser for studies of resonant self-absorptions. (2) Formation of Ar{sub 2} in its van der Waals ground state. (3) Production of Ar{sub 2}* by electron bombardment of Ar{sub 2}. (4) Fluorescence and photon absorption studies of Ar{sub 2}*. At the end of the contract period, goals 1 and 2 had been met; experiments 3 and 4 had been designed.

  7. A Regionalization Approach to select the final watershed parameter set among the Pareto solutions

    Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.


    The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists among neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Non-dominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude parameter sets that show extreme values for any of the objective functions to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity of a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are used to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter set that minimizes the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
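    The selection step described above can be sketched as follows. The parameter names, weights, and values below are purely illustrative stand-ins (not taken from the actual RDHM calibration), and the weighted relative difference is one plausible form of the closeness measure, not necessarily the one the authors use.

```python
# Hypothetical sketch: given Pareto-optimal parameter sets for the basin
# being regionalized, pick the one whose weighted distance (the
# "closeness measure") to a neighboring basin's parameters is smallest.
pareto_sets = [
    {"uztwm": 55.0, "lzsk": 0.05, "zperc": 40.0},
    {"uztwm": 80.0, "lzsk": 0.09, "zperc": 25.0},
    {"uztwm": 60.0, "lzsk": 0.06, "zperc": 38.0},
]
neighbor = {"uztwm": 58.0, "lzsk": 0.055, "zperc": 39.0}

# Higher weight = parameter assumed more similar across basins
# (in the abstract, weights come from the a priori physical parameters).
weights = {"uztwm": 1.0, "lzsk": 0.5, "zperc": 0.8}

def closeness(candidate, reference, weights):
    # Weighted relative difference, so parameters on very different
    # scales contribute comparably.
    return sum(w * abs(candidate[p] - reference[p]) / abs(reference[p])
               for p, w in weights.items())

best = min(pareto_sets, key=lambda s: closeness(s, neighbor, weights))
print(best["uztwm"])  # → 60.0 (the third set is closest to the neighbor)
```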

  8. Evaluation and demonstration of decentralized space and water heating versus centralized services for new and rehabilitated multifamily buildings. Final report

    Belkus, P. [Foster-Miller, Inc., Waltham, MA (US)]; Tuluca, A. [Steven Winter Associates, Inc., Norwalk, CT (US)]


    The general objective of this research was to develop sufficient technical and economic know-how to convince the building and design communities of the appropriateness and energy advantages of decentralized space and water heating for multifamily buildings. Two main goals were established to guide this research. First, the research sought to determine the cost-benefit advantages of decentralized space and water heating versus centralized systems for multifamily applications based on innovative gas piping and appliance technologies. The second goal was to ensure that this information is made available to the design community.

  9. Path integral approach for superintegrable potentials on spaces of non-constant curvature. Pt. 2. Darboux spaces D{sub III} and D{sub IV}

    Grosche, C. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik]; Pogosyan, G.S. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics; Guadalajara Univ., Jalisco (Mexico). Dept. de Matematicas CUCEI]; Sissakian, A.N. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics]


    This is the second paper on the path integral approach to superintegrable systems on Darboux spaces, spaces of non-constant curvature. We analyze five superintegrable potentials in the space D{sub III} and four in D{sub IV}, which were first given by Kalnins et al. We are able to evaluate the path integral in most of the separating coordinate systems, leading to expressions for the Green functions, the discrete and continuous wave-functions, and the discrete energy spectra. In some cases, however, the discrete spectrum cannot be stated explicitly, because it is determined by a higher-order polynomial equation. We show that the free motion in the Darboux space of type III can also contain bound states, provided the boundary conditions are appropriate. We state the energy spectrum and the wave-functions, respectively. (orig.)

  10. Final Technical Report: "Representing Endogenous Technological Change in Climate Policy Models: General Equilibrium Approaches"

    Ian Sue Wing


    The research supported by this award pursued three lines of inquiry: (1) The construction of dynamic general equilibrium models to simulate the accumulation and substitution of knowledge, which has resulted in the preparation and submission of several papers: (a) A submitted pedagogic paper which clarifies the structure and operation of computable general equilibrium (CGE) models (C.2), and a review article in press which develops a taxonomy for understanding the representation of technical change in economic and engineering models for climate policy analysis (B.3). (b) A paper which models knowledge directly as a homogeneous factor, and demonstrates that inter-sectoral reallocation of knowledge is the key margin of adjustment which enables induced technical change to lower the costs of climate policy (C.1). (c) An empirical paper which estimates the contribution of embodied knowledge to aggregate energy intensity in the U.S. (C.3), followed by a companion article which embeds these results within a CGE model to understand the degree to which autonomous energy efficiency improvement (AEEI) is attributable to technical change as opposed to sub-sectoral shifts in industrial composition (C.4) (d) Finally, ongoing theoretical work to characterize the precursors and implications of the response of innovation to emission limits (E.2). (2) Data development and simulation modeling to understand how the characteristics of discrete energy supply technologies determine their succession in response to emission limits when they are embedded within a general equilibrium framework. This work has produced two peer-reviewed articles which are currently in press (B.1 and B.2). (3) Empirical investigation of trade as an avenue for the transmission of technological change to developing countries, and its implications for leakage, which has resulted in an econometric study which is being revised for submission to a journal (E.1). As work commenced on this topic, the U.S. withdrawal

  11. Approaches to Outdoor Thermal Comfort Thresholds through Public Space Design: A Review

    Andre Santos Nouri


    Based on the Köppen Geiger (KG) classification system, this review article examines existing studies and projects that have endeavoured to address local outdoor thermal comfort thresholds through Public Space Design (PSD). The review is divided into two sequential stages, whereby (1) overall existing approaches to pedestrian thermal comfort thresholds are reviewed within both quantitative and qualitative spectrums; and (2) the different techniques and measures are reviewed and framed into four Measure Review Frameworks (MRFs), in which each type of PSD measure is presented alongside its respective local-scale urban specificities/conditions and the resulting thermal attenuation outcomes. The result of this review article is an assessment of how current practices of PSD within three specific subcategories of the KG ‘Temperate’ group have addressed microclimatic aggravations such as elevated urban temperatures and Urban Heat Island (UHI) effects. Based upon a bottom-up approach, the interdisciplinary practice of PSD is hence approached as a means to address existing and future thermal risk factors within the urban public realm in an era of potential climate change.

  12. Taking SiC Power Devices to the Final Frontier: Addressing Challenges of the Space Radiation Environment

    Lauenstein, Jean-Marie; Casey, Megan


    Silicon carbide power device technology has the potential to enable a new generation of aerospace power systems that demand high efficiency, rapid switching, and reduced mass and volume in order to expand space-based capabilities. For this potential to be realized, SiC devices must be capable of withstanding the harsh space radiation environment. Commercial SiC components exhibit high tolerance to total ionizing dose but to date, have not performed well under exposure to heavy ion radiation representative of the on-orbit galactic cosmic rays. Insertion of SiC power device technology into space applications to achieve breakthrough performance gains will require intentional development of components hardened to the effects of these highly-energetic heavy ions. This work presents heavy-ion test data obtained by the authors over the past several years for discrete SiC power MOSFETs, JFETs, and diodes in order to increase the body of knowledge and understanding that will facilitate hardening of this technology to space radiation effects. Specifically, heavy-ion irradiation data taken under different bias, temperature, and ion beam conditions is presented for devices from different manufacturers, and the emerging patterns discussed.

  13. Final Technical Report - Use of Systems Biology Approaches to Develop Advanced Biofuel-Synthesizing Cyanobacterial Strains

    Pakrasi, Himadri [Washington Univ., St. Louis, MO (United States)


    The overall objective of this project was to use a systems biology approach to evaluate the potentials of a number of cyanobacterial strains for photobiological production of advanced biofuels and/or their chemical precursors. Cyanobacteria are oxygen evolving photosynthetic prokaryotes. Among them, certain unicellular species such as Cyanothece can also fix N2, a process that is exquisitely sensitive to oxygen. To accommodate such incompatible processes in a single cell, Cyanothece produces oxygen during the day, and creates an O2-limited intracellular environment during the night to perform O2-sensitive processes such as N2-fixation. Thus, Cyanothece cells are natural bioreactors for the storage of captured solar energy with subsequent utilization at a different time during a diurnal cycle. Our studies include the identification of a novel, fast-growing, mixotrophic, transformable cyanobacterium. This strain has been sequenced and will be made available to the community. In addition, we have developed genome-scale models for a family of cyanobacteria to assess their metabolic repertoire. Furthermore, we developed a method for rapid construction of metabolic models using multiple annotation sources and a metabolic model of a related organism. This method will allow rapid annotation and screening of potential phenotypes based on the newly available genome sequences of many organisms.

  14. Final Report - Composite Fermion Approach to Strongly Interacting Quasi Two Dimensional Electron Gas Systems

    Quinn, John


    Work related to this project introduced the idea of an effective monopole strength Q* that acted as the effective angular momentum of the lowest shell of composite Fermions (CF). This allowed us to predict the angular momentum of the lowest band of energy states for any value of the applied magnetic field simply by determining N{sub QP}, the number of quasielectrons (QE) or quasiholes (QH) in a partially filled CF shell, and adding the angular momenta of the N{sub QP} Fermion excitations. The approach reported treated the filled CF level as a vacuum state which could support QE and QH excitations. Numerical diagonalization of small systems allowed us to determine the angular momenta, the energies, and the pair interaction energies of these elementary excitations. The spectra of low-energy states could then be evaluated in a Fermi-liquid-like picture, treating the much smaller number of quasiparticles and their interactions instead of the larger system of N electrons with Coulomb interactions.

  15. Final report on the comprehensive approach to energy conservation for the Aboriginal community in Ontario

    Fox, C.D. [Fort William First Nation, Thunder Bay, ON (Canada)


    This report presented a comprehensive approach to energy conservation programming for the Fort William First Nation, located in Thunder Bay, Ontario. The report outlined the historical context of the relationship between the Canadian government and Aboriginal people. The Aboriginal community in Ontario was described with reference to the difference between the First Nations population, Metis, and Inuit. Statistics on the Aboriginal population in Ontario were broken down. Different Aboriginal organizations as well as organizations serving Aboriginal peoples were identified and described. The report also described the political process and administrative protocol for energy conservation and energy efficiency. Energy conservation in the Aboriginal community was also explained. Last, the report provided several recommendations related to awareness and education; translation; incentives; delivery mechanisms; and pilot projects. The report concluded with an agreement to hold a provincial conference in Toronto on the issues raised in the report, and envisioned an Aboriginal unit within the Bureau of Conservation of the Ontario Power Authority to plan, develop, implement, manage and monitor the deliverables resulting from the report.

  16. Final Report for Bio-Inspired Approaches to Moving-Target Defense Strategies

    Fink, Glenn A.; Oehmen, Christopher S.


    This report records the work and contributions of the NITRD-funded Bio-Inspired Approaches to Moving-Target Defense Strategies project performed by Pacific Northwest National Laboratory under the technical guidance of the National Security Agency’s R6 division. The project has incorporated a number of bio-inspired cyber defensive technologies within an elastic framework provided by the Digital Ants. This project has created the first scalable, real-world prototype of the Digital Ants Framework (DAF)[11] and integrated five technologies into this flexible, decentralized framework: (1) Ant-Based Cyber Defense (ABCD), (2) Behavioral Indicators, (3) Bioinformatic Classification, (4) Moving-Target Reconfiguration, and (5) Ambient Collaboration. The DAF can be used operationally to decentralize many such data-intensive applications that normally rely on collection of large amounts of data in a central repository. In this work, we have shown how these component applications may be decentralized and may perform analysis at the edge. Operationally, this will enable analytics to scale far beyond current limitations while not suffering from the bandwidth or computational limitations of centralized analysis. This effort has advanced the R6 Cyber Security research program to secure digital infrastructures by developing a dynamic means to adaptively defend complex cyber systems. We hope that this work will benefit our client’s efforts in system behavior modeling and cyber security, to the overall benefit of the nation.

  17. Final Technical Report -- Bridging the PSI Knowledge Gap: A Multiscale Approach

    Whyte, Dennis [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)


    -species plasmas and metals was experimentally studied with independent measurement methods across the Center. This surprising result challenges the universal use of the binary-collision approximation in sputtering predictions and continues to be the subject of study. In order to address this issue MIT developed a new in situ erosion measurement technique based on ion beam analysis which can be used at elevated material temperatures. This exciting new technique is now being used to study material erosion in high performance plasma thrusters for space exploration and is being adopted to fusion experimental devices. This is an indicator of the positive synergies that arise from such a Center, with the research having impact beyond the initial area of study. The Center also served successfully as an organizing force for communication to the science community. The MIT members of the Center provided many high-profile overview presentations at prestigious international conferences and national workshops. The research resulted in three student theses and 24 peer-reviewed publications. PSI research continues to be identified as a critical area for fusion energy.

  18. A Systems Approach to Bio-Oil Stabilization - Final Technical Report

    Brown, Robert C; Meyer, Terrence; Fox, Rodney; Subramaniam, Shankar; Shanks, Brent; Smith, Ryan G


    The objective of this project is to develop practical, cost effective methods for stabilizing biomass-derived fast pyrolysis oil for at least six months of storage under ambient conditions. The U.S. Department of Energy has targeted three strategies for stabilizing bio-oils: (1) reducing the oxygen content of the organic compounds comprising pyrolysis oil; (2) removal of carboxylic acid groups such that the total acid number (TAN) of the pyrolysis oil is dramatically reduced; and (3) reducing the charcoal content, which contains alkali metals known to catalyze reactions that increase the viscosity of bio-oil. Alkali and alkaline earth metals (AAEM) are known to catalyze decomposition reactions of biomass carbohydrates to produce light oxygenates that destabilize the resulting bio-oil. Methods envisioned to prevent the AAEM from reacting with the biomass carbohydrates include washing the AAEM out of the biomass with water or dilute acid, or infusing an acid catalyst to passivate the AAEM. Infusion of acids into the feedstock to convert all of the AAEM to salts which are stable at pyrolysis temperatures proved to be a much more economically feasible process. Our results from pyrolyzing acid-infused biomass showed increases in the yield of anhydrosugars of greater than 300% while greatly reducing the yield of light oxygenates that are known to destabilize bio-oil. Particulate matter can interfere with combustion or catalytic processing of either syngas or bio-oil. It is also thought to catalyze the polymerization of bio-oil, which increases the viscosity of bio-oil over time. High-temperature bag houses, ceramic candle filters, and moving-bed granular filters have been variously suggested for syngas cleaning at elevated temperatures. High-temperature filtration of bio-oil vapors has also been suggested by the National Renewable Energy Laboratory, although there remain technical challenges to this approach. The fast pyrolysis of biomass yields three main organic

  19. Infinite-mode squeezed coherent states and non-equilibrium statistical mechanics (phase-space-picture approach)

    Yeh, L.


    The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.

  20. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    Quinn, J.J.


    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi{sup 2} study area, including 57 monitoring wells within an area of concern of 1.5 mi{sup 2}. Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data.
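    The nesting idea above can be sketched in a few lines. Everything here is illustrative: inverse-distance weighting stands in for kriging purely to keep the sketch short, and the well coordinates, heads, and grid extents are made up rather than taken from the study.

```python
# Minimal sketch of the two-scale gridding idea: a coarse grid over the
# full study area and a fine grid over the dense area of concern, with
# the fine result nested into (overriding) the coarse one.
wells = [((1000, 1000), 52.0), ((9000, 2000), 48.0),
         ((4200, 4100), 50.5), ((4600, 4400), 50.1), ((4400, 4700), 50.3)]

def idw(x, y, data, power=2.0):
    # Inverse-distance weighting; a kriging estimator would go here.
    num = den = 0.0
    for (wx, wy), head in data:
        d2 = (x - wx) ** 2 + (y - wy) ** 2
        if d2 == 0.0:
            return head          # honor the data exactly at a well
        w = d2 ** (-power / 2.0)
        num += w * head
        den += w
    return num / den

def grid(x0, x1, y0, y1, spacing):
    return {(x, y): idw(x, y, wells)
            for x in range(x0, x1 + 1, spacing)
            for y in range(y0, y1 + 1, spacing)}

coarse = grid(0, 10000, 0, 10000, 500)      # whole study area, 500-unit cells
fine = grid(4000, 5000, 4000, 5000, 100)    # area of concern, 100-unit cells
nested = {**coarse, **fine}                 # fine values take precedence

print(len(coarse), len(fine), len(nested))  # → 441 121 553
```

Because the estimator returns the measured head exactly at well locations, the nested result honors the data at every measurement point, mirroring the property reported in the abstract.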

  1. A multifractal approach to space-filling recovery for PET quantification

    Willaime, Julien M. Y.; Aboagye, Eric O. [Comprehensive Cancer Imaging Centre, Imperial College London, Hammersmith Hospital, London W12 0NN (United Kingdom)]; Tsoumpas, Charalampos [Division of Medical Physics, University of Leeds, LS2 9JT (United Kingdom)]; Turkheimer, Federico E. [Department of Neuroimaging, Institute of Psychiatry, King’s College London, London SE5 8AF (United Kingdom)]


    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV{sub mean}) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic {sup 18}F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical {sup 18}F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV{sub mean} or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
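    The "space-filling index" idea rests on the fractal (box-counting) dimension, which can be sketched as follows. This is a simplified binary box-count on a toy mask, not the paper's multifractal formulation on PET intensities; a filled 2D square is used because its box-counting dimension is exactly 2, making the estimate easy to check.

```python
import math

# Box-counting dimension sketch: count occupied boxes at several box
# sizes, then take the least-squares slope of log(count) vs log(1/size).
n = 64
mask = {(i, j) for i in range(n) for j in range(n)}  # filled square

def boxes_occupied(mask, size):
    # Number of distinct (size x size) boxes containing a mask point.
    return len({(i // size, j // size) for (i, j) in mask})

sizes = [1, 2, 4, 8]
counts = [boxes_occupied(mask, s) for s in sizes]   # [4096, 1024, 256, 64]

xs = [math.log(1.0 / s) for s in sizes]
ys = [math.log(c) for c in counts]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
dim = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
       / sum((x - mx) ** 2 for x in xs))
print(round(dim, 2))  # → 2.0, the space-filling dimension of a 2D region
```

For a real lesion mask the same slope comes out below the embedding dimension, and it is this deviation that the paper exploits to correct TLA estimates.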

  2. High-Payoff Space Transportation Design Approach with a Technology Integration Strategy

    McCleskey, C. M.; Rhodes, R. E.; Chen, T.; Robinson, J.


    A general architectural design sequence is described to create a highly efficient, operable, and supportable design that achieves an affordable, repeatable, and sustainable transportation function. The paper covers the following aspects of this approach in more detail: (1) vehicle architectural concept considerations (including important strategies for greater reusability); (2) vehicle element propulsion system packaging considerations; (3) vehicle element functional definition; (4) external ground servicing and access considerations; and, (5) simplified guidance, navigation, flight control and avionics communications considerations. Additionally, a technology integration strategy is forwarded that includes: (a) ground and flight test prior to production commitments; (b) parallel stage propellant storage, such as concentric-nested tanks; (c) high thrust, LOX-rich, LOX-cooled first stage earth-to-orbit main engine; (d) non-toxic, day-of-launch-loaded propellants for upper stages and in-space propulsion; (e) electric propulsion and aero stage control.

  3. A new approach to the analysis of the phase space of f(R)-gravity

    Carloni, S. [Centro Multidisciplinar de Astrofisica - CENTRA, Instituto Superior Tecnico - IST, Universidade de Lisboa - UL, Avenida Rovisco Pais 1, 1049-001 (Portugal)]


    We propose a new dynamical system formalism for the analysis of f(R) cosmologies. The new approach eliminates the need for cumbersome inversions to close the dynamical system and allows the analysis of the phase space of f(R)-gravity models which cannot be investigated using the standard technique. Differently from previously proposed similar techniques, the new method is constructed in such a way as to associate to the fixed points scale factors which contain four integration constants (i.e. solutions of fourth-order differential equations). In this way new light is shed on the physical meaning of the fixed points. We apply this technique to some f(R) Lagrangians relevant for inflationary and dark energy models.

  4. Solar pumping of solid state lasers for space mission: a novel approach

    Boetti, N. G.; Lousteau, J.; Negro, D.; Mura, E.; Scarpignato, G. C.; Perrone, G.; Milanese, D.; Abrate, S.


    Solar pumped lasers (SPL) can find wide applications in space missions, especially long-lasting ones. In this paper a new technological approach to the realization of an SPL based on fiber laser technology is proposed. We present a preliminary study, focused on evaluation of the active material's performance, towards the realization of a Nd3+-doped fiber laser made of phosphate glass materials, emitting at 1.06 μm. For this research several Nd3+-doped phosphate glass samples were fabricated, with concentrations of Nd3+ up to 10 mol%. Physical and thermal properties of the glasses were measured and their spectroscopic properties are described. The effect of Nd3+ doping concentration on emission spectra and lifetimes was investigated in order to study the effect of concentration quenching on luminescence performance.

  5. Reliability modeling of a hard real-time system using the path-space approach

    Kim, Hagbae


    A hard real-time system, such as a fly-by-wire system, fails catastrophically (e.g. losing stability) if its control inputs are not updated by its digital controller computer within a certain timing constraint called the hard deadline. To assess and validate such systems' reliabilities by using a semi-Markov model that explicitly contains the deadline information, we propose a path-space approach deriving the upper and lower bounds of the probability of system failure. These bounds are derived by using only simple parameters, and they are especially suitable for highly reliable systems which should recover quickly. Analytical bounds are derived for both exponential and Weibull failure distributions encountered commonly, which have proven effective through numerical examples, while considering three repair strategies: repair-as-good-as-new, repair-as-good-as-old, and repair-better-than-old.
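    The role of the hard deadline can be illustrated with a minimal sketch. This is not the paper's path-space derivation: it only shows the per-fault building block, i.e. that a fault causes system failure when the recovery time T exceeds the deadline d, so the relevant quantity is the survival function P(T > d). The deadline and distribution parameters below are made-up numbers.

```python
import math

# Per-fault deadline-miss probability P(T > d) for the two recovery-time
# distribution families such reliability analyses commonly consider.
def p_miss_exponential(deadline, mean_recovery):
    # Exponential survival function: exp(-d / mean).
    return math.exp(-deadline / mean_recovery)

def p_miss_weibull(deadline, scale, shape):
    # Weibull survival function: exp(-(d / scale)^shape).
    return math.exp(-((deadline / scale) ** shape))

d = 0.05  # illustrative 50 ms hard deadline
p_exp = p_miss_exponential(d, mean_recovery=0.01)
p_wb = p_miss_weibull(d, scale=0.01, shape=2.0)
print(p_exp, p_wb)
```

With the same characteristic recovery time, the Weibull case (shape > 1, i.e. recovery times concentrated near the scale) misses the deadline far less often than the heavy-tailed exponential case, which is why the choice of failure distribution matters for the bounds.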

  6. Phase-space description of wave packet approach to electronic transport in nanoscale systems

    Szydłowski, D; Wołoszyn, M; Spisak, B J


    The dynamics of conduction electrons in resonant tunnelling nanosystems is studied within the phase-space approach based on the Wigner distribution function. The time evolution of the distribution function is calculated from the time-dependent quantum kinetic equation for which an effective numerical method is presented. Calculations of the transport properties of a double-barrier resonant tunnelling diode are performed to illustrate the proposed techniques. Additionally, analysis of the transient effects in the nanosystem is carried out and it is shown that for some range of the bias voltage the temporal variations of electronic current can take negative values. The explanation of this effect is based on the analysis of the time changes of the Wigner distribution function. The decay time of the temporal current oscillations in the nanosystem as a function of the bias voltage is determined. (paper)

  7. Truncated Hilbert Space Approach for the 1+1D phi^4 Theory

    CERN. Geneva


    (an informal seminar, not a regular string seminar) We used the massive analogue of the truncated conformal space approach to study the broken phase of the 1+1 dimensional scalar phi^4 model in finite volume, similarly to the work by S. Rychkov and L. Vitale. In our work, the finite size spectrum was determined numerically using an effective eigensolver routine, followed by a simple extrapolation in the cutoff energy. We analyzed both the periodic and antiperiodic sectors. The results were compared with semiclassical and Bethe-Yang results as well as perturbation theory. We obtained the coupling dependence of the infinite volume breather and kink masses for moderate couplings. The results fit well with semiclassical and perturbative estimates, and confirm the conjecture of Mussardo that at most two neutral excitations can exist in the spectrum. We believe that improving our method with the renormalization procedure of Rychkov et al. would enable the measurement of further interesting quantities such as decay rates.
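    The core of a truncated-space computation can be sketched for the 0+1 dimensional analogue: an anharmonic oscillator diagonalized in a truncated harmonic-oscillator basis, with convergence checked by raising the cutoff. This is a toy stand-in for the field-theory calculation; the cutoff dimensions and coupling below are arbitrary.

```python
import numpy as np

def spectrum(g, ncut):
    """Spectrum of H = p^2/2 + x^2/2 + g*x^4 in a truncated
    harmonic-oscillator basis of dimension ncut."""
    n = np.arange(ncut - 1)
    a = np.diag(np.sqrt(n + 1.0), k=1)        # annihilation operator
    x = (a + a.T) / np.sqrt(2.0)              # x = (a + a†)/√2
    p2 = -((a - a.T) @ (a - a.T)) / 2.0       # p^2, with p = i(a† - a)/√2
    h = p2 / 2.0 + x @ x / 2.0 + g * np.linalg.matrix_power(x, 4)
    return np.sort(np.linalg.eigvalsh(h))

# Raising the cutoff changes the low-lying levels less and less,
# which justifies the extrapolation in cutoff energy.
e40 = spectrum(0.1, 40)
e60 = spectrum(0.1, 60)
assert abs(e40[0] - e60[0]) < 1e-4
```

    The field-theory version replaces the oscillator basis with the free massive Fock basis in finite volume, but the truncate-diagonalize-extrapolate workflow is the same.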

  8. A new approach for the evaluation of the effective electrode spacing in spherical ion chambers

    Maghraby, Ahmed M., E-mail: [National Institute of Standards (NIS), Ionizing Radiation Metrology Laboratory, Tersa Street 12211, Giza P.O. Box: 136 (Egypt); Shqair, Mohammed [Physics Department, Faculty of Science and Humanities, Sattam Bin Abdul Aziz University, Alkharj (Saudi Arabia)


    Proper determination of the effective electrode spacing (d{sub eff}) of an ion chamber ensures proper determination of its collection efficiency in either continuous or pulsed radiation, in addition to proper evaluation of the transit time. Boag's method for determining d{sub eff} assumes a spherical internal electrode in spherical ion chambers, which is not always the case: apart from a few designs, the internal electrode is commonly cylindrical. The present work provides a new approach for evaluating the effective electrode spacing in spherical ion chambers that accounts for the cylindrical shape of the internal electrode. Results indicated that d{sub eff} values obtained through the present work are less than those obtained using Boag's method by factors ranging from 12.1% to 26.9%. The present method also affects the numerically evaluated collection efficiency (f): values obtained differ by up to 3% at low potential (V) values, while at high V values only minor differences were noticed. Impacts on the evaluation of the transit time (τ{sub i}) were also obtained. It is concluded that approximating the internal electrode as a sphere may result in false values of d{sub eff}, f, and τ{sub i}.

  9. Space station electrical power distribution analysis using a load flow approach

    Emanuel, Ervin M.


    The space station's electrical power system will evolve and grow in a manner much like that of present terrestrial electrical power utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase at 20 kHz, grow to a level of 300 kW steady state, and be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of the Electrical Distribution System Analysis (EDSA) program are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
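    As a toy illustration of the load-flow method itself (not the EDSA program), a Gauss-Seidel iteration for a single slack bus feeding one PQ load bus over one line can be written as follows; all per-unit values are hypothetical.

```python
import numpy as np

# Minimal Gauss-Seidel load-flow sketch: slack bus 0 feeds PQ load bus 1.
y_line = 1.0 / (0.01 + 0.05j)          # line series admittance (p.u.)
ybus = np.array([[y_line, -y_line],
                 [-y_line, y_line]])    # 2-bus admittance matrix
v = np.array([1.0 + 0j, 1.0 + 0j])     # flat start; bus 0 held at 1.0∠0
s_load = 0.5 + 0.2j                    # PQ load at bus 1 (p.u.)

for _ in range(100):                   # Gauss-Seidel voltage updates
    v[1] = (-np.conj(s_load) / np.conj(v[1]) - ybus[1, 0] * v[0]) / ybus[1, 1]

# At convergence the injected power at bus 1 equals minus the load.
p_calc = (v[1] * np.conj(ybus[1, 0] * v[0] + ybus[1, 1] * v[1])).real
assert abs(p_calc + s_load.real) < 1e-9
```

    Production load-flow tools use the same fixed-point idea (or Newton-Raphson) over hundreds of busses, which is what makes the method suitable for planning against load growth.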

  10. Ethical approach to digital skills. Sense and use in virtual educational spaces



    In the context of technology and cyberspace, should we do everything we can do? The answer usually given to this question is not ethical but political: safety. Safety and security overshadow the ethical question about the meaning of technology. Cyberspace imposes a "new logic" and new forms of "ownership". When it comes to children and the Internet, a logic of accountability toward cyberspace is not always adopted, even though the Internet is a space that is not only technical but also ethical. We talk about a safe Internet, a healthy Internet, and an Internet fit for children... so why not talk about Internet ethics? In this work we approach digital skills as those skills that help us position and orient ourselves in cyberspace, something that is not possible without ethical skills as well. In this article we therefore try to build and propose a model for analyzing virtual learning spaces (and cyberspace in general) based on the categories of "use" and "sense" as different levels of appropriation that indicate the types of competences needed to access cyberspace.

  11. Scattering in quantum field theory: the M.P.S.A. approach in complex momentum space

    Bros, J.


    In this course, we intend to show how 'Many-Particle Structure Analysis' (M.P.S.A.) can be worked out in the standard field-theoretical framework, by using integral relations in complex momentum space involving 'l-particle irreducible kernels'. The ultimate purpose of this approach is to obtain the best possible knowledge of the singularities (location, nature, type of ramification) and of the ambient holomorphy (or meromorphy) domains of the n-point Green functions and scattering amplitudes, and at the same time to derive analytic structural equations for them which display the global organization of these singularities. The generation of Landau singularities for integrals and Fredholm resolvents, taken on cycles in complex space, will be explained on the basis of the Picard-Lefschetz formula (presented and used in simple situations). Among various results described, we present and analyse a structural equation for the six-point function (and for the 3 → 3 particle scattering function), valid in a domain containing the three-particle normal threshold

  12. Modeling solvation effects in real-space and real-time within density functional approaches

    Delgado, Alain [Istituto Nanoscienze - CNR, Centro S3, via Campi 213/A, 41125 Modena (Italy); Centro de Aplicaciones Tecnológicas y Desarrollo Nuclear, Calle 30 # 502, 11300 La Habana (Cuba); Corni, Stefano; Pittalis, Stefano; Rozzi, Carlo Andrea [Istituto Nanoscienze - CNR, Centro S3, via Campi 213/A, 41125 Modena (Italy)


    The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the OCTOPUS code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.
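    The regularization idea can be sketched in isolation: spreading a point charge into a spherical Gaussian of width sigma turns the singular Coulomb potential q/r into the finite q·erf(r/(√2·sigma))/r. A minimal example in atomic units; the width value is arbitrary.

```python
import math

def v_point(q, r):
    """Coulomb potential of a point charge: diverges as r -> 0."""
    return q / r

def v_gaussian(q, r, sigma):
    """Potential of the same charge spread as a spherical Gaussian of
    width sigma: finite everywhere, and -> q/r for r >> sigma."""
    if r < 1e-12:
        return q * math.sqrt(2.0 / math.pi) / sigma   # analytic r -> 0 limit
    return q * math.erf(r / (math.sqrt(2.0) * sigma)) / r

# Far from the charge the two potentials agree ...
assert abs(v_point(1.0, 10.0) - v_gaussian(1.0, 10.0, 0.5)) < 1e-12
# ... but only the Gaussian one stays finite near a grid point.
assert v_gaussian(1.0, 0.0, 0.5) < math.inf
```

    On a real-space grid this keeps the reaction potential smooth at points close to the cavity surface while leaving the long-range behaviour untouched.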

  13. A novel method for creating working space during endoscopic thyroidectomy via bilateral areolar approach.

    Tan, Yi-Hong; Du, Guo-Neng; Xiao, Yu-Gen; Qiu, Wan-Shou; Wu, Tao


    Endoscopic thyroidectomy (ET) can be performed through the bilateral areolar approach (BAA). A working space (WS) is typically created on the surface of the pectoral fascia in the chest wall and in the subplatysmal space in the neck. There are several limitations to using this WS. The aim of this study was to establish a new WS for ET. A retrospective review was performed on 85 patients with benign thyroid nodules who had undergone ET through a BAA. A WS was created between the anterior and posterior layers of the superficial pectoral fascia (SPF) in the chest and underneath the deep layer of the investing layer (IL) in the neck. The time for creating the WS was 7.2 ± 2.1 (range, 5-12) minutes. No hemorrhage occurred during the procedure. Fat liquefaction occurred in 2 patients. Edema of the neck skin flap presented as loss of the suprasternal notch contour. No skin numbness occurred. No patient required postoperative pain medication. All patients were extremely satisfied with the cosmetic results. This new method of establishing a WS between the two layers of the SPF and underneath the IL is simple and fast, provides good exposure, causes less postoperative pain, and has a lower risk of skin burn.

  14. Modelling airborne gravity data by means of adapted Space-Wise approach

    Sampietro, Daniele; Capponi, Martina; Hamdi Mansi, Ahmed; Gatti, Andrea


    Regional gravity field modelling by means of the remove-restore procedure is nowadays widely applied to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.) in gravimetric geoid determination as well as in exploration geophysics. For this last application, given the required accuracy and resolution, airborne gravity observations are generally adopted. However, due to the relatively high acquisition velocity, atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations at both low and high frequencies should be applied to recover valuable information. In this work, a procedure to predict a grid, or a set of filtered along-track gravity anomalies, by merging a GGM and an airborne dataset is presented. The proposed algorithm, like the Space-Wise approach developed by Politecnico di Milano in the framework of GOCE data analysis, is based on a combination of an along-track Wiener filter and a Least Squares Collocation adjustment, and properly accounts for the different altitudes of the gravity observations. A main difference with respect to the satellite application of the Space-Wise approach is that, while in processing GOCE data the stochastic characteristics of the observation error can be considered well known a priori, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and must be retrieved from the dataset itself. Some innovative theoretical aspects, focusing in particular on covariance modelling, are presented too. In the end, the goodness of the procedure is evaluated by means of a test on real data, recovering the gravitational signal with a predicted accuracy of about 0.25 mGal.
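    The along-track Wiener filtering step can be illustrated with a minimal 1D sketch, assuming (unlike the airborne case above) that the signal and noise power spectra are known; all data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2048
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 256.0)     # smooth stand-in "gravity" signal
noise = rng.normal(0.0, 1.0, n)            # large observation error
obs = signal + noise

# Frequency-domain Wiener filter H = Ps / (Ps + Pn). Here the spectra are
# taken as known; in the airborne case they must be estimated from the data.
f_obs = np.fft.rfft(obs)
ps = np.abs(np.fft.rfft(signal)) ** 2 / n  # signal power spectrum
pn = np.full_like(ps, 1.0)                 # white noise, unit variance
filtered = np.fft.irfft(f_obs * ps / (ps + pn), n)

# The filter suppresses most of the observation error.
assert np.mean((filtered - signal) ** 2) < 0.1 * np.mean((obs - signal) ** 2)
```

    The full procedure couples such a filter with Least Squares Collocation so that observations at different flight altitudes are combined consistently.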

  15. Design Space Approach for Preservative System Optimization of an Anti-Aging Eye Fluid Emulsion.

    Lourenço, Felipe Rebello; Francisco, Fabiane Lacerda; Ferreira, Márcia Regina Spuri; Andreoli, Terezinha De Jesus; Löbenberg, Raimar; Bou-Chacra, Nádia


    The use of preservatives must be optimized in order to ensure the efficacy of an antimicrobial system as well as product safety. Despite the wide variety of preservatives, the synergistic or antagonistic effects of their combinations are not well established, and this remains an issue in the development of pharmaceutical and cosmetic products. The purpose of this paper was to establish a design space using a simplex-centroid approach to achieve the lowest effective concentration of 3 preservatives (methylparaben, propylparaben, and imidazolidinyl urea) and EDTA for an emulsion cosmetic product. Twenty-two emulsion formulae differing only in imidazolidinyl urea (A: 0.00 to 0.30% w/w), methylparaben (B: 0.00 to 0.20% w/w), propylparaben (C: 0.00 to 0.10% w/w) and EDTA (D: 0.00 to 0.10% w/w) concentrations were prepared. They were tested alone and in binary, ternary and quaternary combinations. Aliquots of these formulae were inoculated with several microorganisms. An electrochemical method was used to determine microbial burden immediately after inoculation and after 2, 4, 8, 12, 24, 48, and 168 h. An optimization strategy was used to obtain the concentrations of preservatives and EDTA resulting in the preservative system most effective against all microorganisms simultaneously. The use of preservatives and EDTA in combination has the advantage of exhibiting a potential synergistic effect against a wider spectrum of microorganisms. Based on graphic and optimization strategies, we proposed a new formula containing a quaternary combination (A: 55%; B: 30%; C: 5% and D: 10% w/w), which complies with the specification of a conventional challenge test. A design space approach was successfully employed in the optimization of concentrations of preservatives and EDTA in an emulsion cosmetic product.
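    For reference, the candidate blends of a simplex-centroid design can be enumerated with a short sketch; the factor names follow the abstract's A-D coding, but the routine is a generic illustration, not the authors' software.

```python
from itertools import combinations

def simplex_centroid(components):
    """All 2^k - 1 blends of a simplex-centroid design: every non-empty
    subset of components mixed in equal proportions (fractions sum to 1)."""
    k = len(components)
    design = []
    for size in range(1, k + 1):
        for subset in combinations(components, size):
            design.append({c: (1.0 / size if c in subset else 0.0)
                           for c in components})
    return design

runs = simplex_centroid(["A", "B", "C", "D"])
assert len(runs) == 15                       # 2^4 - 1 candidate blends
assert all(abs(sum(r.values()) - 1.0) < 1e-12 for r in runs)
```

    A pure simplex-centroid design for four components yields 15 blends; additional check points (as in the study's 22 formulae) are commonly appended to improve model validation.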

  16. Finite frequency shear wave splitting tomography: a model space search approach

    Mondal, P.; Long, M. D.


    Observations of seismic anisotropy provide key constraints on past and present mantle deformation. A common method for characterizing upper mantle anisotropy is to measure shear wave splitting parameters (delay time and fast direction). However, the interpretation is not straightforward, because splitting measurements represent an integration of structure along the ray path. A tomographic approach that allows for localization of anisotropy is desirable; however, tomographic inversion for anisotropic structure is a daunting task, since 21 parameters are needed to describe general anisotropy. Such a large parameter space does not allow a straightforward application of tomographic inversion. Building on previous work on finite frequency shear wave splitting tomography, this study aims to develop a framework for SKS splitting tomography with a new parameterization of anisotropy and a model space search approach. We reparameterize the full elastic tensor, reducing the number of parameters to three (a measure of strength based on symmetry considerations for olivine, plus the dip and azimuth of the fast symmetry axis). We compute Born-approximation finite frequency sensitivity kernels relating model perturbations to splitting intensity observations. The strong dependence of the sensitivity kernels on the starting anisotropic model, and thus the strong non-linearity of the inverse problem, makes a linearized inversion infeasible. Therefore, we implement a Markov Chain Monte Carlo technique in the inversion procedure. We have performed tests with synthetic data sets to evaluate computational costs and infer the resolving power of our algorithm for synthetic models with multiple anisotropic layers. Our technique can resolve anisotropic parameters on length scales of ~50 km for realistic station and event configurations for dense broadband experiments. We are proceeding towards applications to real data sets, with an initial focus on the High Lava Plains of Oregon.
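    The model space search can be sketched with a random-walk Metropolis sampler on a toy one-parameter posterior: a hypothetical Gaussian misfit centred on a "true" fast-axis azimuth, standing in for the real splitting-intensity misfit.

```python
import math
import random

random.seed(1)

def log_post(theta):
    """Toy log-posterior: Gaussian misfit centred on a 'true' fast-axis
    azimuth of 40 degrees with sigma = 5 (entirely hypothetical)."""
    return -0.5 * ((theta - 40.0) / 5.0) ** 2

def metropolis(n_steps, step=4.0):
    theta, samples = 0.0, []
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)            # random-walk proposal
        if math.log(random.random()) < log_post(prop) - log_post(theta):
            theta = prop                                   # accept
        samples.append(theta)                              # else keep current
    return samples

chain = metropolis(20000)[5000:]                           # discard burn-in
mean = sum(chain) / len(chain)
assert abs(mean - 40.0) < 1.0                              # recovers the peak
```

    The real inversion samples three parameters per model cell and re-evaluates finite-frequency kernels along the chain, but the accept/reject machinery is exactly this.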

  17. Quantum harmonic Brownian motion in a general environment: A modified phase-space approach

    Yeh, L.


    After extensive investigations over three decades, the linear-coupling model and its equivalents have become the standard microscopic models for quantum harmonic Brownian motion, in which a harmonically bound Brownian particle is coupled to a quantum dissipative heat bath of general type modeled by infinitely many harmonic oscillators. The dynamics of these models have been studied by many authors using the quantum Langevin equation, the path-integral approach, quasi-probability distribution functions (e.g., the Wigner function), etc. However, the quantum Langevin equation is only applicable to some special problems, while other approaches all involve complicated calculations due to the inevitable reduction (i.e., contraction) operation for ignoring/eliminating the degrees of freedom of the heat bath. In this dissertation, the author proposes an improved methodology via a modified phase-space approach which employs the characteristic function (the symplectic Fourier transform of the Wigner function) as the representative of the density operator. This representative is claimed to be the most natural one for performing the reduction, not only because of its simplicity but also because of its manifestation of geometric meaning. Accordingly, it is particularly convenient for studying the time evolution of the Brownian particle with an arbitrary initial state. The power of this characteristic function is illuminated through a detailed study of several physically interesting problems, including the environment-induced damping of quantum interference, the exact quantum Fokker-Planck equations, and the relaxation of non-factorizable initial states. All derivations and calculations are shown to be much simplified in comparison with other approaches. In addition to dynamical problems, a novel derivation of the fluctuation-dissipation theorem which is valid for all quantum linear systems is presented.

  18. Disease severity, not operative approach, drives organ space infection after pediatric appendectomy.

    Kelly, Kristin N; Fleming, Fergal J; Aquina, Christopher T; Probst, Christian P; Noyes, Katia; Pegoli, Walter; Monson, John R T


    This study examines patient and operative factors associated with organ space infection (OSI) in children after appendectomy, specifically focusing on the role of operative approach. Although controversy exists regarding the risk of increased postoperative intra-abdominal infections after laparoscopic appendectomy, this approach has been largely adopted in the treatment of pediatric acute appendicitis. Children aged 2 to 18 years undergoing open or laparoscopic appendectomy for acute appendicitis were selected from the 2012 American College of Surgeons Pediatric National Surgical Quality Improvement Program database. Univariate analysis compared patient and operative characteristics with 30-day OSI and incisional complication rates. Factors with a P value of less than 0.1 and clinical importance were included in the multivariable logistic regression models. A P value of less than 0.05 was considered significant. For 5097 children undergoing appendectomy, 4514 surgical procedures (88.6%) were performed laparoscopically. OSI occurred in 155 children (3%), with half of these infections developing post-discharge. Significant predictors of OSI included complicated appendicitis, preoperative sepsis, wound class III/IV, and longer operative time. Although 5.2% of patients undergoing open surgery developed OSI (odds ratio = 1.82; 95% confidence interval, 1.21-2.76; P = 0.004), operative approach was not associated with increased relative odds of OSI (odds ratio = 0.99; confidence interval, 0.64-1.55; P = 0.970) after adjustment for other risk factors. Overall, the model had excellent predictive ability (c-statistic = 0.837). This model suggests that disease severity, not operative approach as previously suggested, drives OSI development in children. Although 88% of appendectomies in this population were performed laparoscopically, these findings support utilization of the surgeon's preferred surgical technique and may help guide postoperative counseling in high-risk children.
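    Unadjusted odds ratios of the kind quoted above come from standard 2×2-table arithmetic, which can be sketched directly; the counts below are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = outcome yes/no in group 1, c/d = outcome yes/no in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30/570 events in one group vs 125/4375 in the other.
or_, lo, hi = odds_ratio_ci(30, 570, 125, 4375)
assert lo < or_ < hi
```

    The adjusted odds ratios in the abstract additionally condition on disease severity and other covariates via multivariable logistic regression, which is why they can differ markedly from the crude 2×2 figure.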

  19. Concurrent Multidisciplinary Preliminary Assessment of Space Systems (COMPASS) Final Report: Advanced Long-Life Lander Investigating the Venus Environment (ALIVE)

    Oleson, Steven R.


    The COncurrent Multidisciplinary Preliminary Assessment of Space Systems (COMPASS) Team partnered with the Applied Research Laboratory to perform a NASA Innovative Advanced Concepts (NIAC) Program study to evaluate chemical based power systems for keeping a Venus lander alive (power and cooling) and functional for a period of days. The mission class targeted was either a Discovery ($500M) or New Frontiers ($750M to $780M) class mission.

  20. A simulation based optimization approach to model and design life support systems for manned space missions

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainties in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help making design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.

  1. Application of a Systems Engineering Approach to Support Space Reactor Development

    Wold, Scott


    In 1992, approximately 25 Russian and 12 U.S. engineers and technicians were involved in the transport, assembly, inspection, and testing of over 90 tons of Russian equipment associated with the Thermionic System Evaluation Test (TSET) Facility. The entire Russian Baikal Test Stand, consisting of a 5.79 m tall vacuum chamber and related support equipment, was reassembled and tested at the TSET facility in less than four months. In November 1992, the first non-nuclear operational test of a complete thermionic power reactor system in the U.S. was accomplished three months ahead of schedule and under budget. A major factor in this accomplishment was the application of a disciplined top-down systems engineering approach and application of a spiral development model to achieve the desired objectives of the TOPAZ International Program (TIP). Systems Engineering is a structured discipline that helps programs and projects conceive, develop, integrate, test and deliver products and services that meet customer requirements within cost and schedule. This paper discusses the impact of Systems Engineering and a spiral development model on the success of the TOPAZ International Program and how the application of a similar approach could help ensure the success of future space reactor development projects

  2. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard


    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight-binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistors (MOSFETs) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.
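    The projection at the heart of any mode-space method can be sketched with dense matrices: build a reduced basis from selected eigenvectors of a cross-section Hamiltonian and project the full operator onto it. The Hamiltonian below is a random symmetric matrix, a hypothetical stand-in for a tight-binding slab block.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 200-orbital cross-section Hamiltonian; a real nanowire slab
# would have thousands of atomic orbitals.
n_full, n_modes = 200, 12
h = rng.normal(size=(n_full, n_full))
h = (h + h.T) / 2.0                       # make it Hermitian

# Mode-space basis: keep the n_modes lowest-energy eigenvectors ...
evals, evecs = np.linalg.eigh(h)
u = evecs[:, :n_modes]                    # (n_full x n_modes) basis matrix

# ... and project the Hamiltonian into that small basis.
h_ms = u.T @ h @ u                        # (n_modes x n_modes)

assert h_ms.shape == (n_modes, n_modes)
assert np.allclose(np.diag(h_ms), evals[:n_modes])   # spectrum preserved
```

    The hard part addressed by the paper is choosing and optimizing the basis so that the reduced model stays accurate (and free of spurious states) across the full energy window of transport, which a naive eigenvector truncation does not guarantee.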

  3. The approach to risk analysis in three industries: nuclear power, space systems, and chemical process

    Garrick, B.J.


    The aerospace, nuclear power, and chemical processing industries are providing much of the incentive for the development and application of advanced risk analysis techniques to engineered systems. Risk analysis must answer three basic questions: What can go wrong? How likely is it? And what are the consequences? The result of such analyses is not only a quantitative answer to the question 'What is the risk?' but, more importantly, a framework for intelligent and visible risk management. Because of the societal importance of the subject industries and the amount of risk analysis activity involved in each, it is interesting to look for commonalities, differences, and, hopefully, a basis for some standardization. Each industry has its strengths: the solid experience base of the chemical industry, the extensive qualification and testing procedures of the space industry, and the integrative and quantitative risk and reliability methodologies developed for the nuclear power industry. In particular, most advances in data handling, systems interaction modeling, and uncertainty analysis have come from the probabilistic risk assessment work in the nuclear safety field. In the final analysis, all three industries would greatly benefit from a more deliberate technology exchange program in the rapidly evolving discipline of quantitative risk analysis. (author)

  4. Development of Operational Free-Space-Optical (FSO) Laser Communication Systems Final Report CRADA No. TC02093.0

    Ruggiero, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Orgren, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    This project was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and LGS Innovations, LLC (formerly Lucent Technologies, Inc.), to develop long-range and mobile operational free-space optical (FSO) laser communication systems for specialized government applications. LLNL and LGS Innovations (formerly Lucent Bell Laboratories Government Communications Systems) performed this work for a United States Government (USG) Intelligence Work for Others (I-WFO) customer, also referred to as the "Government Customer", "Customer", or "Government Sponsor." The CRADA was a critical and required part of the LLNL technology transfer plan for the customer.

  5. Real-space grids and the Octopus code as tools for the development of new simulation approaches for electronic systems

    Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel

    Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage at the time of implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
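    A minimal example of the real-space-grid idea: discretize the 1D Schrödinger equation for a particle in a box with a three-point finite-difference Laplacian and compare against the analytic spectrum. This is a sketch of the discretization in atomic units with an arbitrary grid size, not Octopus code.

```python
import numpy as np

# 1D particle in a box of length 1 on a uniform real-space grid.
n, box = 500, 1.0
dx = box / (n + 1)                        # grid spacing, zero BCs at walls

# Kinetic operator -(1/2) d^2/dx^2 as a tridiagonal finite-difference matrix.
main = np.full(n, 1.0 / dx**2)
off = np.full(n - 1, -0.5 / dx**2)
t = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

e = np.linalg.eigvalsh(t)
e_exact = (np.pi**2 / (2.0 * box**2)) * np.arange(1, 4) ** 2   # n^2 pi^2 / 2L^2
assert np.allclose(e[:3], e_exact, rtol=1e-4)
```

    Real-space codes generalize exactly this construction to 3D, add the potential on the same grid, and exploit the sparsity of the Laplacian stencil for parallelization.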

  6. Real-space local polynomial basis for solid-state electronic-structure calculations: A finite-element approach

    Pask, J.E.; Klein, B.M.; Fong, C.Y.; Sterne, P.A.


    We present an approach to solid-state electronic-structure calculations based on the finite-element method. In this method, the basis functions are strictly local, piecewise polynomials. Because the basis is composed of polynomials, the method is completely general and its convergence can be controlled systematically. Because the basis functions are strictly local in real space, the method allows for variable resolution in real space; produces sparse, structured matrices, enabling the effective use of iterative solution methods; and is well suited to parallel implementation. The method thus combines the significant advantages of both real-space-grid and basis-oriented approaches and so promises to be particularly well suited for large, accurate ab initio calculations. We develop the theory of our approach in detail, discuss advantages and disadvantages, and report initial results, including electronic band structures and details of the convergence of the method. copyright 1999 The American Physical Society
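    A one-dimensional sketch of the strictly local basis: linear "hat" elements for −u'' = λu on [0,1] assemble into tridiagonal (sparse) matrices, and the lowest eigenvalue converges to π². A lumped (diagonal) mass matrix is used here for simplicity; this illustrates the finite-element idea, not the paper's 3D implementation.

```python
import numpy as np

# Linear finite elements on a uniform mesh of [0,1] with u(0) = u(1) = 0.
n_el = 200                       # elements; n_el - 1 interior nodes
h = 1.0 / n_el

# Element stiffness [[1,-1],[-1,1]]/h assembled into the global matrix:
# strictly local hat functions overlap only with their neighbours.
k = (np.diag(np.full(n_el - 1, 2.0)) +
     np.diag(np.full(n_el - 2, -1.0), 1) +
     np.diag(np.full(n_el - 2, -1.0), -1)) / h

m_lumped = h * np.eye(n_el - 1)  # lumped (diagonal) mass matrix

# Generalized problem K u = lambda M u; with a diagonal M it reduces to K/h.
lam = np.sort(np.linalg.eigvalsh(np.linalg.inv(m_lumped) @ k))
assert abs(lam[0] - np.pi**2) < 0.01     # exact lowest eigenvalue is pi^2
```

    Because each basis function touches only its neighbouring elements, the matrices stay sparse and structured, which is precisely the property the abstract exploits for iterative solvers and parallel implementation.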

  7. Feasibility of geothermal space/water heating for Mammoth Lakes Village, California. Final report, September 1976--September 1977

    Sims, A.V.; Racine, W.C.


    Results of a study to determine the technical, economic, and environmental feasibility of geothermal district heating for Mammoth Lakes Village, California are reported. The geothermal district heating system selected is technically feasible and will use existing technology in its design and operation. District heating can provide space and water heating energy for typical customers at lower cost than alternative sources of energy. If the district heating system is investor owned, lower costs are realized after five to six years of operation, and if owned by a nonprofit organization, after zero to three years. District heating offers lower costs than alternatives much sooner in time if co-generation and/or DOE participation in system construction are included in the analysis. During a preliminary environmental assessment, no potential adverse environmental impacts could be identified of sufficient consequence to preclude the construction and operation of the proposed district heating system. A follow-on program aimed at implementing district heating in Mammoth is outlined.

  8. Application of space and aviation technology to improve the safety and reliability of nuclear power plant operations. Final report


    This report investigates various technologies that have been developed and utilized by the aerospace community, particularly the National Aeronautics and Space Administration (NASA) and the aviation industry, that would appear to have some potential for contributing to improved operational safety and reliability at commercial nuclear power plants of the type being built and operated in the United States today. The main initiator for this study, as well as many others, was the accident at the Three Mile Island (TMI) nuclear power plant in March 1979. Transfer and application of technology developed by NASA, as well as other public and private institutions, may well help to decrease the likelihood of similar incidents in the future

  9. Electrical performance characteristics of high power converters for space power applications. Final report, 1 January 1988-30 September 1989

    Stuart, T.A.; King, R.J.


    The first goal of this project was to investigate various converters that would be suitable for processing electric power derived from a nuclear reactor. The implementation of a 20 kHz system is indicated, comprising a source converter, a ballast converter, and a fixed frequency converter for generating the 20 kHz output. This system can be converted to dc simply by removing the fixed frequency converter. The present study emphasized the design and testing of the source and ballast converters. A push-pull current-fed (PPCF) design was selected for the source converter, and a 2.7 kW version was implemented using three 900 watt modules in parallel. The characteristic equation for two converters in parallel was derived, but this analysis did not yield any experimental methods for measuring relative stability. The three source modules were first tested individually and then in parallel as a 2.7 kW system. All tests proved satisfactory: the system was stable, efficiency and regulation were acceptable, and the system was fault tolerant. The design of a ballast-load converter, operated as a shunt regulator, was investigated. The proposed power circuit is suitable for use with BJTs because proportional base drive is easily implemented. A control circuit which minimizes switching frequency ripple and automatically bypasses a faulty shunt section was developed. A nonlinear state-space-averaged model of the shunt regulator was developed and shown to produce an accurate incremental (small-signal) dynamic model, even though the usual state-space-averaging assumptions were not met. The nonlinear model was also shown to be useful for large-signal dynamic simulation using PSpice

  10. Optimal parameters for final position of teeth in space closure in case of a missing upper lateral incisor.

    Lombardo, Luca; D'Ercole, Antonio; Latini, Michele Carmelo; Siciliani, Giuseppe


    The aim of this study was to provide clinical indications for the correct management of appliances in space closure treatment of patients with agenesis of the upper lateral incisors. Virtual setup for space closure was performed in 30 patients with upper lateral incisor agenesis. Tip, torque and in-out values were measured and compared with those of previous authors. In the upper dentition, the tip values were comparable to those described by Andrews (Am J Orthod 62(3):296-309, 1972), except for at the first premolars, which require a greater tip, and the first molars, a lesser tip. The torque values showed no differences except for at the canines, where it was greater, and the in-out values were between those reported by Andrews and those by Watanabe et al. (The Shikwa Gakuho 96:209-222, 1996) (except for U3 and U4). The following prescriptions are advisable: tip 5°, torque 8° and in-out 2.5 for U1; tip 9°, torque 3° and in-out 3.25 for U3; tip 10°, torque -8° and in-out 3.75 for U4; and tip 5°, torque -8° and in-out 4 for U5. Andrews' prescription is suitable for the lower jaw, except for at L6. It is also advisable to execute selective grinding (1.33±0.5 mm) and extrusion (0.68±0.23 mm) on the upper canine during treatment, and the first premolar requires some intrusion (0.56±0.30 mm).

  11. A new energy-efficient control approach for space telescope drive system

    Zhou, Wangping; Wang, Yong

    Drive control makes the telescope accurately track celestial bodies in spite of external and internal disturbances, and is a key technique for telescope performance. In this paper, we propose a nonlinear adaptive observer based on a power-reversible approach for high-precision position tracking, e.g., in space telescopes. The nonlinear adaptive observer automatically estimates the disturbances in the drive system, and the observed value is applied to compensate for the real disturbances. With greatly reduced disturbances, the control precision can be evidently improved. In conventional drive control, a brake device is often used to slow down the reaction wheel and may waste enormous energy. To avoid these disadvantages, an H-bridge is put forward for wheel speed regulation. The H-bridge has four independent sections, each consisting mainly of a power electronic switch and an anti-parallel diode. One pair of diagonal sections is switched on to speed up the reaction wheel, and the other pair acts in reverse. While the wheel is slowing down, the armature current of the drive motor flows through the two path-wise diodes back into the battery. Thus, energy waste is avoided. Based on the disturbance compensation, an optimal controller is designed to minimize an evaluation function composed of a weighted sum of position errors and energy consumption. The outputs of the controller are amplified to control the H-bridge. Simulations are performed in MATLAB. The results show that high-precision control can be obtained by the proposed approach, and that energy consumption is remarkably reduced.

  12. Different Approaches for Ensuring Performance/Reliability of Plastic Encapsulated Microcircuits (PEMs) in Space Applications

    Gerke, R. David; Sandor, Mike; Agarwal, Shri; Moor, Andrew F.; Cooper, Kim A.


    Engineers within the commercial and aerospace industries are using trade-off and risk analysis to aid in reducing spacecraft system cost while increasing performance and maintaining high reliability. In many cases, Commercial Off-The-Shelf (COTS) components, which include Plastic Encapsulated Microcircuits (PEMs), are candidate packaging technologies for spacecrafts due to their lower cost, lower weight and enhanced functionality. Establishing and implementing a parts program that effectively and reliably makes use of these potentially less reliable, but state-of-the-art devices, has become a significant portion of the job for the parts engineer. Assembling a reliable high performance electronic system, which includes COTS components, requires that the end user assume a risk. To minimize the risk involved, companies have developed methodologies by which they use accelerated stress testing to assess the product and reduce the risk involved to the total system. Currently, there are no industry standard procedures for accomplishing this risk mitigation. This paper will present the approaches for reducing the risk of using PEMs devices in space flight systems as developed by two independent Laboratories. The JPL procedure involves primarily a tailored screening with accelerated stress philosophy while the APL procedure is primarily, a lot qualification procedure. Both Laboratories successfully have reduced the risk of using the particular devices for their respective systems and mission requirements.

  13. Distribution function approach to redshift space distortions. Part V: perturbation theory applied to dark matter halos

    Vlah, Zvonimir; Seljak, Uroš [Institute for Theoretical Physics, University of Zürich, Zürich (Switzerland)]; Okumura, Teppei [Institute for the Early Universe, Ewha Womans University, Seoul (Republic of Korea)]; Desjacques, Vincent [Département de Physique Théorique and Center for Astroparticle Physics (CAP), Université de Genève, Genève (Switzerland)]


    Numerical simulations show that redshift space distortions (RSD) introduce strong scale dependence in the power spectra of halos, with ten percent deviations relative to linear theory predictions even on relatively large scales (k < 0.1h/Mpc) and even in the absence of satellites (which induce Fingers-of-God, FoG, effects). If unmodeled, these effects prevent one from extracting cosmological information from RSD surveys. In this paper we use Eulerian perturbation theory (PT) and an Eulerian halo biasing model and apply them to the distribution function approach to RSD, in which RSD is decomposed into several correlators of density-weighted velocity moments. We model each of these correlators using PT and compare the results to simulations over a wide range of halo masses and redshifts. We find that with the introduction of a physically motivated halo biasing, and using dark matter power spectra from simulations, we can reproduce the simulation results at the percent level on scales up to k ∼ 0.15h/Mpc at z = 0, without the need for free FoG parameters in the model.

  14. Testing for Level Shifts in Fractionally Integrated Processes: a State Space Approach

    Monache, Davide Delle; Grassi, Stefano; Santucci de Magistris, Paolo

    Short memory models contaminated by level shifts have similar long-memory features as fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows us to disentangle the level shifts from the fractionally integrated component. The estimation is carried out on the basis of a state-space methodology and leads to a robust estimate of the fractional integration parameter also in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. The Monte Carlo simulations show how this approach produces unbiased estimates of the memory parameter...
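The core phenomenon behind this abstract, a level shift masquerading as long memory, can be reproduced in a few lines (our own illustration, not the authors' procedure): white noise with a single mean shift exhibits a sample autocorrelation that decays very slowly, just as a fractionally integrated process would.

```python
import random

# Illustration (not the authors' code): short-memory noise plus one level
# shift produces a slowly decaying sample autocorrelation, mimicking the
# long-memory signature of a fractionally integrated process.
random.seed(1)
n = 2000
# Gaussian white noise with a mean shift of size 2 at mid-sample.
y = [random.gauss(0.0, 1.0) + (2.0 if t >= n // 2 else 0.0) for t in range(n)]

mean = sum(y) / n
var = sum((v - mean) ** 2 for v in y) / n

def acf(lag):
    """Sample autocorrelation at the given lag."""
    return sum((y[t] - mean) * (y[t + lag] - mean)
               for t in range(n - lag)) / (n * var)

# For pure white noise these would all be near zero; here they stay large
# even at long lags, which is what fools standard memory estimators.
print([round(acf(k), 2) for k in (1, 50, 200)])
```

This is exactly the ambiguity the encompassing state-space specification is designed to resolve.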

  15. A rationalized approach to the imaging of space-occupying lesions in the liver

    Engelbrecht, H.E.


    A rational approach to the imaging of mass lesions within the liver has been presented. An attempt has been made to advocate a philosophy which emphasizes the importance of considering pathological, biochemical, clinical and likely management criteria in each case before selecting a first-line imaging procedure. The subject is presented under three headings: i) What? That is, the clinical and pathological criteria for assessing the nature of a suspected space-occupying lesion in the liver; ii) Why? That is, a projection of the likely practical value of the result; iii) How? That is, the determination of a logical imaging program depending on the assessment of the criteria under the first two headings. The following examples of active treatment are discussed: partial hepatectomy, highly vascular lesions, toxaemia and pyrexia. The following factors influence the choice of imaging procedure: the accuracy of the modality in relation to the suspected lesion, local availability of equipment and expertise, invasive versus non-invasive aspects, and cost-effectiveness.

  16. International Space Station Centrifuge Rotor Models A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach

    Nguyen, Louis H.; Ramakrishnan, Jayant; Granda, Jose J.


    The assembly and operation of the International Space Station (ISS) require extensive testing and engineering analysis to verify that the Space Station system of systems would work together without any adverse interactions. Since the dynamic behavior of an entire Space Station cannot be tested on earth, math models of the Space Station structures and mechanical systems have to be built and integrated in computer simulations and analysis tools to analyze and predict what will happen in space. The ISS Centrifuge Rotor (CR) is one of many mechanical systems that need to be modeled and analyzed to verify the ISS integrated system performance on-orbit. This study investigates using Bond Graph modeling techniques as quick and simplified ways to generate models of the ISS Centrifuge Rotor. This paper outlines the steps used to generate simple and more complex models of the CR using Bond Graph Computer Aided Modeling Program with Graphical Input (CAMP-G). Comparisons of the Bond Graph CR models with those derived from Euler-Lagrange equations in MATLAB and those developed using multibody dynamic simulation at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) are presented to demonstrate the usefulness of the Bond Graph modeling approach for aeronautics and space applications.

  17. Anthropogenic resource subsidies determine space use by Australian arid zone dingoes: an improved resource selection modelling approach.

    Thomas M Newsome

    Dingoes (Canis lupus dingo) were introduced to Australia and became feral at least 4,000 years ago. We hypothesized that dingoes, being of domestic origin, would be adaptable to anthropogenic resource subsidies and that their space use would be affected by the dispersion of those resources. We tested this by analyzing Resource Selection Functions (RSFs) developed from GPS fixes (locations) of dingoes in arid central Australia. Using Generalized Linear Mixed-effect Models (GLMMs), we investigated resource relationships for dingoes that had access to abundant food near mine facilities, and for those that did not. From these models, we predicted the probability of dingo occurrence in relation to anthropogenic resource subsidies and other habitat characteristics over ∼18,000 km². Very small standard errors, and the pervasively low P-values that follow from them, will become more important as the size of data sets, such as our GPS tracking logs, increases. Therefore, we also investigated methods to minimize the effects of serial and spatio-temporal correlation among samples and unbalanced study designs. Using GLMMs, we accounted for some of the correlation structure of GPS animal tracking data; however, parameter standard errors remained very small and all predictors were highly significant. Consequently, we developed an alternative approach that allowed us to review effect sizes at different spatial scales and determine which predictors were sufficiently ecologically meaningful to include in final RSF models. We determined that the most important predictor for dingo occurrence around mine sites was distance to the refuse facility. Away from mine sites, close proximity to human-provided watering points was predictive of dingo dispersion, as were other landscape factors including palaeochannels, rocky rises and elevated drainage depressions. Our models demonstrate that anthropogenically supplemented food and water can alter dingo-resource relationships.

  18. An Approach to Using Toxicogenomic Data in U.S. EPA Human Health Risk Assessments: A Dibutyl Phthalate Case Study (Final Report, 2010)

    EPA announced the availability of the final report, An Approach to Using Toxicogenomic Data in U.S. EPA Human Health Risk Assessments: A Dibutyl Phthalate Case Study. This report outlines an approach to evaluate genomic data for use in risk assessment and a case study to ...

  19. Predicting temperature and moisture distributions in conditioned spaces using the zonal approach

    Mendonca, K.C. [Parana Pontifical Catholic Univ., Curitiba (Brazil); Wurtz, E.; Inard, C. [La Rochelle Univ., La Rochelle, Cedex (France). LEPTAB


    Moisture interacts with building elements in a number of different ways that impact upon building performance, causing deterioration of building materials, as well as contributing to poor indoor air quality. In humid climates, moisture represents one of the major loads in conditioned spaces. It is therefore important to understand and model moisture transport accurately. This paper discussed an intermediate zonal approach to building a library of data in order to predict whole hygrothermal behavior in conditioned rooms. The zonal library included 2 models in order to consider building envelope moisture buffering effects as well as taking into account the dynamic aspect of jet airflow in the zonal method. The zonal library was then applied to a case study to show the impact of external humidity on the whole hygrothermal performance of a room equipped with a vertical fan-coil unit. The proposed theory was structured into 3 groups representing 3 building domains: indoor air; envelope; and heating, ventilation and air conditioning (HVAC) systems. The indoor air sub-model related to indoor air space, where airflow speed was considered to be low. The envelope sub-model related to the radiation exchanges between the envelope and its environment as well as to the heat and mass transfers through the envelope material. The HVAC system sub-model referred to the whole system including equipment, control and specific airflow from the equipment. All the models were coupled into SPARK, where the resulting set of non-linear equations were solved simultaneously. A case study of a large office conditioned by a vertical fan-coil unit with a rectangular air supply diffuser was presented. Details of the building's external and internal environment were provided, as well as convective heat and mass transfer coefficients and temperature distributions versus time. Results of the study indicated that understanding building material moisture buffering effects is as important as

  20. Optimization of the graph model of the water conduit network, based on the approach of search space reducing

    Korovin, Iakov S.; Tkachenko, Maxim G.


    In this paper we present a heuristic approach that improves the efficiency of methods used to design efficient architectures for water distribution networks. The essence of the approach is a search space reduction procedure that limits the range of available pipe diameters that can be used for each edge of the network graph. To perform the reduction, two opposite boundary scenarios for the distribution of flows are analysed, after which the resulting range is further narrowed by applying a flow rate limitation to each edge of the network. The first boundary scenario provides the most uniform distribution of flow in the network; the opposite scenario creates the network with the highest possible flow level. The parameters of both distributions are calculated by optimizing systems of quadratic functions in a confined space, which can be performed effectively at small time cost. This approach was used to modify a genetic algorithm (GA). The proposed GA provides a variable number of variants for each gene, according to the number of diameters in the list, taking flow restrictions into account. The proposed approach was applied to a well-known test network, the Hanoi water distribution network [1], and the results were compared with those of a classical GA with an unlimited search space. On the test data, the proposed approach significantly reduced the search space and provided faster and more evident convergence in comparison with the classical version of the GA.
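The reduced-search-space encoding can be sketched as follows (an illustration under assumed names and values, not the authors' implementation): each edge of the network graph carries its own admissible diameter list, and the GA's initialization and mutation operators draw genes only from that list, so infeasible diameters never enter the population.

```python
import random

# Sketch of the reduced-search-space GA encoding (names and values are
# illustrative assumptions, not the authors' code). allowed[i] is the
# narrowed diameter list (mm) for edge i, i.e. what remains after the two
# boundary flow scenarios and the per-edge flow-rate limit are applied.
allowed = [[300, 400], [400, 500, 600], [200, 300], [150, 200, 250]]

def make_individual(allowed):
    """One diameter gene per edge, drawn only from that edge's reduced list."""
    return [random.choice(options) for options in allowed]

def mutate(individual, allowed, rate=0.1):
    """Mutation also stays inside each edge's reduced range."""
    return [random.choice(options) if random.random() < rate else gene
            for gene, options in zip(individual, allowed)]

random.seed(0)
population = [make_individual(allowed) for _ in range(20)]
population = [mutate(ind, allowed) for ind in population]

# Every gene respects its edge's restricted diameter range.
print(all(g in opts for ind in population for g, opts in zip(ind, allowed)))
```

Because each gene's alphabet is per-edge rather than global, the product of the list lengths, and hence the search space, shrinks before the GA ever runs.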

  1. Path integral approach for quantum motion on spaces of non-constant curvature according to Koenigs - Three dimensions

    Grosche, C.


    In this contribution a path integral approach for the quantum motion on three-dimensional spaces according to Koenigs, for short "Koenigs spaces", is discussed. Their construction is simple: one takes a Hamiltonian from three-dimensional flat space and divides it by a three-dimensional superintegrable potential. Such superintegrable potentials will be the isotropic singular oscillator, the Holt potential, the Coulomb potential, or two centrifugal potentials, respectively. In all cases a non-trivial space of non-constant curvature is generated. In order to obtain a proper quantum theory a curvature term has to be incorporated into the quantum Hamiltonian. For possible bound-state solutions we find equations up to twelfth order in the energy E. (orig.)

  2. Commercial Space Transportation and Approaches to landing sites over Maritime Areas

    Morlang, Frank; Hampe, Jens; Kaltenhäuser, Sven; Schmitt, Dirk-Roger


    Commercial Space Transportation becomes an international business and requires landing opportunities all over the world. Hence the integration of space vehicles in other airspace than the US NAS is an important topic to be considered. The Single European Sky ATM Research Programme (SESAR) is preparing the implementation of a new ATM system in Europe. The requirements are defined by the concept of the shared Business Trajectory and System Wide Information Management (SWIM). Space vehicle op...

  3. The lattice spinor QED Hamiltonian critique of the continuous space approach

    Sidorov, A.V.; Zastavenko, L.G.


    We give the irreproachable, from the point of view of gauge invariance, derivation of the lattice spinor QED Hamiltonian. Our QED Hamiltonian is manifestly gauge invariant. We point out important defects of the continuous space formulation of the QED that make, in our opinion, the lattice QED obviously preferable to the continuous space QED. We state that it is impossible to give a continuous space QED formulation which is compatible with the condition of gauge invariance. 17 refs

  4. Allocating city space to multiple transportation modes: A new modeling approach consistent with the physics of transport

    Gonzales, Eric J.; Geroliminis, Nikolas; Cassidy, Michael J.; Daganzo, Carlos F.


    A macroscopic modeling approach is proposed for allocating a city’s road space among competing transport modes. In this approach, a city or neighborhood street network is viewed as a reservoir with aggregated traffic. Taking the number of vehicles (accumulation) in a reservoir as input, we show how one can reliably predict system performance in terms of person and vehicle hours spent in the system and person and vehicle kilometers traveled. The approach is used here to unveil two important ...

  5. Pre-Big Bang, space-time structure, asymptotic Universe. Spinorial space-time and a new approach to Friedmann-like equations

    Gonzalez-Mestres, Luis


    Planck and other recent data in Cosmology and Particle Physics can open the way to controversial analyses concerning the early Universe and its possible ultimate origin. Alternatives to standard cosmology include pre-Big Bang approaches, new space-time geometries and new ultimate constituents of matter. Basic issues related to a possible new cosmology along these lines clearly deserve further exploration. The Planck collaboration reports an age of the Universe t close to 13.8 Gyr and a present ratio H between relative speeds and distances at cosmic scale around 67.3 km/s/Mpc. The product of these two measured quantities is then slightly below 1 (about 0.95), while it can be exactly 1 in the absence of matter and cosmological constant in patterns based on the spinorial space-time we have considered in previous papers. In this description of space-time, which we first suggested in 1996-97, the cosmic time t is given by the modulus of a SU(2) spinor, and the Lundmark-Lemaître-Hubble (LLH) expansion law turns out to be of purely geometric origin, prior to any introduction of standard matter and relativity. Such a fundamental geometry, inspired by the role of half-integer spin in Particle Physics, may reflect an equilibrium between the dynamics of the ultimate constituents of matter and the deep structure of space and time. Taking into account the observed cosmic acceleration, the present situation suggests that the value of 1 can be a natural asymptotic limit for the product H t in the long-term evolution of our Universe, up to possible small corrections. In the presence of a spinorial space-time geometry, no ad hoc combination of dark matter and dark energy would in any case be needed to get an acceptable value of H and an evolution of the Universe compatible with observation. The use of a spinorial space-time naturally leads to unconventional properties for the space curvature term in Friedmann-like equations. It therefore suggests a major modification of the standard
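The quoted product H t ≈ 0.95 follows directly from the cited Planck values once units are converted. A quick check (the conversion constants are standard; the computation is ours, not the author's):

```python
# Quick check of the dimensionless product H*t quoted in the abstract,
# using the Planck values it cites (the computation is ours).
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
S_PER_GYR = 3.1557e16    # seconds in one gigayear (Julian years)

H = 67.3 / KM_PER_MPC    # 67.3 km/s/Mpc expressed in 1/s
t = 13.8 * S_PER_GYR     # 13.8 Gyr expressed in seconds

print(round(H * t, 2))   # slightly below 1, as the abstract states
```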

  6. Trial and Error: A new Approach to Space-Bounded Learning

    Ameur, F.; Fischer, Paul; Hoeffgen, H.-U.


    A pac-learning algorithm is d-space bounded, if it stores at most d examples from the sample at any time. We characterize the d-space learnable concept classes. For this purpose we introduce the compression parameter of a concept class 𝒞 and design our trial and error learning algorithm. We ...

  7. Nonsmooth differential geometry-an approach tailored for spaces with Ricci curvature bounded from below

    Gigli, Nicola


    The author discusses in which sense general metric measure spaces possess a first-order differential structure. Building on this, for spaces with Ricci curvature bounded from below, a second-order calculus can be developed, permitting the author to define the Hessian, covariant/exterior derivatives and Ricci curvature.

  8. A compressive sensing approach to the calculation of the inverse data space

    Khan, Babar Hasan; Saragiotis, Christos; Alkhalifah, Tariq Ali


    Seismic processing in the Inverse Data Space (IDS) has its advantages like the task of removing the multiples simply becomes muting the zero offset and zero time data in the inverse domain. Calculation of the Inverse Data Space by sparse inversion

  9. The Cube and the Poppy Flower: Participatory Approaches for Designing Technology-Enhanced Learning Spaces

    Casanova, Diogo; Mitchell, Paul


    This paper presents an alternative method for learning space design that is driven by user input. An exploratory study was undertaken at an English university with the aim of redesigning technology-enhanced learning spaces. Two provocative concepts were presented through participatory design workshops during which students and teachers reflected…

  10. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    Wu, Huiquan; White, Maury; Khan, Mansoor A


    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both transition and steady state. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. The work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.

  11. Building HIA approaches into strategies for green space use: an example from Plymouth's (UK) Stepping Stones to Nature project.

    Richardson, J; Goss, Z; Pratt, A; Sharman, J; Tighe, M


    The health and well-being benefits of access to green space are well documented. Research suggests positive findings regardless of social group, however barriers exist that limit access to green space, including proximity, geography and differing social conditions. Current public health policy aims to broaden the range of environmental public health interventions through effective partnership working, providing opportunities to work across agencies to promote the use of green space. Health Impact Assessment (HIA) is a combination of methods and procedures to assess the potential health and well-being impacts of policies, developments and projects. It provides a means by which negative impacts can be mitigated and positive impacts can be enhanced, and has potential application for assessing green space use. This paper describes the application of a HIA approach to a multi-agency project (Stepping Stones to Nature--SS2N) in the UK designed to improve local green spaces and facilitate green space use in areas classified as having high levels of deprivation. The findings suggest that the SS2N project had the potential to provide significant positive benefits in the areas of physical activity, mental and social well-being. Specific findings for one locality identified a range of actions that could be taken to enhance benefits, and mitigate negative factors such as anti-social behaviour. The HIA approach proved to be a valuable process through which impacts of a community development/public health project could be enhanced and negative impacts prevented at an early stage; it illustrates how a HIA approach could enhance multi-agency working to promote health and well-being in communities.

  12. On the computation of the demagnetization tensor field for an arbitrary particle shape using a Fourier space approach

    Beleggia, M.; Graef, M. de


    A method is presented to compute the demagnetization tensor field for uniformly magnetized particles of arbitrary shape. By means of a Fourier space approach it is possible to compute analytically the Fourier representation of the demagnetization tensor field for a given shape. Then, specifying the direction of the uniform magnetization, the demagnetizing field and the magnetostatic energy associated with the particle can be evaluated. In some particular cases, the real space representation is computable analytically. In general, a numerical inverse fast Fourier transform is required to perform the inversion. As an example, the demagnetization tensor field for the tetrahedron will be given

  13. Space base laser torque applied on LEO satellites of various geometries at satellite’s closest approach

    N.S. Khalifa


    In light of using laser power in space applications, the motivation of this paper is to use a space-based solar-pumped laser to produce a torque on LEO satellites of various shapes. It is assumed that there is a space station that fires a laser beam toward the satellite, so beam spreading due to diffraction is considered to be the dominant effect on the laser beam propagation. The laser torque is calculated at the point of closest approach between the space station and some sun-synchronous low Earth orbit cubesats. The numerical application shows that space-based laser torque has a significant contribution on the LEO cubesats. It has a maximum value on the order of 10⁻⁸ N·m, which is comparable with that of the residual magnetic moment. However, it has a minimum value on the order of 10⁻¹¹ N·m, which is comparable with the aerodynamic and gravity gradient torques. Consequently, space-based laser torque can be used as an active attitude control system.

  14. Lie-deformed quantum Minkowski spaces from twists: Hopf-algebraic versus Hopf-algebroid approach

    Lukierski, Jerzy; Meljanac, Daniel; Meljanac, Stjepan; Pikutić, Danijel; Woronowicz, Mariusz


    We consider new Abelian twists of the Poincare algebra describing a nonsymmetric generalization of the ones given in [1], which lead to a class of Lie-deformed quantum Minkowski spaces. We apply the corresponding twist quantization in two ways: as generating a quantum Poincare-Hopf algebra providing quantum Poincare symmetries, and by considering the quantization which provides a Hopf algebroid describing a class of quantum relativistic phase spaces with built-in quantum Poincare covariance. If we assume that the Lorentz generators are orbital, i.e., do not describe spin degrees of freedom, one can embed the considered generalized phase spaces into the ones describing the quantum-deformed Heisenberg algebras.

  15. A compressive sensing approach to the calculation of the inverse data space

    Khan, Babar Hasan


    Seismic processing in the inverse data space (IDS) has its advantages; for example, the task of removing multiples simply becomes muting the zero-offset and zero-time data in the inverse domain. Calculation of the inverse data space by sparse inversion techniques has been shown to mitigate some artifacts. We reformulate the problem by taking advantage of developments from the field of compressive sensing. The seismic data are compressed at the sensor level by recording projections of the traces. We then process this compressed data directly to estimate the inverse data space. Because the data set is smaller, we also gain in terms of computational complexity.
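    A toy sketch of the compressive-sensing idea in the abstract: record only a few projections of a sparse signal, then recover it from the compressed measurements. The dictionary, the sizes, and the single matching-pursuit step below are illustrative choices, not the authors' algorithm:

    ```python
    from itertools import product

    # Sensing matrix: 8 columns of +/-1 entries of length 4, all starting with +1,
    # so no column duplicates or negates another (illustrative choice).
    cols = [(1,) + s for s in product((1, -1), repeat=3)]
    m, n = 4, len(cols)

    # A 1-sparse "trace": only coefficient 5 is nonzero
    true_index, true_amp = 5, 2.5
    x = [0.0] * n
    x[true_index] = true_amp

    # Compressed measurements y = Phi x (only m = 4 numbers recorded, not n = 8)
    y = [sum(cols[j][i] * x[j] for j in range(n)) for i in range(m)]

    # One matching-pursuit step: correlate y with each column, pick the best match
    corr = [sum(cols[j][i] * y[i] for i in range(m)) for j in range(n)]
    best = max(range(n), key=lambda j: abs(corr[j]))
    amp_est = corr[best] / m   # every column has squared norm m
    ```

    Because distinct columns correlate weakly (inner products of at most 2 in magnitude versus 4 for a column with itself), the single greedy step already identifies the sparse coefficient exactly in this toy setting.
    
    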

  16. New Li-Yau-Hamilton Inequalities for the Ricci Flow via the Space-Time Approach

    Chow, Bennett; Knopf, Dan


    We generalize Hamilton's matrix Li-Yau-type Harnack estimate for the Ricci flow by considering the space of all LYH (Li-Yau-Hamilton) quadratics that arise as curvature tensors of space-time connections satisfying the Ricci flow with respect to the natural space-time degenerate metric. As a special case, we employ scaling arguments to derive a linear-type matrix LYH estimate. The new LYH quadratics obtained in this way are associated to the system of the Ricci flow coupled to a 1-form and a 2...

  17. Green's functions in Bianchi type-I spaces. Relation between Minkowski and Euclidean approaches

    Bukhbinder, I.L.; Kirillova, E.N.


    A theory is considered for a free scalar field with conformal coupling in a curved space-time with a Bianchi type-I metric. A representation is obtained for the Green's function G̃_in in the form of an integral of a Schwinger-DeWitt kernel along a contour in the plane of complex-valued proper time. It is shown how a transition may be accomplished from Green's functions in a space with Euclidean signature to Green's functions in a space with Minkowski signature, and vice versa.

  18. Distribution function approach to redshift space distortions. Part II: N-body simulations

    Okumura, Teppei; Seljak, Uroš; McDonald, Patrick; Desjacques, Vincent


    Measurement of redshift-space distortions (RSD) offers an attractive method to directly probe the cosmic growth history of density perturbations. A distribution function approach, in which RSD can be written as a sum over density-weighted velocity moment correlators, has recently been developed. In this paper we use results of N-body simulations to investigate the individual contributions and convergence of this expansion for dark matter. If the series is expanded as a function of powers of μ, the cosine of the angle between the Fourier mode and the line of sight, then there are a finite number of terms contributing at each order. We present these terms and investigate their contribution to the total as a function of wavevector k. For μ² the correlation between density and momentum dominates on large scales. Higher order corrections, which act as a Finger-of-God (FoG) term, contribute 1% at k ∼ 0.015 h Mpc⁻¹ and 10% at k ∼ 0.05 h Mpc⁻¹ at z = 0, while for k > 0.15 h Mpc⁻¹ they dominate and make the total negative. These higher order terms are dominated by density-energy density correlations, which contribute negatively to the power, while the contribution from the vorticity part of the momentum density auto-correlation adds to the total power but is an order of magnitude lower. For the μ⁴ term the dominant term on large scales is the scalar part of the momentum density auto-correlation, while higher order terms dominate for k > 0.15 h Mpc⁻¹. For μ⁶ and μ⁸ we find very little power on large scales, shooting up by 2-3 orders of magnitude at smaller scales. We also compare the expansion to the full 2-d P_ss(k,μ), as well as to the monopole, quadrupole, and hexadecapole integrals of P_ss(k,μ). For these statistics an infinite number of terms contribute, and we find that the expansion achieves percent-level accuracy for small kμ at 6th order, but breaks down on smaller scales because the series is no longer perturbative. We explore resummation of the terms into FoG

  19. Contaminant ingress into multizone buildings: An analytical state-space approach

    Parker, Simon; Coffey, Chris; Gravesen, Jens; Kirkpatrick, James; Ratcliffe, Keith; Lingard, Bryan; Nally, James


    The ingress of exterior contaminants into buildings is often assessed by treating the building interior as a single well-mixed space. Multizone modelling provides an alternative way of representing buildings that can estimate concentration time
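    A minimal sketch of the multizone idea, with each zone treated as well-mixed and governed by a mass balance over interzonal and outdoor airflows. The zone volumes and flow rates below are invented for illustration and are not from the paper:

    ```python
    # Two-zone well-mixed mass balance, explicit Euler integration (sketch).
    # Zone 1 exchanges air with outdoors; zone 2 exchanges only with zone 1.
    V1, V2 = 100.0, 150.0      # zone volumes [m^3] (hypothetical)
    Q01, Q12 = 50.0, 30.0      # airflow rates [m^3/h] (hypothetical)
    C_out = 1.0                # exterior contaminant concentration (step input)

    C1 = C2 = 0.0              # interior concentrations, initially clean
    dt, t_end = 0.01, 10.0     # time step and horizon [h]
    for _ in range(int(t_end / dt)):
        dC1 = (Q01 * (C_out - C1) - Q12 * (C1 - C2)) / V1
        dC2 = (Q12 * (C1 - C2)) / V2
        C1 += dC1 * dt
        C2 += dC2 * dt
    # The inner zone lags the perimeter zone, which lags outdoors: C2 < C1 < C_out
    ```

    A single well-mixed model would predict one concentration for the whole building; the multizone balance captures the protective lag of interior zones during transient exterior releases.
    
    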

  20. Tensegrity Approaches to In-Space Construction of a 1g Growable Habitat

    National Aeronautics and Space Administration — This proposal seeks to design a rotating habitat with a robotic system that constructs the structure and provides a habitat growth capability. The tensegrity...

  1. A Novel Approach of Sensitive Infrared Signal Detection for Space Applications

    National Aeronautics and Space Administration — Develop an innovative frequency up-conversion device that will efficiently convert the infrared signals into visible/near-infrared signals to enable detection of...

  2. Molecular basis sets - a general similarity-based approach for representing chemical spaces.

    Raghavendra, Akshay S; Maggiora, Gerald M


    A new method, based on generalized Fourier analysis, is described that utilizes the concept of "molecular basis sets" to represent chemical space within an abstract vector space. The basis vectors in this space are abstract molecular vectors. Inner products among the basis vectors are determined using an ansatz that associates molecular similarities between pairs of molecules with their corresponding inner products. Moreover, the fact that similarities between pairs of molecules are, in essentially all cases, nonzero implies that the abstract molecular basis vectors are nonorthogonal; but since the similarity of a molecule with itself is unity, the molecular vectors are normalized to unity. A symmetric orthogonalization procedure, which optimally preserves the character of the original set of molecular basis vectors, is used to construct appropriate orthonormal basis sets. Molecules can then be represented, in general, by sets of orthonormal "molecule-like" basis vectors within a proper Euclidean vector space. However, the dimension of the space can become quite large. Thus, the work presented here assesses the effect of basis set size on a number of properties, including the average squared error and average norm of molecular vectors represented in the space; the results clearly show the expected reduction in average squared error and increase in average norm as the basis set size is increased. Several distance-based statistics are also considered. These include the distribution of distances and their differences with respect to basis sets of differing size, and several comparative distance measures such as Spearman rank correlation and Kruskal stress. All of the measures show that, even though the dimension can be high, the chemical spaces they represent nonetheless behave in a well-controlled and reasonable manner. Other abstract vector spaces analogous to that described here can also be constructed, provided that the appropriate inner products can be directly
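    The symmetric (Löwdin) orthogonalization step mentioned above can be sketched for the smallest nontrivial case: two molecular basis vectors whose inner product is their similarity s. The closed-form S^(-1/2) below follows from the eigendecomposition of the 2x2 similarity matrix; the size and the similarity value are illustrative:

    ```python
    import math

    s = 0.6  # similarity between the two molecules (illustrative value)

    # Similarity (Gram) matrix S = [[1, s], [s, 1]]; eigenvalues 1+s and 1-s,
    # eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2), so S^(-1/2) has a closed form:
    p, q = 1.0 / math.sqrt(1.0 + s), 1.0 / math.sqrt(1.0 - s)
    a, b = (p + q) / 2.0, (p - q) / 2.0
    X = [[a, b], [b, a]]           # X = S^(-1/2), the symmetric orthogonalizer

    # Gram matrix of the orthogonalized basis: X @ S @ X should be the identity
    S = [[1.0, s], [s, 1.0]]
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]
    G = matmul(matmul(X, S), X)
    ```

    Löwdin's transformation is the unique orthogonalizer that minimizes the total displacement of the basis vectors, which is why it "optimally preserves the character" of the original molecular vectors.
    
    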

  3. Worms to astronauts: Canadian Space Agency approach to life sciences in support of exploration

    Buckley, Nicole; Johnson-Green, Perry; Lefebvre, Luc

    As the pace of human exploration of space is accelerated, the need to address the challenges of long-duration human missions becomes imperative. Working with limited resources, we must determine the most effective way to meet this challenge. A great deal of science management centres on "applied" versus "basic" research as the cornerstone of a program. We have chosen to largely ignore such a labeling of science and concentrate on quality, as determined by peer review, as the primary criterion for science selection. Space Life Sciences is a very young science and access to space continues to be difficult. Because we have few opportunities for conducting science, and space life science is very challenging, we are comfortable maintaining a very high bar for selection. In order to ensure adequate depth to our community we have elected to concentrate our efforts. Working in concert with members of the community, we have identified specific areas of focus that are chosen by their importance in space, but also according to Canada's strength in the terrestrial counterpart of the research. It is hoped that through a balanced but highly competitive program with the emphasis on quality, Canadian scientists can contribute to making space a safer, more welcoming place for our astronauts.

  4. Representing a Model Using Data Mining Approach for Maximizing Profit with Considering Product Assortment and Space Allocation Decisions

    Manoochehr Ansari


    The choice of which products to stock among numerous competing products, and how much space to allocate to those products, are central decisions for retailers. This study applied a data mining approach to extract the needed information from large datasets of sales transactions, in order to find the relations between products and to form product assortments. We thus present a model for product assortment and space allocation. The research population was the transactional data of a store; the sample comprised transactional data for a one-month period. Data were collected in October and November 2015 from the Shaghayegh store. 525 transactions involving 79 different products were analyzed. Based on the results, 10 product assortments were formed, although some products were allocated to more than one product category. By solving the profit equation and finding volume increase indices, we allocated space for each product assortment.

  5. Keeping it real: revisiting a real-space approach to running ensembles of cosmological N-body simulations

    Orban, Chris


    In setting up initial conditions for ensembles of cosmological N-body simulations there are, fundamentally, two choices: either maximizing the correspondence of the initial density field to the assumed Fourier-space clustering or, instead, matching to real-space statistics and allowing the DC mode (i.e. overdensity) to vary from box to box as it would in the real universe. As a stringent test of both approaches, I perform ensembles of simulations using a power law and a ''power law times a bump'' model inspired by baryon acoustic oscillations (BAO), exploiting the self-similarity of these initial conditions to quantify the accuracy of the matter-matter two-point correlation results. The real-space method, which was originally proposed by Pen 1997 [1] and implemented by Sirko 2005 [2], performed well in producing the expected self-similar behavior and corroborated the non-linear evolution of the BAO feature observed in conventional simulations, even in the strongly-clustered regime (σ₈ ≳ 1). In revisiting the real-space method championed by [2], it was also noticed that this earlier study overlooked an important integral constraint correction to the correlation function in results from the conventional approach, which can be important in ΛCDM simulations with L_box ∼ h⁻¹ Gpc and on scales r ≳ L_box/10. Rectifying this issue shows that the Fourier-space and real-space methods are about equally accurate and efficient for modeling the evolution and growth of the correlation function, contrary to previous claims. An appendix provides a useful independent-of-epoch analytic formula for estimating the importance of the integral constraint bias on correlation function measurements in ΛCDM simulations.

  6. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai


    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  7. Building a quality culture in the Office of Space Flight: Approach, lessons learned and implications for the future

    Roberts, C. Shannon


    The purpose of this paper is to describe the approach and lessons learned by the Office of Space Flight (OSF), National Aeronautics and Space Administration (NASA), in its introduction of quality. In particular, the experience of OSF Headquarters is discussed as an example of an organization within NASA that is considering both the business and human elements of the change and the opportunities the quality focus presents to improve continuously. It is hoped that the insights shared will be of use to those embarking upon similar cultural changes. The paper is presented in the following parts: the leadership challenge; background; context of the approach to quality; initial steps; current initiatives; lessons learned; and implications for the future.

  8. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    Zolotarev Pavel


    Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the studied materials. Because of the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite significant advances in computational facilities in recent decades. In this study, we propose a scheme which allows reducing the set of structures of a modeled configurational space for subsequent study by means of time-consuming quantum chemistry methods. Application of the proposed approach is exemplified through the study of the configurational space of the commercial LiNi0.8Co0.15Al0.05O2 (NCA) cathode material approximant.

  9. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    Zolotarev, Pavel; Eremin, Roman


    Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the studied materials. Because of the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite significant advances in computational facilities in recent decades. In this study, we propose a scheme which allows reducing the set of structures of a modeled configurational space for subsequent study by means of time-consuming quantum chemistry methods. Application of the proposed approach is exemplified through the study of the configurational space of the commercial LiNi0.8Co0.15Al0.05O2 (NCA) cathode material approximant.

  10. Research of features and structure of electoral space of Ukraine in 2014 with the use of synthetic approach

    M. M. Shelemba


    The article aims to justify the use of the author's synthetic model for researching the features and structure of the electoral space of Ukraine in 2014. The methodological principles of the synthetic model are expounded, drawing on qualitative and quantitative methods for studying electoral space, among them factor and correlation analysis. The synthetic model (approach), built on the basis of the best scientific approaches, takes into account the features and development trends of the electoral space of Ukraine. The features and structure of the electoral space of Ukraine in 2014 are analyzed using the proposed model. Applying the author's synthetic model allows factor and correlation analysis to be used to explain support for political parties during election campaigns, depending on the most important factors and correlates. It was found that electoral choice depended most strongly on region-specific expectations. The article shows that using the Human Development Index as the most significant social correlate during Ukrainian election campaigns at this stage is reasonable and makes it possible to obtain reliable results. It is demonstrated that a high level of correlation holds at a high level of party support and, consequently, a high significance of the social correlates across all variants of the expert research.

  11. Topology-based description of the NCA cathode configurational space and an approach of its effective reduction

    Zolotarev Pavel; Eremin Roman


    Modification of existing solid electrolyte and cathode materials is a topic of interest for theoreticians and experimentalists. In particular, it requires elucidation of the influence of dopants on the characteristics of the studied materials. Because of the high complexity of the configurational space of doped/deintercalated systems, application of computer modeling approaches is hindered, despite significant advances in computational facilities in recent decades. In this study, we propose a...

  12. Neighborhood spaces

    D. C. Kent; Won Keun Min


    Neighborhood spaces, pretopological spaces, and closure spaces are topological space generalizations which can be characterized by means of their associated interior (or closure) operators. The category NBD of neighborhood spaces and continuous maps contains PRTOP as a bicoreflective subcategory and CLS as a bireflective subcategory, whereas TOP is bireflectively embedded in PRTOP and bicoreflectively embedded in CLS. Initial and final structures are described in these categories, and it is s...

  13. Impact of time and space evolution of ion tracks in nonvolatile memory cells approaching nanoscale

    Cellere, G.; Paccagnella, A.; Murat, M.; Barak, J.; Akkerman, A.; Harboe-Sorensen, R.; Virtanen, A.; Visconti, A.; Bonanomi, M.


    Swift heavy ions impacting on matter lose energy through the creation of dense tracks of charges. The study of the space and time evolution of this energy exchange allows an understanding of single event effects in advanced microelectronic devices. In particular, the shrinking minimum feature size of the most advanced memory devices makes them very interesting test vehicles for studying these effects, since the device and track dimensions are comparable; hence, measured effects are directly correlated with the time and space evolution of the energy release. In this work we study the time and space evolution of ion tracks by using advanced non-volatile memories and Monte Carlo simulations. The experimental results are very well explained by the theoretical calculations.

  14. Traveling with blindness: A qualitative space-time approach to understanding visual impairment and urban mobility.

    Wong, Sandy


    This paper draws from Hägerstrand's space-time framework to generate new insights on the everyday mobilities of individuals with visual impairments in the San Francisco Bay Area. While existing research on visual impairment and mobility emphasizes individual physical limitations resulting from vision loss or inaccessible public spaces, this article highlights and bridges both the behavioral and social processes that influence individual mobility. A qualitative analysis of sit-down and mobile interview data reveals that the space-time constraints of people with visual impairments are closely linked to their access to transportation, assistive technologies, and mobile devices. The findings deepen our understandings of the relationship between health and mobility, and present intervention opportunities for improving the quality of life for people with visual impairment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Path space measures for Dirac and Schroedinger equations: Nonstandard analytical approach

    Nakamura, T.


    A nonstandard path space *-measure is constructed to justify the path integral formula for the Dirac equation in two-dimensional space-time. A standard measure as well as a standard path integral is obtained from it. We also show that, even for the Schroedinger equation, for which there is no standard measure appropriate for a path integral, there exists a nonstandard measure to define a *-path integral whose standard part agrees with the ordinary path integral as defined by a limit from the time-slice approximant. copyright 1997 American Institute of Physics

  16. Dynamical modeling approach to risk assessment for radiogenic leukemia among astronauts engaged in interplanetary space missions.

    Smirnova, Olga A; Cucinotta, Francis A


    A recently developed biologically motivated dynamical model for the assessment of the excess relative risk (ERR) for radiogenic leukemia among acutely/continuously irradiated humans (Smirnova, 2015, 2017) is applied to estimate the ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions. Numerous scenarios of space radiation exposure during space missions are used in the modeling studies. The dependence of the ERR for leukemia among astronauts on several mission parameters, including the dose equivalent rates of galactic cosmic rays (GCR) and large solar particle events (SPEs), the number of large SPEs, the time interval between SPEs, mission duration, the degree of astronauts' additional shielding during SPEs, the degree of their additional 12-hour daily shielding, as well as the total mission dose equivalent, is examined. The results of the estimation of ERR for radiogenic leukemia among astronauts, obtained in the framework of the developed dynamical model for various scenarios of space radiation exposure, are compared with the corresponding results computed by the commonly used linear model. It is revealed that the developed dynamical model, along with the linear model, can be applied to estimate ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions within the range of applicability of the latter. In turn, the developed dynamical model is capable of predicting the ERR for leukemia among astronauts for irradiation regimes beyond the applicability range of the linear model in emergency cases. As a supplement to the estimations of cancer incidence and death (REIC and REID) (Cucinotta et al., 2013, 2017), the developed dynamical model for the assessment of the ERR for leukemia can be employed in the pre-mission design phase for, e.g., the optimization of the regimes of astronauts' additional shielding in the course of interplanetary space missions. The developed model can

  17. Sustainability: an Approach in Planning to Raise the Quality of Life Through Open Space Development

    Sonal Y. Khobragade


    A presentation of the notion of sustainable development through the eyes of a town planner, elucidating how open space development would change the character of the city and contribute to harmony in the socio-environmental chords of sustainable development. It is an attempt to raise awareness about sustainability and environmental risk, and ultimately to reconcile the ecological, social and economic factors of society. It is an attempt to reflect on the socio-environmental dimension of open space planning by addressing urban metamorphosis.

  18. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Tvrdá Katarína


    The interaction between a foundation plate and the subgrade can be described by different mathematical-physical models. An elastic foundation can be modelled by different types of models, e.g. a one-parameter model, a two-parameter model, or a comprehensive model; here the Boussinesq model (elastic half-space) has been used. The article deals with deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-node contact elements. Finally, the obtained results are presented.
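    The Boussinesq half-space model underlying the comprehensive approach has a classical closed form for the surface settlement under a vertical point load. The sketch below uses illustrative load and soil parameters, not the article's data:

    ```python
    import math

    # Hypothetical inputs (illustrative, not from the article)
    P  = 100e3    # vertical point load on the half-space surface [N]
    E  = 30e6     # Young's modulus of the subsoil [Pa]
    nu = 0.3      # Poisson's ratio [-]

    def settlement(r):
        """Boussinesq surface settlement w(r) = P (1 - nu^2) / (pi E r)."""
        return P * (1.0 - nu**2) / (math.pi * E * r)

    w1 = settlement(1.0)   # settlement 1 m from the load
    w2 = settlement(2.0)   # halves when the distance doubles
    ```

    A plate resting on the half-space is then analyzed by superposing such influence functions over the contact pressure distribution, which is what the node-node contact discretization approximates numerically.
    
    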

  19. Accounting for sampling error when inferring population synchrony from time-series data: a Bayesian state-space modelling approach with applications.

    Hugues Santin-Janin

    BACKGROUND: Data collected to inform time variations in natural population size are tainted by sampling error. Ignoring sampling error in population dynamics models induces bias in parameter estimators, e.g., of density-dependence. In particular, when sampling errors are independent among populations, the classical estimator of the synchrony strength (zero-lag correlation) is biased downward. However, this bias is rarely taken into account in synchrony studies, although it may lead to overemphasizing the role of intrinsic factors (e.g., dispersal) with respect to extrinsic factors (the Moran effect) in generating population synchrony, as well as to underestimating the extinction risk of a metapopulation. METHODOLOGY/PRINCIPAL FINDINGS: The aim of this paper was first to illustrate the extent of the bias that can be encountered in empirical studies when sampling error is neglected. Second, we present a state-space modelling approach that explicitly accounts for sampling error when quantifying population synchrony. Third, we exemplify our approach with datasets for which sampling variance (i) has been previously estimated, and (ii) has to be jointly estimated with population synchrony. Finally, we compare our results to those of a standard approach neglecting sampling variance. We show that ignoring sampling variance can mask a synchrony pattern whatever its true value, and that the common practice of averaging a few replicates of population size estimates performs poorly at decreasing the bias of the classical estimator of the synchrony strength. CONCLUSION/SIGNIFICANCE: The state-space model used in this study provides a flexible way of accurately quantifying the strength of synchrony patterns from most population size data encountered in field studies, including over-dispersed count data. We provide a user-friendly R program and a tutorial example to encourage further studies aiming at quantifying the strength of population synchrony to account for
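    The downward bias of the classical zero-lag correlation under independent sampling errors, which motivates the state-space treatment above, can be demonstrated with a small seeded simulation. All variances here are illustrative choices, not from the paper:

    ```python
    import random

    random.seed(0)

    def pearson(xs, ys):
        """Plain Pearson correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    n = 2000
    # Two populations driven by a shared environment (the Moran effect) ...
    common = [random.gauss(0, 1) for _ in range(n)]
    pop1 = [c + random.gauss(0, 0.5) for c in common]
    pop2 = [c + random.gauss(0, 0.5) for c in common]
    # ... but observed through independent sampling error of sd 1
    obs1 = [p + random.gauss(0, 1) for p in pop1]
    obs2 = [p + random.gauss(0, 1) for p in pop2]

    r_true = pearson(pop1, pop2)  # synchrony of the true abundances (~0.8 here)
    r_obs  = pearson(obs1, obs2)  # naive estimate, attenuated by sampling error
    ```

    The naive estimate is attenuated by roughly var(signal)/(var(signal)+var(error)); a state-space model estimates the sampling variance jointly with the process and so removes this attenuation.
    
    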

  20. Long-Term Prospects for Developments in Space (A Scenario Approach)


    existence of such a frontier, of such an area of activity, of such a locus of dynamism, initiative, and entrepreneurship should be very healthy for both...the establishment of criteria for space traveler selection and preparation in private hands. Unfortunately, the franchised operating corporation in

  1. An effective approach to reducing strategy space for maintenance optimisation of multistate series–parallel systems

    Zhou, Yifan; Lin, Tian Ran; Sun, Yong; Bian, Yangqing; Ma, Lin


    Maintenance optimisation of series–parallel systems is a research topic of practical significance. Nevertheless, a cost-effective maintenance strategy is difficult to obtain due to the large strategy space for maintenance optimisation of such systems. The heuristic algorithm is often employed to deal with this problem. However, the solution obtained by the heuristic algorithm is not always the global optimum and the algorithm itself can be very time consuming. An alternative method based on linear programming is thus developed in this paper to overcome such difficulties by reducing strategy space of maintenance optimisation. A theoretical proof is provided in the paper to verify that the proposed method is at least as effective as the existing methods for strategy space reduction. Numerical examples for maintenance optimisation of series–parallel systems having multistate components and considering both economic dependence among components and multiple-level imperfect maintenance are also presented. The simulation results confirm that the proposed method is more effective than the existing methods in removing inappropriate maintenance strategies of multistate series–parallel systems. - Highlights: • A new method using linear programming is developed to reduce the strategy space. • The effectiveness of the new method for strategy reduction is theoretically proved. • Imperfect maintenance and economic dependence are considered during optimisation

  2. A non-linear state space approach to model groundwater fluctuations

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.


    A non-linear state space model is developed for describing groundwater fluctuations. Non-linearity is introduced by modeling the (unobserved) degree of water saturation of the root zone. The non-linear relations are based on physical concepts describing the dependence of both the actual
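    A generic sketch of the kind of non-linear state-space recursion described above, with the unobserved root-zone saturation driving recharge to the groundwater level. The functional forms and parameter values are invented for illustration and are not the authors' model:

    ```python
    import random

    random.seed(1)

    # Hypothetical parameters (illustrative only)
    cap   = 0.30   # root-zone storage capacity [m]
    beta  = 2.0    # non-linearity of drainage with respect to saturation
    kdr   = 0.01   # drainage (recharge) coefficient [m/day]
    alpha = 0.98   # groundwater level memory per day
    gain  = 5.0    # groundwater level response to recharge

    sat, head = 0.5, 0.0   # states: saturation [-] and level anomaly
    for day in range(365):
        precip = max(0.0, random.gauss(0.002, 0.004))   # daily rainfall [m]
        evap   = 0.002 * sat                            # ET scales with saturation
        recharge = kdr * sat ** beta                    # non-linear drainage downward
        sat += (precip - evap - recharge) / cap
        sat = min(max(sat, 0.0), 1.0)                   # saturation stays in [0, 1]
        head = alpha * head + gain * recharge           # linear level response
    ```

    The non-linearity enters through the saturation-dependent drainage term: recharge responds weakly when the root zone is dry and strongly when it is wet, which a purely linear transfer-function model cannot reproduce.
    
    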

  3. Graphical Programming: A systems approach for telerobotic servicing of space assets

    Pinkerton, J.T.; Patten, R.


    Satellite servicing is in many ways analogous to subsea robotic servicing in the late 1970's. A cost effective, reliable, telerobotic capability had to be demonstrated before the oil companies invested money in deep water robot serviceable production facilities. In the same sense, aeronautic engineers will not design satellites for telerobotic servicing until such a quantifiable capability has been demonstrated. New space servicing systems will be markedly different than existing space robot systems. Past space manipulator systems, including the Space Shuttle's robot arm, have used master/slave technologies with poor fidelity, slow operating speeds and most importantly, in-orbit human operators. In contrast, new systems will be capable of precision operations, conducted at higher rates of speed, and be commanded via ground-control communication links. Challenges presented by this environment include achieving a mandated level of robustness and dependability, radiation hardening, minimum weight and power consumption, and a system which accommodates the inherent communication delay between the ground station and the satellite. There is also a need for a user interface which is easy to use, ensures collision free motions, and is capable of adjusting to an unknown workcell (for repair operations the condition of the satellite may not be known in advance). This paper describes the novel technologies required to deliver such a capability

  4. Space-charge-limited currents: An E-infinity Cantorian approach

    Zmeškal, O.; Nešpůrek, Stanislav; Weiter, M.


    Vol. 34, No. 2 (2007), pp. 143-158, ISSN 0960-0779. R&D Projects: GA MPO FT-TA/036; GA AV ČR IAA100100622. Institutional research plan: CEZ:AV0Z40500505. Keywords: space charge * fractal * charge injection. Subject RIV: CD - Macromolecular Chemistry. Impact factor: 3.025, year: 2007

  5. Singing the Spaces: Artful Approaches to Navigating the Emotional Landscape in Environmental Education

    Burkhart, Jocelyn


    This paper briefly explores the gap in the environmental education literature on emotions, and then offers a rationale and potential directions for engaging the emotions more fully, through the arts. Using autoethnographic and arts-based methods, and including original songs and invitational reflective questions to open spaces for further inquiry…

  6. A Data Analysis Approach for Diagnosing Malfunctioning in Domestic Space Heating

    Tabatabaei, S.

    Around one third of worldwide energy usage is for the residential sector, and 60% of the energy consumption in this domestic sector is for space heating. Therefore, monitoring and controlling this part of energy usage can have a major effect on the overall energy consumption and also on the emission

  7. A quantitative approach to measuring the cerebrospinal fluid space with CT

    Zeumer, H.; Hacke, W.; Hartwich, P.


    A method for measuring the subarachnoid space by using an independent CT evaluation unit is described. The normal values have been calculated for patients according to age, and three examples are presented demonstrating reversible decrease of brain volume in patients suffering from anorexia nervosa and chronic alcoholism. (orig.)

  8. Space Station: NASA's software development approach increases safety and cost risks. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives


    The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, and in particular how well NASA has implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet the objectives, the work proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent verification and validation and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed in accordance with generally accepted government auditing standards between April 1991 and May 1992.

  9. Analysis of Approaches to the Near-Earth Orbit Cleanup from Space Debris of the Size Below 10 cm

    V. I. Maiorova


    Nowadays there are many concepts aimed at removing space debris from near-Earth orbits, under way at different stages of detailed engineering and design. As opposed to large-size space debris (upper stages, rocket bodies, non-active satellites), tracking small objects of space debris (SOSD), such as picosatellites, satellite fragments, pyrotechnic devices, and other items less than 10 cm in size, from ground stations is presently a challenge. This SOSD feature allows the authors to propose the two most rational approaches, which use, respectively, a passive and an active (promptly maneuverable) space vehicle (SV), with appropriate schematic diagrams for their collection: (1) Passive scheme: the SV is launched into an orbit characterized by a high mathematical expectation of collision with a large amount of SOSD and, accordingly, a high probability of capture using either active or passive tools. The SV does not execute any maneuvers, but can be equipped with a propulsion system for orbit maintenance and correction and for solving the tasks of long-range guidance. (2) Active scheme: the SV is launched into the target or operating orbit and executes a number of maneuvers to capture the SOSD using active or passive tools. Such an SV has to be equipped with a rather high-thrust propulsion system, which allows it to change its trajectory, and with a guidance system to provide it with target coordinates. The guidance system can be built on either radio or optical devices; it can be installed onboard the debris-removal SV or onboard an SV which operates as a supply unit (if such SVs are foreseen). The paper describes each approach, emphasizes advantages and disadvantages, and defines the cutting-edge technologies to be implemented.

  10. Coherent Structures and Spectral Energy Transfer in Turbulent Plasma: A Space-Filter Approach

    Camporeale, E.; Sorriso-Valvo, L.; Califano, F.; Retinò, A.


    Plasma turbulence at scales of the order of the ion inertial length is mediated by several mechanisms, including linear wave damping, magnetic reconnection, the formation and dissipation of thin current sheets, and stochastic heating. It is now understood that the presence of localized coherent structures enhances the dissipation channels and the kinetic features of the plasma. However, no formal way of quantifying the relationship between scale-to-scale energy transfer and the presence of spatial structures has been presented so far. In this Letter we quantify such a relationship by analyzing the results of a two-dimensional high-resolution Hall magnetohydrodynamic simulation. In particular, we employ the technique of space filtering to derive a spectral energy flux term which defines, in any point of the computational domain, the signed flux of spectral energy across a given wave number. The characterization of coherent structures is performed by means of a traditional two-dimensional wavelet transformation. By studying the correlation between the spectral energy flux and the wavelet amplitude, we demonstrate the strong relationship between scale-to-scale transfer and coherent structures. Furthermore, by conditioning one quantity with respect to the other, we are able for the first time to quantify the inhomogeneity of the turbulence cascade induced by topological structures in the magnetic field. Taking into account the low space-filling factor of coherent structures (i.e., they cover a small portion of space), it emerges that 80% of the spectral energy transfer (both in the direct and inverse cascade directions) is localized in about 50% of space, and 50% of the energy transfer is localized in only 25% of space.

  11. Managing for Multifunctionality in Urban Open Spaces: Approaches for Sustainable Development

    Wenzheng Shi

    Landscape management plays a key role in improving the quality of urban environments and enhancing the multifunctionality of green infrastructure. It works to guide the efficient and effective management of green spaces for sustainability and the well-being of users. However, while most researchers have emphasised spatial planning as a basis for developing green infrastructure to promote new strategic connections in urban green space, they have simultaneously ignored the impact of management. Against this background, this paper argues that if our towns and cities seek to maintain the well-being of citizens while also achieving sustainable environments, they must engage in effective landscape management to improve their green infrastructure. It is not enough to simply design or maintain parks and green spaces so as to keep up their physical condition; rather, green infrastructure work should be adapted to the understanding and implementation of managers, users and stakeholders in an integrated management process in order to provide more services for sustainable development. A selected study in Sheffield investigated the management planning required for sustainable development. Sheffield, a city with rich management practices for green and open spaces, offers valuable lessons in management planning. This study will analyse how management planning helps local authorities and managers to improve multifunctional green and open spaces in the context of sustainable development. As a result, the study also explores the framework of management planning with regard to the transferability of the existing practices in Sheffield. It also attempts to provide a primer for sustainability impact assessments in other cities with a considered knowledge exchange. KEYWORDS: Management planning, green infrastructure, multifunctionality, sustainability, knowledge exchange

  12. Exploring G Protein-Coupled Receptors (GPCRs) Ligand Space via Cheminformatics Approaches: Impact on Rational Drug Design

    Basith, Shaherin; Cui, Minghua; Macalino, Stephani J. Y.; Park, Jongmi; Clavio, Nina A. B.; Kang, Soosung; Choi, Sun


    The primary goal of rational drug discovery is the identification of selective ligands which act on single or multiple drug targets to achieve the desired clinical outcome through the exploration of total chemical space. To identify such desired compounds, computational approaches are necessary in predicting their drug-like properties. G Protein-Coupled Receptors (GPCRs) represent one of the largest and most important integral membrane protein families. These receptors serve as increasingly attractive drug targets due to their relevance in the treatment of various diseases, such as inflammatory disorders, metabolic imbalances, cardiac disorders, cancer, monogenic disorders, etc. In the last decade, multitudes of three-dimensional (3D) structures were solved for diverse GPCRs, thus referring to this period as the “golden age for GPCR structural biology.” Moreover, accumulation of data about the chemical properties of GPCR ligands has garnered much interest toward the exploration of GPCR chemical space. Due to the steady increase in the structural, ligand, and functional data of GPCRs, several cheminformatics approaches have been implemented in its drug discovery pipeline. In this review, we mainly focus on the cheminformatics-based paradigms in GPCR drug discovery. We provide a comprehensive view on the ligand- and structure-based cheminformatics approaches which are best illustrated via GPCR case studies. Furthermore, an appropriate combination of ligand-based knowledge with structure-based approaches, i.e., an integrated approach, which is emerging as a promising strategy for cheminformatics-based GPCR drug design, is also discussed. PMID:29593527

  13. Short-term wind speed prediction using an unscented Kalman filter based state-space support vector regression approach

    Chen, Kuilin; Yu, Jie


    Highlights: • A novel hybrid modeling method is proposed for short-term wind speed forecasting. • A support vector regression model is constructed to formulate a nonlinear state-space framework. • An unscented Kalman filter is adopted to recursively update states under random uncertainty. • The new SVR–UKF approach is compared to several conventional methods for short-term wind speed prediction. • The proposed method demonstrates higher prediction accuracy and reliability. Abstract: Accurate wind speed forecasting is becoming increasingly important to improve and optimize renewable wind power generation. Particularly, reliable short-term wind speed prediction can enable model predictive control of wind turbines and real-time optimization of wind farm operation. However, this task remains challenging due to the strong stochastic nature and dynamic uncertainty of wind speed. In this study, an unscented Kalman filter (UKF) is integrated with a support vector regression (SVR) based state-space model in order to precisely update the short-term estimation of the wind speed sequence. In the proposed SVR–UKF approach, support vector regression is first employed to formulate a nonlinear state-space model, and then the unscented Kalman filter is adopted to perform dynamic state estimation recursively on the wind sequence with stochastic uncertainty. The novel SVR–UKF method is compared with artificial neural networks (ANNs), SVR, autoregressive (AR) and autoregressive integrated with Kalman filter (AR-Kalman) approaches for predicting short-term wind speed sequences collected from three sites in Massachusetts, USA. The forecasting results indicate that the proposed method has much better performance in both one-step-ahead and multi-step-ahead wind speed predictions than the other approaches across all the locations.
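    The predict/update cycle described in this record can be sketched in one dimension. This is a minimal illustration, assuming numpy: the fixed map `f` below is a hypothetical stand-in for the SVR-learned transition model, and the three-point unscented transform is simplified (it omits the usual α/β/κ scaling of the full UKF).

```python
import numpy as np

def sigma_points(m, P):
    # minimal 1-D unscented transform: 3 points with weights (0, 1/2, 1/2)
    s = np.sqrt(P)
    return np.array([m, m + s, m - s]), np.array([0.0, 0.5, 0.5])

def ukf_step(m, P, z, f, Q, R):
    """One UKF predict/update cycle for a scalar state, identity observation."""
    pts, w = sigma_points(m, P)              # propagate through the transition f
    fp = f(pts)
    m_pred = w @ fp
    P_pred = w @ (fp - m_pred) ** 2 + Q
    pts, w = sigma_points(m_pred, P_pred)    # update with the measurement z
    z_pred = w @ pts
    S = w @ (pts - z_pred) ** 2 + R
    K = (w @ ((pts - m_pred) * (pts - z_pred))) / S
    return m_pred + K * (z - z_pred), P_pred - K * K * S

# demo on a synthetic wind-speed series; this fixed map stands in for the
# SVR-learned state-space model of the paper
f = lambda x: 8.0 + 0.7 * (x - 8.0) + 0.5 * np.sin(x)
rng = np.random.default_rng(0)
x, m, P = 8.0, 6.0, 4.0
se_filter, se_obs = [], []
for _ in range(500):
    x = f(x) + rng.normal(0.0, 0.3)      # true (unknown) wind speed
    z = x + rng.normal(0.0, 1.0)         # noisy anemometer reading
    m, P = ukf_step(m, P, z, f, Q=0.3 ** 2, R=1.0 ** 2)
    se_filter.append((m - x) ** 2)
    se_obs.append((z - x) ** 2)
rmse_filter = float(np.sqrt(np.mean(se_filter)))
rmse_obs = float(np.sqrt(np.mean(se_obs)))
```

    On this synthetic series the filtered estimate tracks the state more closely than the raw measurements, which is the behavior the record reports for the SVR–UKF combination.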

  14. Three-dimensionality of space and the quantum bit: an information-theoretic approach

    Müller, Markus P; Masanes, Lluís


    It is sometimes pointed out as a curiosity that the state space of quantum two-level systems, i.e. the qubit, and actual physical space are both three-dimensional and Euclidean. In this paper, we suggest an information-theoretic analysis of this relationship, by proving a particular mathematical result: suppose that physics takes place in d spatial dimensions, and that some events happen probabilistically (not assuming quantum theory in any way). Furthermore, suppose there are systems that carry ‘minimal amounts of direction information’, interacting via some continuous reversible time evolution. We prove that this uniquely determines spatial dimension d = 3 and quantum theory on two qubits (including entanglement and unitary time evolution), and that it allows observers to infer local spatial geometry from probability measurements. (paper)


  15. Sizing optimization of panel and battery capacity in a standalone photovoltaic system with lighting load

    D. F. AL RIZA


    This paper presents a sizing optimization methodology for panel and battery capacity in a standalone photovoltaic system with a lighting load. Performance of the system is identified by performing a Loss of Power Supply Probability (LPSP) calculation. Input data used for the calculation are the daily weather data and system component parameters. Capital cost and Life Cycle Cost (LCC) are calculated as optimization parameters. The design space for the optimum system configuration is identified based on a given LPSP value, capital cost, and Life Cycle Cost. The excess energy value is used as an over-design indicator in the design space. An economic analysis, including cost of energy and payback period, is also presented for selected configurations.
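    The LPSP figure of merit used in this record can be sketched as a simple hourly energy balance. This is illustrative only: the profiles, the charging efficiency, and the half-charged initial state are assumptions, not the paper's data.

```python
def lpsp(pv_kw, load_kw, battery_kwh, eff=0.9):
    """Loss of Power Supply Probability: unmet load / total load.

    pv_kw, load_kw: hourly PV output and demand (kW over 1-hour steps);
    battery_kwh: usable storage capacity; eff: charging efficiency.
    """
    soc, unmet = battery_kwh / 2.0, 0.0      # start half charged (assumption)
    for pv, load in zip(pv_kw, load_kw):
        surplus = pv - load
        if surplus >= 0:
            soc = min(battery_kwh, soc + surplus * eff)  # charge, clip at capacity
        else:
            draw = min(soc, -surplus)                    # discharge what we can
            soc -= draw
            unmet += (-surplus) - draw                   # remainder is lost load
    return unmet / sum(load_kw)

# toy day: PV only at midday, a constant 1 kW lighting load at night
pv   = [0] * 6 + [3] * 8 + [0] * 10
load = [1] * 6 + [0] * 8 + [1] * 10
small = lpsp(pv, load, battery_kwh=4.0)
large = lpsp(pv, load, battery_kwh=24.0)
```

    Sweeping panel and battery sizes over a grid of such evaluations traces out the design space the record describes, with LCC and excess energy evaluated at each point.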

  16. Approaching Environmental Health Disparities and Green Spaces: An Ecosystem Services Perspective

    Viniece Jennings


    Health disparities occur when adverse health conditions are unequal across populations due in part to gaps in wealth. These disparities continue to plague global health. Decades of research suggest that the natural environment can play a key role in sustaining the health of the public. However, the influence of the natural environment on health disparities is not well-articulated. Green spaces provide ecosystem services that are vital to public health. This paper discusses the link between green spaces and some of the nation’s leading health issues such as obesity, cardiovascular health, heat-related illness, and psychological health. These associations are discussed in terms of key demographic variables: race, ethnicity, and income. The authors also identify research gaps and recommendations for future research.

  17. A perturbative approach to the redshift space power spectrum: beyond the Standard Model

    Bose, Benjamin; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Burnaby Road, Portsmouth, Hampshire, PO1 3FX (United Kingdom)]


    We develop a code to produce the power spectrum in redshift space based on standard perturbation theory (SPT) at 1-loop order. The code can be applied to a wide range of modified gravity and dark energy models using a recently proposed numerical method by A. Taruya to find the SPT kernels. This includes Horndeski's theory with a general potential, which accommodates both chameleon and Vainshtein screening mechanisms and provides a non-linear extension of the effective theory of dark energy up to the third order. We focus on a recent non-linear model of the redshift space power spectrum which has been shown to model the anisotropy very well at scales relevant for the SPT framework, as well as capturing relevant non-linear effects typical of modified gravity theories. We provide consistency checks of the code against established results and elucidate its application in the light of upcoming high-precision redshift-space distortion (RSD) data.

  18. An object-based approach for detecting small brain lesions: application to Virchow-Robin spaces.

    Descombes, Xavier; Kruggel, Frithjof; Wollny, Gert; Gertz, Hermann Josef


    This paper is concerned with the detection of multiple small brain lesions from magnetic resonance imaging (MRI) data. A model based on the marked point process framework is designed to detect Virchow-Robin spaces (VRSs). These tubular shaped spaces are due to retraction of the brain parenchyma from its supplying arteries. VRSs are described by simple geometrical objects that are introduced as small tubular structures. Their radiometric properties are embedded in a data term. A prior model includes interactions describing the clustering property of VRSs. A Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm optimizes the proposed model, obtained by multiplying the prior and the data model. Example results are shown on T1-weighted MRI datasets of elderly subjects.

  19. Using the whole-building design approach to incorporate daylighting into a retail space: Preprint

    Hayter, S.; Torcellini, P.; Eastment, M.; Judkoff, R.


    This paper focuses on implementation of daylighting into the Bighorn Center, a collection of home improvement retail spaces in Silverthorne, Colorado, which were constructed in three phases. Daylighting was an integral part of the design of the Phase 3 building. Energy consultants optimized the daylighting design through detailed modeling using an hourly building energy simulation tool. Energy consultants also used this tool to address the building owner's concerns related to customer comfort and increased product sales.

  20. Linearized Navier-Stokes equations in R3: an approach in weighted Sobolev spaces

    Amrouche, Ch.; Meslameni, M.; Nečasová, Šárka


    Vol. 7, No. 5 (2014), pp. 901-916, ISSN 1937-1632. R&D Projects: GA ČR(CZ) GAP201/11/1304. Institutional support: RVO:67985840. Keywords: generalized Oseen equations * weighted Sobolev spaces * generalized solutions. Subject RIV: BA - General Mathematics. Impact factor: 0.567, year: 2014

  1. A new approach to reduce uncertainties in space radiation cancer risk predictions.

    Francis A Cucinotta

    The prediction of space radiation induced cancer risk carries large uncertainties, with two of the largest being radiation quality and dose-rate effects. In risk models, the ratio of the quality factor (QF) to the dose and dose-rate reduction effectiveness factor (DDREF) is used to scale organ doses for cosmic ray protons and high charge and energy (HZE) particles to a hazard rate for γ-rays derived from human epidemiology data. In previous work, particle track structure concepts were used to formulate a space radiation QF function that is dependent on particle charge number Z and kinetic energy per atomic mass unit, E. QF uncertainties were represented by subjective probability distribution functions (PDFs) for the three QF parameters that described its maximum value and shape parameters for the Z and E dependences. Here I report on an analysis of a maximum QF parameter and its uncertainty using mouse tumor induction data. Because experimental data for risks at low doses of γ-rays are highly uncertain, which impacts estimates of maximum values of relative biological effectiveness (RBEmax), I developed an alternate QF model, denoted QFγAcute, where QFs are defined relative to higher acute γ-ray doses (0.5 to 3 Gy). The alternate model reduces the dependence of risk projections on the DDREF; however, a DDREF is still needed for risk estimates for high-energy protons and other primary or secondary sparsely ionizing space radiation components. Risk projections (upper confidence levels (CLs)) for space missions show a reduction of about 40% (CL ∼50%) using the QFγAcute model compared to the QFs based on RBEmax, and about 25% (CL ∼35%) compared to previous estimates. In addition, I discuss how a possible qualitative difference leading to increased tumor lethality for HZE particles compared to low-LET radiation and background tumors remains a large uncertainty in risk estimates.

  2. Complex Permittivity Measurements of Textiles and Leather in a Free Space: An Angular-Invariant Approach

    Kapilevich, B.; Litvak, B.; Anisimov, M.; Hardon, D.; Pinhasi, Y.


    The paper describes the complex permittivity measurements of textiles and leathers in a free space at 330 GHz. The destructive role of the Rayleigh scattering effect is considered, and the angular-invariant limit for the incidence angle has been found experimentally to lie within 25–30 degrees. If the incidence angle exceeds this critical value, the uncertainty caused by the Rayleigh scattering increases drastically, preventing accurate measurements of the real and imaginary parts of a bulky mat...

  3. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis.

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas


    Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although on the one hand, digital pathology and new bioimaging technologies find their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descents, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often result in a local maximum.
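    The local-maxima problem this record describes can be illustrated with a toy one-dimensional objective. The function and all names below are hypothetical stand-ins for a segmentation-quality score; a restart strategy is one simple way to "jump out" of a local performance maximum, as the abstract notes plain hill climbing cannot.

```python
import math
import random

def hill_climb(f, x, step=0.1, iters=200):
    """Greedy local search: accept a neighboring point only if it scores higher."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) > f(x):
                x = cand
    return x

def random_restart(f, lo, hi, starts=20, seed=0):
    """Restart hill climbing from random points to escape local performance maxima."""
    rng = random.Random(seed)
    return max((hill_climb(f, rng.uniform(lo, hi)) for _ in range(starts)), key=f)

# a segmentation-quality surrogate (hypothetical): a local bump near x = -1
# and the true optimum near x = 2
def f(x):
    return math.exp(-(x - 2) ** 2) + 0.5 * math.exp(-(x + 1) ** 2)

stuck = hill_climb(f, -1.5)          # greedy search halts on the local bump
best = random_restart(f, -4.0, 4.0)  # restarts reach the global optimum
```

    Genetic algorithms and coordinate descent, which the paper actually employs, address the same failure mode with population diversity rather than restarts.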

  4. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Christian Held


    Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although on the one hand, digital pathology and new bioimaging technologies find their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descents, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often result in a local maximum.

  5. A Description Of Space Relations In An NLP Model: The ABBYY Compreno Approach

    Aleksey Leontyev


    The current paper is devoted to a formal analysis of the space category and, especially, to questions connected with the presentation of space relations in a formal NLP model. The aim is to demonstrate how linguistic and cognitive problems relating to spatial categorization, the definition of spatial entities, and the expression of different locative senses in natural languages can be solved in an artificial intelligence system. We offer a description of the locative groups in the ABBYY Compreno formalism – an integral NLP framework applied for machine translation, semantic search, fact extraction, and other tasks based on the semantic analysis of texts. The model is based on a universal semantic hierarchy of the thesaurus type and includes a description of all possible semantic and syntactic links every word can attach. In this work we define the set of semantic locative relations between words, suggest different tools for their syntactic presentation, give formal restrictions for the word classes that can denote spaces, and show different strategies of dealing with locative prepositions, especially as far as the problem of their machine translation is concerned.

  6. Methods of approaching decoherence in the flavor sector due to space-time foam

    Mavromatos, N. E.; Sarkar, Sarben


    In the first part of this work we discuss possible effects of stochastic space-time foam configurations of quantum gravity on the propagation of “flavored” (Klein-Gordon and Dirac) neutral particles, such as neutral mesons and neutrinos. The formalism is not the usually assumed Lindblad one, but it is based on random averages of quantum fluctuations of space-time metrics over which the propagation of the matter particles is considered. We arrive at expressions for the respective oscillation probabilities between flavors which are quite distinct from the ones pertaining to Lindblad-type decoherence, including in addition to the (expected) Gaussian decay with time, a modification to oscillation behavior, as well as a power-law cutoff of the time-profile of the respective probability. In the second part we consider space-time foam configurations of quantum-fluctuating charged-black holes as a way of generating (parts of) neutrino mass differences, mimicking appropriately the celebrated Mikheyev-Smirnov-Wolfenstein (MSW) effects of neutrinos in stochastically fluctuating random media. We pay particular attention to disentangling genuine quantum-gravity effects from ordinary effects due to the propagation of a neutrino through ordinary matter. Our results are of interest to precision tests of quantum-gravity models using neutrinos as probes.

  7. A probabilistic approach to safety/reliability of space nuclear power systems

    Medford, G.; Williams, K.; Kolaczkowski, A.


    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework
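    The fault-tree evaluation underlying such a PRA model can be sketched with AND/OR gates over independent basic events. The events and probabilities below are purely illustrative, not SP-100 data; real PRA tools also handle common-cause failures and uncertainty distributions, which this sketch omits.

```python
def or_gate(*probs):
    """P(at least one input event occurs), assuming independent basic events."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*probs):
    """P(all input events occur), assuming independent basic events."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# illustrative basic-event probabilities (hypothetical, not SP-100 data)
pump_a_fails = 1.0e-3      # redundant coolant pump A
pump_b_fails = 1.0e-3      # redundant coolant pump B
control_fails = 5.0e-4     # reactor control system

# loss of cooling requires BOTH pumps to fail (AND gate); the top event
# occurs on loss of cooling OR a control failure (OR gate)
loss_of_cooling = and_gate(pump_a_fails, pump_b_fails)
top_event = or_gate(loss_of_cooling, control_fails)

# relative contributions let designers rank failure modes, as the abstract notes
contributions = {
    "loss_of_cooling": loss_of_cooling / top_event,
    "control_failure": control_fails / top_event,
}
```

    Ranking the contributions immediately shows where redundancy pays off: here the doubly redundant pumps contribute far less to the top event than the single control system.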

  8. Space space space

    Trembach, Vera


    Space is an introduction to the mysteries of the Universe. Included are Task Cards for independent learning, Journal Word Cards for creative writing, and Hands-On Activities for reinforcing skills in Math and Language Arts. Space is a perfect introduction to further research of the Solar System.

  9. A Numerical Approach to Estimate the Ballistic Coefficient of Space Debris from TLE Orbital Data

    Narkeliunas, Jonas


    Low Earth Orbit (LEO) is full of space debris, consisting of spent rocket stages, old satellites, and fragments from explosions and collisions. As of 2009, more than 21,000 orbital debris objects larger than 10 cm are known to exist, and while it is hard to track anything smaller than that, the estimated population of particles between 1 and 10 cm in diameter is approximately 500,000, whereas the number as small as 1 cm exceeds 100 million. These objects orbit Earth with huge kinetic energies; speeds usually exceed 7 km/s. The shape of their orbit varies from almost circular to highly elliptical and covers all of LEO, a region in space between 160 and 2,000 km above sea level. Unfortunately, LEO is also the place where most of our active satellites are situated, as well as the International Space Station (ISS) and the Hubble Space Telescope, whose orbits are around 400 and 550 km above sea level, respectively. This poses a real threat, as debris can collide with satellites and deal substantial damage or even destroy them. Collisions between two or more debris objects create clouds of smaller debris, which are harder to track and increase overall object density and collision probability. At some point, the debris density could then reach a critical value, which would start a chain reaction and the number of space debris would grow exponentially. This phenomenon was first described by Kessler in 1978, and he concluded that it would lead to the creation of a debris belt, which would vastly complicate satellite operations in LEO. The debris density is already relatively high, as seen from several necessary debris avoidance maneuvers done by the Shuttle, before it was discontinued, and the ISS. But not all satellites have a propulsion system to avoid collision, hence different methods need to be applied. One of the proposed collision avoidance concepts is called LightForce, and it suggests using photon pressure to induce small orbital corrections to deflect debris from colliding. This method is very efficient as seen from
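    The ballistic-coefficient estimate named in this record's title can be sketched from orbital decay. This is a minimal sketch under strong assumptions (near-circular orbit, constant atmospheric density, B = Cd·A/m); real TLE processing would first convert the mean motion n to a semi-major axis via a = (MU/n²)^(1/3) and average over many epochs.

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter [m^3 s^-2]

def ballistic_coefficient(a1, a2, dt, rho):
    """Estimate B = Cd*A/m [m^2/kg] from semi-major-axis decay.

    For a near-circular orbit under drag, da/dt = -B * rho * sqrt(MU * a);
    given the semi-major axis at two epochs dt seconds apart and an assumed
    constant atmospheric density rho [kg/m^3], invert for B.
    """
    a_mid = 0.5 * (a1 + a2)          # evaluate at the mid-arc semi-major axis
    da_dt = (a2 - a1) / dt
    return -da_dt / (rho * math.sqrt(MU * a_mid))

# synthetic check: a ~400 km circular orbit decaying for one day
a1, rho, B_true = 6778e3, 1.0e-12, 0.01
a2 = a1 - B_true * rho * math.sqrt(MU * a1) * 86400.0
B_est = ballistic_coefficient(a1, a2, 86400.0, rho)
```

    The density rho is the dominant error source in practice; the one-day arc here shrinks the orbit by only tens of meters, which is why TLE-based methods fit over long arcs.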

  10. Multiple Model-Based Synchronization Approaches for Time Delayed Slaving Data in a Space Launch Vehicle Tracking System

    Haryong Song


    Full Text Available Due to the inherent characteristics of the flight mission of a space launch vehicle (SLV), which is required to fly over very large distances and have very high fault tolerance, SLV tracking systems (TSs) generally comprise multiple heterogeneous sensors such as radars, GPS, INS, and electro-optical targeting systems installed over widespread areas. To track an SLV without interruption and to hand over the measurement coverage between TSs properly, the mission control system (MCS) transfers slaving data to each TS through mission networks. When serious network delays occur, however, the slaving data from the MCS can lead to the failure of the TS. To address this problem, in this paper we propose multiple model-based synchronization (MMS) approaches, which take advantage of the multiple motion models of an SLV. Cubic spline extrapolation, prediction through an α-β-γ filter, and a single-model Kalman filter are presented as benchmark approaches. We demonstrate the synchronization accuracy and effectiveness of the proposed MMS approaches using Monte Carlo simulation with the nominal trajectory data of Korea Space Launch Vehicle-I.

  11. Observation of superconducting fluxons by transmission electron microscopy: A Fourier space approach to calculate the electron optical phase shifts and images

    Beleggia, M.; Pozzi, G.


    An approach is presented for the calculation of the electron optical phase shift experienced by high-energy electrons in a transmission electron microscope, when they interact with the magnetic field associated with superconducting fluxons in a thin specimen tilted with respect to the beam. It is shown that by decomposing the vector potential in its Fourier components and by calculating the phase shift of each component separately, it is possible to obtain the Fourier transform of the electron optical phase shift, which can be inverted either analytically or numerically. It will be shown how this method can be used to recover the result, previously obtained by the real-space approach, relative to the case of a straight flux tube perpendicular to the specimen surfaces. Then the method is applied to the case of a London fluxon in a thin film, where the bending and the broadening of the magnetic-field lines due to the finite specimen thickness are now correctly taken into account and not treated approximately by means of a parabolic fit. Finally, it will be shown how simple models for the pancake structure of the fluxon can be analyzed within this framework and the main features of electron transmission images predicted.

  12. Symplectic approach to calculation of magnetic field line trajectories in physical space with realistic magnetic geometry in divertor tokamaks

    Punjabi, Alkesh; Ali, Halima


    A new approach to integration of magnetic field lines in divertor tokamaks is proposed. In this approach, an analytic equilibrium generating function (EGF) is constructed in natural canonical coordinates (ψ,θ) from experimental data from a Grad-Shafranov equilibrium solver for a tokamak. ψ is the toroidal magnetic flux and θ is the poloidal angle. Natural canonical coordinates (ψ,θ,φ) can be transformed to physical position (R,Z,φ) using a canonical transformation. (R,Z,φ) are cylindrical coordinates. Another canonical transformation is used to construct a symplectic map for integration of magnetic field lines. Trajectories of field lines calculated from this symplectic map in natural canonical coordinates can be transformed to trajectories in real physical space. Unlike in magnetic coordinates [O. Kerwin, A. Punjabi, and H. Ali, Phys. Plasmas 15, 072504 (2008)], the symplectic map in natural canonical coordinates can integrate trajectories across the separatrix surface, and at the same time, give trajectories in physical space. Unlike symplectic maps in physical coordinates (x,y) or (R,Z), the continuous analog of a symplectic map in natural canonical coordinates does not distort trajectories in toroidal planes intervening the discrete map. This approach is applied to the DIII-D tokamak [J. L. Luxon and L. E. Davis, Fusion Technol. 8, 441 (1985)]. The EGF for the DIII-D gives quite an accurate representation of equilibrium magnetic surfaces close to the separatrix surface. This new approach is applied to demonstrate the sensitivity of stochastic broadening using a set of perturbations that generically approximate the size of the field errors and statistical topological noise expected in a poloidally diverted tokamak. Plans for future application of this approach are discussed.

  14. The Faster, Better, Cheaper Approach to Space Missions: An Engineering Management Assessment

    Hamaker, Joseph W.


    NASA was chartered as an independent civilian space agency in 1958, following the Soviet Union's dramatic launch of Sputnik 1 (1957). In his address to a joint session of Congress in May 1961, President Kennedy issued to the fledgling organization his famous challenge of a manned lunar mission by the end of the decade. The Mercury, Gemini, and Apollo programs that followed put the utmost value on high quality, low risk (as low as possible within the context of space flight), and quick results, all with little regard for cost. These circumstances essentially molded NASA's culture into that of an organization capable of great technological achievement, but at extremely high cost. The Space Shuttle project, the next major agency endeavor, was put under severe annual budget constraints in the 1970s. NASA's response was to hold to its high quality standards, low risk, and annual cost, and let schedule suffer. The result was a significant delay in the introduction of the Shuttle as well as overall total cost growth. By the early 1990s, because NASA's budget was declining, the number of projects was also declining. Holding to the same cost and schedule productivity levels as before was essentially causing NASA to price itself out of business. In 1992, the helm of NASA was turned over to a new Administrator. Dan Goldin's mantra was "faster, better, cheaper", and his enthusiasm and determination to change the NASA culture were not to be ignored. This research paper documents the various implementations of "faster, better, cheaper" that have been attempted, analyzes their impact, and compares the cost performance of these new projects to previous NASA benchmarks. Fundamentally, many elements of "faster, better, cheaper" are found to be working well, especially on smaller projects. Some of the initiatives are found to apply only to smaller or experimental projects, however, so that extrapolation to "flagship" projects may be problematic.

  15. The stochastic versus the Euclidean approach to quantum fields on a static space-time

    De Angelis, G.F.; de Falco, D.


    Equations are presented which modify the definition of the Gaussian field in the Rindler chart in order to make contact with the Wightman state, the Hartle-Hawking state, and the Euclidean field. By taking Ornstein-Uhlenbeck processes, the authors have chosen, in the sense of stochastic mechanics, to place the Fulling modes precisely in their harmonic oscillator ground state. In this respect, together with the periodicity of Minkowski space-time, the authors observe that the covariance of the Ornstein-Uhlenbeck process can be obtained by analytical continuation of the Wightman function of the harmonic oscillator at zero temperature.

  16. A Hilbert space approach for a class of arbitrage free implied volatilities models

    Brace, A.; Fabbri, G.; Goldys, B.


    We present a Hilbert space formulation for a set of implied volatility models introduced in \cite{BraceGoldys01}, in which the authors studied conditions for a family of European call options, varying the maturity time $T$ and the strike price $K$, to be arbitrage free. The arbitrage-free conditions give a system of stochastic PDEs for the evolution of the implied volatility surface ${\hat\sigma}_t(T,K)$. We will focus on the family obtained by fixing a strike $K$ and varying $T$. In order to...


    L.V. Arun Shalin


    Full Text Available Clustering is a process of grouping elements together, designed in such a way that the elements assigned to a cluster are more similar to each other than to the remaining data points. Certain difficulties are ubiquitous and abundant when clustering high-dimensional data. Prior work using anonymization methods for high-dimensional data spaces failed to address dimensionality reduction for non-binary databases. In this work we study methods for dimensionality reduction for non-binary databases; analyzing the behavior of dimensionality reduction for a non-binary database yields performance improvement with the help of tag-based features. An effective multi-clustering anonymization approach called Discrete Component Task Specific Multi-Clustering (DCTSM) is presented for dimensionality reduction on non-binary databases. We begin with an analysis of the attributes in the non-binary database, where cluster projection identifies the sparseness degree of dimensions. Additionally, with the quantum distribution on the multi-cluster dimension, a solution for attribute relevancy and redundancy on non-binary data spaces is provided, resulting in performance improvement on the basis of tag-based features. Multi-clustering tag-based feature reduction extracts individual features, which are correspondingly replaced by equivalent feature clusters (i.e., tag clusters). During training, the DCTSM approach uses multi-clusters instead of individual tag features, and during decoding individual features are replaced by the corresponding multi-clusters. To measure the effectiveness of the method, experiments are conducted on an existing anonymization method for high-dimensional data spaces and compared with the DCTSM approach using the Statlog German Credit Data Set. Improved tag feature extraction and a minimum error rate compared to conventional anonymization

  18. Experimental and computational approaches to evaluate the environmental mitigation effect in narrow spaces by noble metal chemical addition (NMCA)

    Shimizu, Ryosuke; Ota, Nobuyuki; Nagase, Makoto; Aizawa, Motohiro; Ishida, Kazushige; Wada, Yoichi


    The environmental mitigation effect of NMCA in a narrow space was evaluated by experimental and computational approaches. In the experiment, at 8 MPa and 553 K, a T-tube whose branched line formed a narrow space was prepared, and Zr electrodes were set in the branched line at certain intervals: 1, 3, 5, 7, 9, 11, 15, and 29 cm from the opening section of the branched line. The electrochemical corrosion potential (ECP) at the tip of the branched narrow space varied in response to the water chemistry in the main line, which was at a right angle to the branched line. Computational fluid dynamics (CFD) analysis reproduced the experimental results. It was also confirmed by CFD analysis that the ingress of water from the main line into the narrow space was accelerated by cavity flow and thermal convection. By CFD analysis of a thermal sleeve under actual plant conditions, which contained a narrow space, the concentration of dissolved oxygen at the tip of the thermal sleeve reached 250 ppb, the same concentration as in the main line, within 300 s. Noble metal deposition on the surface of the thermal sleeve was evaluated by a mass transfer model. Noble metal deposition was largest near the opening section of the branched line and gradually decreased toward the tip section. In light of the consumption of dissolved oxygen in the branched line, noble metal deposition in the thermal sleeve was sufficient to reduce the ECP. It is expected that NMCA can mitigate the corrosion environment in the thermal sleeve. (author)

  19. State-Space Modeling and Performance Analysis of Variable-Speed Wind Turbine Based on a Model Predictive Control Approach

    H. Bassi


    Full Text Available Advancements in wind energy technologies have led wind turbines from fixed-speed to variable-speed operation. This paper introduces an innovative version of a variable-speed wind turbine based on a model predictive control (MPC) approach. The proposed approach provides maximum power point tracking (MPPT), whose main objective is to capture the maximum wind energy in spite of the variable nature of the wind's speed. The proposed MPC approach also reduces the constraints of the two main functional parts of the wind turbine: the full-load and partial-load segments. The pitch angle for full load and the rotating force for partial load have been fixed concurrently in order to balance power generation as well as to reduce pitch angle operations. A mathematical analysis of the proposed system using a state-space approach is introduced. The simulation results using MATLAB/SIMULINK show that the performance of the wind turbine with the MPC approach is improved compared to the traditional PID controller at both low and high wind speeds.
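
The state-space machinery the record refers to reduces, in discrete time, to iterating x[k+1] = A x[k] + B u[k], y[k] = C x[k]. A minimal sketch with toy matrices (these are illustrative values, not the turbine model from the paper):

```python
# Simulate the step response of a generic discrete-time linear state-space system.
import numpy as np

A = np.array([[0.9, 0.1],
              [0.0, 0.8]])   # state transition
B = np.array([[0.0],
              [1.0]])        # input matrix
C = np.array([[1.0, 0.0]])   # output matrix

def simulate(x0, inputs):
    """Return the output sequence y[k] = C x[k] under the given inputs."""
    x, ys = x0, []
    for u in inputs:
        ys.append(float(C @ x))
        x = A @ x + B @ np.array([u])
    return ys

ys = simulate(np.zeros(2), [1.0] * 20)  # unit-step response, rises toward 5.0
```

An MPC controller would repeatedly optimize the inputs over a horizon of such predicted states; the simulation above is only the prediction-model half of that loop.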

  20. Novel approach for evaluation of air change rate in naturally ventilated occupied spaces based on metabolic CO2 time variation

    Melikov, Arsen Krikor; Markov, Detelin G.


    IAQ in many residential buildings relies on non-organized natural ventilation. Accurate evaluation of the air change rate (ACR) in this situation is difficult due to the nature of the phenomenon: intermittent infiltration-exfiltration periods of mass exchange between the room air and the outdoor air at low rate. This paper describes a new approach for ACR evaluation in naturally ventilated occupied spaces. An actual metabolic CO2 time variation record over an interval of time is compared with the computed variation of metabolic CO2 for the same time interval under reference conditions: sleeping occupants...
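
For context, the classic single-zone CO2 mass balance gives the ACR from a tracer decay between two readings; the paper's metabolic-CO2 comparison builds on the same reasoning but is more involved. A hedged illustration of the decay formula only:

```python
# Air changes per hour from a CO2 decay: C(t) - C_out = (C0 - C_out) * exp(-ACH * t).
from math import log

def ach_from_decay(c0_ppm, ct_ppm, c_out_ppm, hours):
    """Solve the decay equation for ACH (air changes per hour)."""
    return log((c0_ppm - c_out_ppm) / (ct_ppm - c_out_ppm)) / hours

# 1200 ppm decaying to 800 ppm over 2 h with 400 ppm outdoors:
print(round(ach_from_decay(1200, 800, 400, 2.0), 3))  # 0.347
```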

  1. New real space correlated-basis-functions approach for the electron correlations of the semiconductor inversion layer

    Feng Weiguo; Wang Hongwei; Wu Xiang


    Based on real-space Correlated-Basis-Functions theory and the collective oscillation behaviour of the electron gas with effective Coulomb interaction, the many-body wave function is obtained for the quasi-two-dimensional electron system in the semiconductor inversion layer. The pair-correlation function and the correlation energy of the system have been calculated by the integro-differential method in this paper. A comparison with previous theoretical results is also made. The new theoretical approach and its numerical results show that the pair-correlation functions are definitely positive and satisfy the normalization condition. (author). 10 refs, 2 figs

  2. 3rd International Conference on Particle Physics Beyond the Standard Model : Accelerator, Non-Accelerator and Space Approaches

    Beyond The Desert 2002


    The third conference on particle physics beyond the Standard Model (BEYOND THE DESERT'02 - Accelerator, Non-accelerator and Space Approaches) was held during 2--7 June, 2002 in the Finnish town of Oulu, almost at the northern Arctic Circle. It was the first of the BEYOND conference series held outside Germany (CERN Courier March 2003, pp. 29-30). Traditionally the Scientific Programme of BEYOND conferences, brought to life in 1997 (see CERN Courier, November 1997, pp. 16-18), covers almost all topics of modern particle physics (see contents).

  3. Particle currents in a space-time dependent and CP-violating Higgs background: a field theory approach

    Comelli, D.; Riotto, A.


    Motivated by cosmological applications like electroweak baryogenesis, we develop a field theoretic approach to the computation of particle currents on a space-time dependent and CP-violating Higgs background. We consider the Standard Model with two Higgs doublets and CP violation in the scalar sector, and compute both fermionic and Higgs currents by means of an expansion in the background fields. We discuss the gauge dependence of the results and the renormalization of the current operators, showing that in the limit of local equilibrium, no extra renormalization conditions are needed in order to specify the system completely. (orig.)

  4. The economic impact of global climate change on Mediterranean rangeland ecosystems. A Space-for-Time approach

    Fleischer, Aliza; Sternberg, Marcelo


    Global Climate Change (GCC) can bring about changes in ecosystems and consequently in the value of their services. Here we show that the urban population in Israel values the green landscape of rangelands in the mesic Mediterranean climate region and is willing to pay for preserving it in light of the expected increasing aridity in this region. Their valuation of the landscape is higher than that of the grazing services these rangelands provide for livestock growers. These results stem from a Space-for-Time approach with which we were able to measure changes in biomass production and rainfall at four experimental sites along an aridity gradient. (author)

  5. Digital Cellular Solid Pressure Vessels: A Novel Approach for Human Habitation in Space

    Cellucci, Daniel; Jenett, Benjamin; Cheung, Kenneth C.


    It is widely assumed that human exploration beyond Earth's orbit will require vehicles capable of providing long duration habitats that simulate an Earth-like environment - consistent artificial gravity, breathable atmosphere, and sufficient living space - while requiring the minimum possible launch mass. This paper examines how the qualities of digital cellular solids - high performance, repairability, reconfigurability, tunable mechanical response - allow the accomplishment of long-duration habitat objectives at a fraction of the mass required for traditional structural technologies. To illustrate the impact digital cellular solids could make as a replacement to conventional habitat subsystems, we compare recent proposed deep space habitat structural systems with a digital cellular solids pressure vessel design that consists of a carbon fiber reinforced polymer (CFRP) digital cellular solid cylindrical framework that is lined with an ultra-high molecular weight polyethylene (UHMWPE) skin. We use the analytical treatment of a linear specific modulus scaling cellular solid to find the minimum mass pressure vessel for a structure and find that, for equivalent habitable volume and appropriate safety factors, the use of digital cellular solids provides clear methods for producing structures that are not only repairable and reconfigurable, but also higher performance than their conventionally manufactured counterparts.

  6. Trajectory approach to dissipative quantum phase space dynamics: Application to barrier scattering

    Hughes, Keith H.; Wyatt, Robert E.


    The Caldeira-Leggett master equation, expressed in Lindblad form, has been used in the numerical study of the effect of a thermal environment on the dynamics of the scattering of a wave packet from a repulsive Eckart barrier. The dynamics are studied in terms of phase space trajectories associated with the distribution function, W(q,p,t). The equations of motion for the trajectories include quantum terms that introduce nonlocality into the motion, which imply that an ensemble of correlated trajectories needs to be propagated. However, use of the derivative propagation method (DPM) allows each trajectory to be propagated individually. This is achieved by deriving equations of motion for the partial derivatives of W(q,p,t) that appear in the master equation. The effects of dissipation on the trajectories are studied and results are shown for the transmission probability. On short time scales, decoherence is demonstrated by a swelling of trajectories into momentum space. For a nondissipative system, a comparison is made of the DPM with the 'exact' transmission probability calculated from a fixed grid calculation

  7. Linguacultural space “Man-Nature” in literary texts: cognitive and pragmatic approach

    Eldarova Ruzanna Alievna


    Full Text Available The magnitude of representation of nature images, and their links to the author's mind, the hero, and the reader, can be considered in literary texts as one of the most important sources for identifying the parameters of the national picture of the world and the individual author's transformation of its components. Studies that identify patterns in how linguacultural spaces function in texts can yield new results projected onto the linguistic picture of the world of an ethnic group, owing to the reflection in literary texts of archetypal, stereotyped images peculiar to a linguistic culture and ethnic group as a whole, as well as individual authorial images that characterize a particular linguistic identity and its conception of the world. The cognitive paradigm of modern linguistics, anthropocentric in nature, allows culture to be considered as a process modeled by language, which naturally highlights the problem of linguaculturally predetermined value. Of great importance in this regard is the concept of space as a linguocultural cognitive model of objective reality. The cognitive-pragmatic potential of a literary text is deepened by the introduction of descriptions of nature, since they always realize the ethical, aesthetic, and intellectual abilities of the creative subject.

  8. Semi-classical approaches to the phase space evolutions in intermediate energy heavy ion collisions

    Remaud, B; Sebille, F; Raffray, Y; Gregoire, C; Vinet, L


    The properties of semi-classical phase space evolution equations, such as the Vlasov/Boltzmann equations, are discussed in the context of heavy-ion reaction theory at intermediate energies (from 10 to 100 MeV per nucleon). The generalized coherent state set is shown to form an (over)complete basis for the phase space; every solution of the Vlasov/Boltzmann equations can then be defined as a convolution product of the generalized coherent state basis with an appropriate weight function w. The uniform approximation for w is shown to provide an accurate semi-classical description of fermion systems in their ground state: the examples of fermions in a harmonic well and of cold nuclei are discussed. Solving the Vlasov equation amounts to following the time evolution of the coherent states, which play the role of a moving basis. For the Boltzmann equation, the collision term is taken into account by explicit or implicit variations of the function w. Typical applications are discussed: the nuclear response to giant monopole resonance excitation, and fast nucleon emission in heavy-ion reactions. (orig.)

  9. An intelligent hybrid scheme for optimizing parking space: A Tabu metaphor and rough set based approach

    Soumya Banerjee


    Full Text Available Congested roads, high traffic, and parking problems are major concerns for any modern city planning. Congestion of on-street spaces in official neighborhoods may give rise to inappropriate parking in office and shopping mall complexes during the peak time of official transactions. This paper proposes an intelligent and optimized scheme to solve the parking space problem for a small city (e.g., Mauritius) using a reactive search technique (named Tabu Search) assisted by rough set theory. Rough sets are used for the extraction of uncertain rules that exist in the databases of parking situations. The inclusion of rough set theory provides the accuracy and roughness measures used to characterize the uncertainty of the parking lot. Approximation accuracy is employed to depict the accuracy of a rough classification [1] according to different dynamic parking scenarios. As such, the proposed hybrid metaphor, comprising Tabu Search and rough sets, could provide substantial research directions for other similar hard optimization problems.
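
The Tabu Search half of the scheme can be sketched on a toy parking-assignment problem: a best-admissible swap move at each step, a short-term tabu list, and the classic aspiration criterion. The instance, tenure, and cost matrix below are invented for illustration (the paper couples the search with rough-set rule extraction, omitted here):

```python
# Minimal Tabu Search over permutations (car -> slot assignments).
import random

def tabu_search(dist, iters=200, tenure=5, seed=1):
    """dist[i][j]: cost of assigning car i to slot j; returns (assignment, cost)."""
    rng = random.Random(seed)
    n = len(dist)
    sol = list(range(n))
    rng.shuffle(sol)                          # random initial assignment
    cost = lambda s: sum(dist[i][s[i]] for i in range(n))
    best, best_cost = sol[:], cost(sol)
    tabu = {}                                 # swap move -> iteration it expires
    for it in range(iters):
        move, move_cost = None, float("inf")
        for i in range(n):
            for j in range(i + 1, n):
                sol[i], sol[j] = sol[j], sol[i]
                c = cost(sol)
                sol[i], sol[j] = sol[j], sol[i]
                # Admissible if not tabu, or if it beats the global best
                # (the aspiration criterion).
                if (tabu.get((i, j), -1) < it or c < best_cost) and c < move_cost:
                    move, move_cost = (i, j), c
        if move is None:
            break
        i, j = move
        sol[i], sol[j] = sol[j], sol[i]       # take the best admissible move
        tabu[(i, j)] = it + tenure
        if move_cost < best_cost:
            best, best_cost = sol[:], move_cost
    return best, best_cost

# Toy instance: slot j suits car i better the closer j is to i.
dist = [[abs(i - j) + 1 for j in range(6)] for i in range(6)]
best, c = tabu_search(dist)
print(c)  # the identity assignment is optimal here, total cost 6
```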

  10. When Distance Matters: Perceptual Bias and Behavioral Response for Approaching Sounds in Peripersonal and Extrapersonal Space

    Camponogara, I.; Komeilipoor, N.; Cesari, P.


    Studies on sound perception show a tendency to overestimate the distance of an approaching sound source, leading to a faster reaction time compared to a receding sound source. Nevertheless, it is unclear whether motor preparation and execution change according to the perceived sound direction and

  11. Sampling Practices and Social Spaces: Exploring a Hip-Hop Approach to Higher Education

    Petchauer, Emery


    Much more than a musical genre, hip-hop culture exists as an animating force in the lives of many young adults. This article looks beyond the moral concerns often associated with rap music to explore how hip-hop as a larger set of expressions and practices implicates the educational experiences, activities, and approaches for students. The article…

  12. Rapid space trajectory generation using a Fourier series shape-based approach

    Taheri, Ehsan

    With the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipment. In the field of space trajectory optimization, fundamental advances in using low-thrust propulsion and exploiting multi-body dynamics have played a pivotal role in designing efficient space mission trajectories. The former provides larger cumulative momentum change in comparison with conventional chemical propulsion, whereas the latter results in almost ballistic trajectories with a negligible amount of propellant. However, the problem of space trajectory design translates into an optimal control problem which is, in general, time-consuming and very difficult to solve. The goal of this thesis is therefore to address the above problem by developing a methodology to simplify and facilitate the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution will not only provide mission designers with a better understanding of the problem and solution, but also serves as a good initial guess for high-fidelity optimal control solvers and increases their convergence rate. Almost all high-fidelity solvers benefit from an initial guess that already satisfies the equations of motion and some of the most important constraints. Despite the nonlinear nature of the problem, a robust technique is sought for a wide range of typical low-thrust transfers with reduced computational intensity. Another important aspect of the developed methodology is the representation of low-thrust trajectories by Fourier series, with which the number of design variables reduces significantly. Emphasis is given to simplifying the equations of motion to the extent possible and avoiding approximating the controls. These facts contribute to speeding up the solution-finding procedure.
Several example
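
The core idea of the shape-based representation is that a trajectory coordinate becomes a truncated Fourier series, so only a handful of coefficients serve as design variables. A hedged sketch with arbitrary illustrative coefficients (not from the thesis):

```python
# Evaluate a truncated Fourier series: a0/2 + sum_n [a_n cos(nwt) + b_n sin(nwt)].
import math

def fourier_series(t, period, a0, a, b):
    """Truncated Fourier series with fundamental angular frequency 2*pi/period."""
    w = 2.0 * math.pi / period
    return a0 / 2.0 + sum(
        an * math.cos(n * w * t) + bn * math.sin(n * w * t)
        for n, (an, bn) in enumerate(zip(a, b), start=1)
    )

# A radius-like trajectory profile r(t) described by just five numbers:
r = [fourier_series(t, 10.0, a0=2.0, a=[0.3, 0.05], b=[0.1, 0.0])
     for t in (0.0, 2.5, 5.0)]
```

A trajectory optimizer would then tune the coefficient vectors `a` and `b` (and `a0`) instead of a densely discretized path, which is where the reduction in design variables comes from.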

  13. Quantifying multi-dimensional functional trait spaces of trees: empirical versus theoretical approaches

    Ogle, K.; Fell, M.; Barber, J. J.


    Empirical, field studies of plant functional traits have revealed important trade-offs among pairs or triplets of traits, such as the leaf (LES) and wood (WES) economics spectra. Trade-offs include correlations between leaf longevity (LL) vs specific leaf area (SLA), LL vs mass-specific leaf respiration rate (RmL), SLA vs RmL, and resistance to breakage vs wood density. Ordination analyses (e.g., PCA) show groupings of traits that tend to align with different life-history strategies or taxonomic groups. It is unclear, however, what underlies such trade-offs and emergent spectra. Do they arise from inherent physiological constraints on growth, or are they more reflective of environmental filtering? The relative importance of these mechanisms has implications for predicting biogeochemical cycling, which is influenced by trait distributions of the plant community. We address this question using an individual-based model of tree growth (ACGCA) to quantify the theoretical trait space of trees that emerges from physiological constraints. ACGCA's inputs include 32 physiological, anatomical, and allometric traits, many of which are related to the LES and WES. We fit ACGCA to 1.6 million USFS FIA observations of tree diameters and heights to obtain vectors of trait values that produce realistic growth, and we explored the structure of this trait space. No notable correlations emerged among the 496 trait pairs, but stepwise regressions revealed complicated multi-variate structure: e.g., relationships between pairs of traits (e.g., RmL and SLA) are governed by other traits (e.g., LL, radiation-use efficiency [RUE]). We also simulated growth under various canopy gap scenarios that impose varying degrees of environmental filtering to explore the multi-dimensional trait space (hypervolume) of trees that died vs survived. The centroid and volume of the hypervolumes differed among dead and live trees, especially under gap conditions leading to low mortality. 
Traits most predictive

  14. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    Hagerty, S.; Ellis, H., Jr.


    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization

  15. Uncertainty evaluation for IIR (infinite impulse response) filtering using a state-space approach

    Link, Alfred; Elster, Clemens


    A novel method is proposed for evaluating the uncertainty associated with the output of a discrete-time IIR filter when the input signal is corrupted by additive noise and the filter coefficients are uncertain. This task arises, for instance, when the noise-corrupted output of a measurement system is compensated by a digital filter which has been designed on the basis of the characteristics of the measurement system. We assume that the noise is either stationary or uncorrelated, and we presume knowledge about its autocovariance function or its time-dependent variances, respectively. Uncertainty evaluation is considered in line with the 'Guide to the Expression of Uncertainty in Measurement'. A state-space representation is used to derive a calculation scheme which allows the uncertainties to be evaluated in an easy way and also enables real-time applications. The proposed procedure is illustrated by an example
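    A minimal numerical sketch of this kind of propagation (not the authors' exact scheme): for an illustrative first-order IIR filter driven by uncorrelated input noise of known variance, the output uncertainty follows from a state-space covariance recursion. The filter coefficients and noise variance below are assumed values, not taken from the paper.

```python
import numpy as np

# Illustrative first-order IIR filter y[k] = b0*u[k] + b1*u[k-1] - a1*y[k-1],
# written in state-space form x[k+1] = A x[k] + B u[k], y[k] = C x[k] + D u[k].
b0, b1, a1 = 0.5, 0.5, -0.3          # assumed filter coefficients
A = np.array([[-a1]])
B = np.array([[b1 - a1 * b0]])
C = np.array([[1.0]])
D = b0

sigma2 = 0.01                        # variance of the uncorrelated input noise
P = np.zeros((1, 1))                 # state covariance
var_y = []
for k in range(200):
    # x[k] depends only on past inputs, so it is uncorrelated with u[k]
    var_y.append((C @ P @ C.T)[0, 0] + D**2 * sigma2)
    P = A @ P @ A.T + sigma2 * (B @ B.T)

print(round(var_y[-1], 5))           # recursion settles at the steady-state variance
```

Because the state depends only on past inputs, the state/input cross-covariance vanishes and the recursion stays simple; stationary correlated noise, as treated in the paper, would add cross terms.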

  16. Six-point remainder function in multi-Regge-kinematics: an efficient approach in momentum space

    Broedel, Johannes [Institut für Theoretische Physik, Eidgenössische Technische Hochschule Zürich,Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany); Sprenger, Martin [Institut für Theoretische Physik, Eidgenössische Technische Hochschule Zürich,Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)


    Starting from the known all-order expressions for the BFKL eigenvalue and impact factor, we establish a formalism allowing the direct calculation of the six-point remainder function in N=4 super-Yang-Mills theory in momentum space to all orders in perturbation theory, at least in principle. Based upon identities which recursively relate different integrals contributing to the inverse Fourier-Mellin transform, the formalism gives easy access to the full remainder function in multi-Regge kinematics up to 7 loops, and up to 10 loops at the fourth logarithmic order. Using the formalism, we prove the all-loop formula for the leading logarithmic approximation proposed by Pennington and investigate the behavior of several newly calculated functions.

  17. Space-group approach to two-electron states in unconventional superconductors

    Yarzhemsky, V. G.


    Direct application of space-group representation theory makes it possible to obtain limitations on the symmetry of the superconducting order parameter (SOP) on lines and planes of symmetry in the one-electron Brillouin zone. In the case of highly symmetric UPt3, only the theoretical nodal structure of the irreducible representation (IR) E2u is in agreement with all the experimental results. On the other hand, in the case of high-Tc superconductors, a two-electron description of Cooper pairs in D2h symmetry is not sufficient to describe the experimental nodal structure. It was shown that in this case the nodal structure results from underlying interactions between two-electron states and a hidden D4h symmetry. (author)

  18. A computational approach to extinction events in chemical reaction networks with discrete state spaces.

    Johnston, Matthew D


    Recent work of Johnston et al. has produced sufficient conditions on the structure of a chemical reaction network which guarantee that the corresponding discrete state space system exhibits an extinction event. The conditions consist of a series of systems of equalities and inequalities on the edges of a modified reaction network called a domination-expanded reaction network. In this paper, we present a computational implementation of these conditions, written in Python, and apply the program to examples drawn from the biochemical literature. We also run the program on 458 models from the European Bioinformatics Institute's BioModels Database and report our results. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. A Year of Progress: NASA's Space Launch System Approaches Critical Design Review

    Askins, Bruce; Robinson, Kimberly


    NASA's Space Launch System (SLS) made significant progress on the manufacturing floor and on the test stand in 2014 and positioned itself for a successful Critical Design Review in mid-2015. SLS, the world's only exploration-class heavy lift rocket, has the capability to dramatically increase the mass and volume of human and robotic exploration. Additionally, it will decrease overall mission risk, increase safety, and simplify ground and mission operations - all significant considerations for crewed missions and unique high-value national payloads. Development now is focused on a configuration with 70 metric tons (t) of payload to low Earth orbit (LEO), more than double the payload of the retired Space Shuttle program or current operational vehicles. This "Block 1" design will launch NASA's Orion Multi-Purpose Crew Vehicle (MPCV) on an uncrewed flight beyond the Moon and back and the first crewed flight around the Moon. The current design has a direct evolutionary path to a vehicle with a 130t lift capability that offers even more flexibility to reduce planetary trip times, simplify payload design cycles, and provide new capabilities such as planetary sample returns. Every major element of SLS has successfully completed its Critical Design Review and now has hardware in production or testing. In fact, the SLS MPCV-to-Stage-Adapter (MSA) flew successfully on the Exploration Flight Test (EFT) 1 launch of a Delta IV and Orion spacecraft in December 2014. The SLS Program is currently working toward vehicle Critical Design Review in mid-2015. This paper will discuss these and other technical and programmatic successes and challenges over the past year and provide a preview of work ahead before the first flight of this new capability.

  20. Global height datum unification: a new approach in gravity potential space

    Ardalan, A. A.; Safari, A.


    The problem of “global height datum unification” is solved in the gravity potential space based on: (1) high-resolution local gravity field modeling, (2) geocentric coordinates of the reference benchmark, and (3) a known value of the geoid’s potential. The high-resolution local gravity field model is derived based on a solution of the fixed-free two-boundary-value problem of the Earth’s gravity field using (a) potential difference values (from precise leveling), (b) modulus of the gravity vector (from gravimetry), (c) astronomical longitude and latitude (from geodetic astronomy and/or a combination of Global Navigation Satellite System (GNSS) observations with total station measurements), and (d) satellite altimetry. Knowing the height of the reference benchmark in the national height system and its geocentric GNSS coordinates, and using the derived high-resolution local gravity field model, the gravity potential value of the zero point of the height system is computed. The difference between the derived gravity potential value of the zero point of the height system and the geoid’s potential value is computed. This potential difference gives the offset of the zero point of the height system from the geoid in the “potential space”, which is transferred into “geometry space” using the transformation formula derived in this paper. The method was applied to the computation of the offset of the zero point of the Iranian height datum from the geoid’s potential value W0 = 62636855.8 m2/s2. According to the geometry space computations, the height datum of Iran is 0.09 m below the geoid.
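    The final transfer from potential space to geometry space is, to first order, a division by gravity. A toy sketch (the paper's transformation formula is more complete, and the potential offset below is hypothetical, chosen only to reproduce the order of magnitude reported for Iran):

```python
# Illustrative only: first-order transfer of a potential-space offset into a
# metric offset, dH ~= dW / g.  The paper derives a more complete formula.
W0 = 62636855.8      # geoid potential used in the paper (m^2/s^2)
g = 9.80             # representative surface gravity (m/s^2), assumed
dW = -0.88           # hypothetical potential offset of the datum zero point (m^2/s^2)

dH = dW / g          # metric offset; negative means the datum lies below the geoid
print(round(dH, 3))  # about -0.09 m, the order of the offset reported for Iran
```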

  1. An Integrated Approach to Parameter Learning in Infinite-Dimensional Space

    Boyd, Zachary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wendelberger, Joanne Roth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The availability of sophisticated modern physics codes has greatly extended the ability of domain scientists to understand the processes underlying their observations of complicated phenomena, but it has also introduced the curse of dimensionality via the many user-set parameters available to tune. Many of these parameters are naturally expressed as functional data, such as initial temperature distributions, equations of state, and controls. Thus, when attempting to find parameters that match observed data, being able to navigate parameter-space becomes highly non-trivial, especially considering that accurate simulations can be expensive both in terms of time and money. Existing solutions include batch-parallel simulations, high-dimensional, derivative-free optimization, and expert guessing, all of which make some contribution to solving the problem but do not completely resolve the issue. In this work, we explore the possibility of coupling together all three of the techniques just described by designing user-guided, batch-parallel optimization schemes. Our motivating example is a neutron diffusion partial differential equation where the time-varying multiplication factor serves as the unknown control parameter to be learned. We find that a simple, batch-parallelizable, random-walk scheme is able to make some progress on the problem but does not by itself produce satisfactory results. After reducing the dimensionality of the problem using functional principal component analysis (fPCA), we are able to track the progress of the solver in a visually simple way as well as viewing the associated principal components. This allows a human to make reasonable guesses about which points in the state space the random walker should try next. Thus, by combining the random walker's ability to find descent directions with the human's understanding of the underlying physics, it is possible to use expensive simulations more efficiently and more quickly arrive at the
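    The dimensionality-reduction step can be illustrated with ordinary PCA on discretized functions, a common finite-sample approximation to fPCA; the synthetic control functions below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
# 50 sampled control functions built from two underlying modes plus small noise
modes = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coeffs = rng.normal(size=(50, 2))
X = coeffs @ modes + 0.01 * rng.normal(size=(50, 100))

# PCA via the SVD of the centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# Two components capture nearly all the variance, so the random walker can
# operate in a 2-D coefficient space instead of the 100-D function space.
print(explained[:2].sum() > 0.99)
```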

  2. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
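    The flavor of such indexing can be sketched with a simple random-pivot scheme that prunes candidates via the triangle inequality; this is an illustration of the general idea, not necessarily the authors' exact randomized algorithms:

```python
import random

def nearest(query_dist, pairwise, n, rng):
    """Nearest-neighbour search that prunes candidates with the triangle
    inequality so that only a few expensive query distances are computed.

    query_dist(i) -> distance from the query to object i (the costly call).
    pairwise[i][j] -> precomputed distance between database objects i and j.
    """
    candidates = set(range(n))
    best, best_d, calls = None, float("inf"), 0
    while candidates:
        p = rng.choice(sorted(candidates))   # random pivot among survivors
        dp = query_dist(p)
        calls += 1
        if dp < best_d:
            best, best_d = p, dp
        # triangle inequality: d(q,o) >= |d(o,p) - d(q,p)|, so any o with
        # |d(o,p) - dp| >= best_d cannot beat the current best
        candidates = {o for o in candidates
                      if o != p and abs(pairwise[o][p] - dp) < best_d}
    return best, best_d, calls

# Toy metric space: points on a line, distance = absolute difference
objects = [float(i) for i in range(100)]
pairwise = [[abs(a - b) for b in objects] for a in objects]
best, best_d, calls = nearest(lambda i: abs(objects[i] - 31.2),
                              pairwise, len(objects), random.Random(7))
print(best)   # the nearest object; far fewer than n query distances are typical
```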

  3. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    Cenek, Martin; Dahl, Spencer K.


    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
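    The recorded behavior-change probabilities can be illustrated with a toy transition-matrix estimate over cluster labels; the label sequences below are invented for illustration, standing in for the behavior patterns the framework identifies:

```python
import numpy as np

# Hypothetical behaviour labels over time for three agents; 0, 1, 2 stand in
# for behaviour patterns found by clustering an ABM execution.
sequences = [
    [0, 0, 1, 1, 1, 2],
    [0, 1, 1, 2, 2, 2],
    [0, 0, 0, 1, 2, 2],
]

k = 3
counts = np.zeros((k, k))
for seq in sequences:
    for a, b in zip(seq, seq[1:]):   # count consecutive behaviour transitions
        counts[a, b] += 1

# Row-normalise to get the probability of moving between behaviour patterns
transition = counts / counts.sum(axis=1, keepdims=True)
print(np.round(transition, 2))
```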

  4. Approaches to veterinary education--tracking versus a final year broad clinical experience. Part one: effects on career outcome.

    Klosterman, E S; Kass, P H; Walsh, D A


    This is the first of two papers that provide extensive data and analysis on the two major approaches to clinical veterinary education, which either provide students with experience of a broad range of species (often defined as omni/general clinical competence), or just a few species (sometimes just one), usually termed 'tracking'. Together the two papers provide a detailed analysis of these two approaches for the first time. The responsibilities of veterinary medicine and veterinary education are rapidly increasing throughout the globe. It is critical for all in veterinary education to reassess the approaches that have been used, and evaluate on a school-by-school basis which may best meet its expanding and ever-deepening responsibilities.

  5. An Innovative Approach to Balancing Chemical-Reaction Equations: A Simplified Matrix-Inversion Technique for Determining The Matrix Null Space

    Thorne, Lawrence R.


    I propose a novel approach to balancing equations that is applicable to all chemical-reaction equations; it is readily accessible to students via scientific calculators and basic computer spreadsheets that have a matrix-inversion application. The new approach utilizes the familiar matrix-inversion operation in an unfamiliar and innovative way; its purpose is not to identify undetermined coefficients as usual, but, instead, to compute a matrix null space (or matrix kernel). The null space then...
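    A sketch of the null-space idea, using the SVD rather than the article's matrix-inversion trick; the reaction is the standard methane-combustion example, not one taken from the article:

```python
import numpy as np

# Balance  a*CH4 + b*O2 -> c*CO2 + d*H2O  by finding the null space of the
# element-balance matrix (product species entered with negative signs).
M = np.array([
    [1, 0, -1,  0],   # carbon
    [4, 0,  0, -2],   # hydrogen
    [0, 2, -2, -1],   # oxygen
], dtype=float)

# The null space is spanned by the right-singular vector belonging to the
# zero singular value, i.e. the last row of Vt.
_, _, Vt = np.linalg.svd(M)
v = Vt[-1]
coeffs = v / v[0]     # scale so the first coefficient is 1

print(np.round(coeffs).astype(int))   # -> [1 2 1 2]: CH4 + 2 O2 -> CO2 + 2 H2O
```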

  6. Uncertainty Quantification for Complex RF-structures Using the State-space Concatenation Approach

    Heller, Johann; Schmidt, Christian; Van Rienen, Ursula


    as well as to employ robust optimizations, a so-called uncertainty quantification (UQ) is applied. For large and complex structures such computations are heavily demanding and cannot be carried out using standard brute-force approaches. In this paper, we propose a combination of established techniques to perform UQ for long and complex structures, where the uncertainty is located only in parts of the structure. As exemplary structure, we investigate the third-harmonic cavity, which is being used at the FLASH accelerator at DESY, assuming an uncertain...

  7. Real-Space Analysis of Scanning Tunneling Microscopy Topography Datasets Using Sparse Modeling Approach

    Miyama, Masamichi J.; Hukushima, Koji


    A sparse modeling approach is proposed for analyzing scanning tunneling microscopy topography data, which contain numerous peaks originating from the electron density of surface atoms and/or impurities. The method, based on the relevance vector machine with L1 regularization and k-means clustering, enables separation of the peaks and peak center positioning with accuracy beyond the resolution of the measurement grid. The validity and efficiency of the proposed method are demonstrated using synthetic data in comparison with the conventional least-squares method. An application of the proposed method to experimental data of a metallic oxide thin-film clearly indicates the existence of defects and corresponding local lattice distortions.
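    The k-means stage can be sketched on synthetic data; this toy stand-in clusters candidate peak pixels around two centers, whereas the actual method couples clustering with an L1-regularized relevance vector machine:

```python
import numpy as np

rng = np.random.default_rng(3)
# Candidate peak pixels scattered around two true centres (a toy stand-in for
# STM topography peaks detected on a measurement grid)
true_centers = np.array([[2.0, 2.0], [7.0, 6.0]])
pts = np.vstack([c + 0.2 * rng.normal(size=(40, 2)) for c in true_centers])

# Plain k-means with k = 2, initialised from one point of each group
centers = pts[[0, -1]].astype(float)
for _ in range(20):
    # assign each point to its nearest centre, then recompute the centres
    labels = np.argmin(((pts[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([pts[labels == i].mean(axis=0) for i in range(2)])

# The cluster means approximate the peak centres to sub-grid accuracy
print(np.round(centers[np.argsort(centers[:, 0])], 1))
```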

  8. Investigation of tt-bar in the full hadronic final state at CDF with a neural network approach

    Sidoti, A.; Azzi, P.; Busetto, G.; Castro, A.; Dusini, S.; Lazzizzera, I.; Wyss, J.L.


    In this work we present the results of a neural network (NN) approach to the measurement of the tt-bar production cross-section and top mass in the all-hadronic channel, analyzing data collected at the Collider Detector at Fermilab (CDF) experiment. We have used a hardware implementation of a feed forward neural network, TOTEM, the product of a collaboration of INFN (Istituto Nazionale Fisica Nucleare) - IRST (Istituto per la Ricerca Scientifica e Tecnologica) - University of Trento, Italy. Particular attention has been paid to the evaluation of the systematics specifically related to the NN approach. The results are consistent with those obtained at CDF by conventional data selection techniques

  9. New approach for measuring 3D space by using Advanced SURF Algorithm

    Youm, Minkyo; Min, Byungil; Suh, Kyungsuk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Backgeun [Sungkyunkwan Univ., Suwon (Korea, Republic of)


    Nuclear disasters create more extreme conditions for analysis and evaluation than natural disasters. In this paper, 3D-space measurement and modeling from simple photographs were studied for the case of a small sand dune. The suggested method can be used for the acquisition of spatial information by a robot in a disaster area; such data help identify the damaged parts, assess the degree of damage, and determine recovery sequences. In this study we improve a computer vision algorithm for 3D geospatial information measurement and confirm it by test. First, we obtain a noticeable improvement in 3D geospatial results by combining the SURF algorithm with photogrammetric surveying. Second, we confirm that epipolar-line filtering not only decreases the algorithm's running time but also increases the number of matching points. From the study, we extract a 3D model with an open-source algorithm and delete mismatched points by filtering. However, owing to the characteristics of the SURF algorithm, matching points cannot be found where a structure lacks strong features, so further study of feature detection in such cases is needed.

  10. Dimensions of design space: a decision-theoretic approach to optimal research design.

    Conti, Stefano; Claxton, Karl


    Bayesian decision theory can be used not only to establish the optimal sample size and its allocation in a single clinical study but also to identify an optimal portfolio of research combining different types of study design. Within a single study, the highest societal payoff to proposed research is achieved when its sample sizes and allocation between available treatment options are chosen to maximize the expected net benefit of sampling (ENBS). Where a number of different types of study informing different parameters in the decision problem could be conducted, the simultaneous estimation of ENBS across all dimensions of the design space is required to identify the optimal sample sizes and allocations within such a research portfolio. This is illustrated through a simple example of a decision model of zanamivir for the treatment of influenza. The possible study designs include: 1) a single trial of all the parameters, 2) a clinical trial providing evidence only on clinical endpoints, 3) an epidemiological study of natural history of disease, and 4) a survey of quality of life. The possible combinations, sample sizes, and allocation between trial arms are evaluated over a range of cost-effectiveness thresholds. The computational challenges are addressed by implementing optimization algorithms to search the ENBS surface more efficiently over such large dimensions.

  11. Impacts of the 2011 Tohoku earthquake on electricity demand in Japan. State space approach

    Honjo, Keita; Ashina, Shuichi


    Some papers report that consumers' electricity saving behavior (Setsuden) after the 2011 Tohoku Earthquake resulted in the reduction of the domestic electricity demand. However, time variation of the electricity saving effect (ESE) has not yet been sufficiently investigated. In this study, we develop a state space model of monthly electricity demand using long-term data, and estimate time variation of the ESE. We also estimate time variation of CO2 emissions caused by Setsuden. Our result clearly indicates that Setsuden after the earthquake was not temporary but became established as a habit. Between March 2011 and October 2015, the ESE on power demand ranged from 2.9% to 6.9%, and the ESE on light demand ranged from 2.6% to 9.0%. The ESE on the total electricity demand was 3.2%-7.5%. Setsuden also contributed to the reduction of CO2 emissions, but it could not offset the emissions increase caused by the shutdown of nuclear power plants. (author)
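    A state-space treatment of demand data can be sketched with a minimal local-level model and a scalar Kalman filter. The series, variances, and the 5% level shift below are invented, and the paper's model is considerably richer (seasonal terms, long-term data, intervention effects):

```python
import numpy as np

# Local-level state-space model: y[k] = mu[k] + eps,  mu[k+1] = mu[k] + eta.
# A persistent downward shift in the level plays the role of Setsuden.
rng = np.random.default_rng(1)
true_level = np.concatenate([np.full(60, 100.0), np.full(60, 95.0)])  # 5% drop
y = true_level + rng.normal(0.0, 1.0, size=120)

q, r = 0.1, 1.0            # state and observation noise variances (assumed)
mu, P = y[0], 1.0
est = []
for obs in y:
    P += q                 # predict: level may drift
    K = P / (P + r)        # Kalman gain
    mu += K * (obs - mu)   # update with the new observation
    P *= (1.0 - K)
    est.append(mu)

print(round(est[-1], 1))   # filtered level tracks the post-shift value near 95
```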

  12. Structural health monitoring using DOG multi-scale space: an approach for analyzing damage characteristics

    Guo, Tian; Xu, Zili


    Measurement noise is inevitable in practice; thus, it is difficult to identify defects, cracks or damage in a structure while suppressing noise simultaneously. In this work, a novel method is introduced to detect multiple damage in noisy environments. Based on multi-scale space analysis for discrete signals, a method for extracting damage characteristics from the measured displacement mode shape is illustrated. Moreover, the proposed method incorporates a data fusion algorithm to further eliminate measurement noise-based interference. The effectiveness of the method is verified by numerical and experimental methods applied to different structural types. The results demonstrate that there are two advantages to the proposed method. First, damage features are extracted by the difference of the multi-scale representation; this step is taken such that the interference of noise amplification can be avoided. Second, a data fusion technique applied to the proposed method provides a global decision, which retains the damage features while maximally eliminating the uncertainty. Monte Carlo simulations are utilized to validate that the proposed method has a higher accuracy in damage detection.
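    The difference-of-Gaussians (DOG) idea behind such multi-scale damage localization can be sketched on a synthetic mode shape; the beam mode, anomaly size, and scales below are assumptions for illustration:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 500)
mode = np.sin(np.pi * x)            # first bending mode of a beam (assumed)
mode[250:260] -= 0.01               # small local anomaly standing in for damage

def gauss_smooth(sig, sigma):
    """One level of the discrete Gaussian scale space."""
    t = np.arange(-4 * sigma, 4 * sigma + 1)
    g = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    return np.convolve(sig, g / g.sum(), mode='same')

# The DOG acts as a band-pass: the smooth global mode largely cancels while
# the localized anomaly survives.  Boundary samples are skipped because the
# 'same' convolution zero-pads at the ends.
dog = gauss_smooth(mode, 2) - gauss_smooth(mode, 6)
i = 30 + int(np.argmax(np.abs(dog[30:-30])))
print(i)                            # index inside the damaged zone near 250
```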

  13. Exploring Psychological and Aesthetic Approaches of Bio-Retention Facilities in the Urban Open Space

    Suyeon Kim


    Over the last decades, a number of bio-retention facilities have been installed in urban areas for flood control and green amenity purposes. As urban amenity facilities for citizens, bio-retentions have a lot of potential; however, the literature on bio-retentions has focused mostly on physiochemical aspects like water quality and runoff. Hence, this paper aims to explore psychological aspects of bio-retentions, such as visitors' perceptions and landscape aesthetic value. To achieve this purpose, the study employed on-site interviews and questionnaires at three case-study locations of bio-retention facilities. Surveys of 100 bio-retention users were conducted, investigating their general perceptions and the landscape aesthetics of the bio-retention facilities. The paper found that only 34% of the interviewees recognised bio-retention facilities, illustrating that most visitors were not aware of such facilities and were unable to distinguish the differences between bio-retention and conventional gardens. On the other hand, the majority of interviewees strongly supported the concept and function of bio-retentions, especially those who recognised the differences in planting species from conventional urban open spaces. These findings encourage further studies seeking quantitative values by conducting a correlation analysis between the functions and aesthetics of bio-retention facilities.

  14. Cellular Spacing Selection During the Directional Solidification of Binary Alloys. A Numerical Approach

    Catalina, Adrian V.; Sen, S.; Rose, M. Franklin (Technical Monitor)


    The evolution of cellular solid/liquid interfaces from an initially unstable planar front was studied by means of a two-dimensional computer simulation. The developed numerical model makes use of an interface tracking procedure and has the capability to describe the dynamics of the interface morphology based on local changes of the thermodynamic conditions. The fundamental physics of this formulation was validated against experimental microgravity results and the predictions of the analytical linear stability theory. The performed simulations revealed that, under certain conditions and through a competitive growth mechanism, an interface can become unstable to random perturbations of infinitesimal amplitude even at wavelengths smaller than the neutral wavelength, lambda(sub c), predicted by the linear stability theory. Furthermore, two main stages of spacing selection were identified. In the first stage, at low perturbation amplitudes, the selection mechanism is driven by the maximum growth rate of instabilities, while in the second stage the selection is influenced by nonlinear phenomena caused by interactions between neighboring cells. A comparison of these predictions with other existing theories of pattern formation and with experimental results is discussed.

  15. New approach to 3-D, high sensitivity, high mass resolution space plasma composition measurements

    McComas, D.J.; Nordholt, J.E.


    This paper describes a new type of 3-D space plasma composition analyzer. The design combines high sensitivity, high mass resolution measurements with somewhat lower mass resolution but even higher sensitivity measurements in a single compact and robust design. While the lower resolution plasma measurements are achieved using conventional straight-through time-of-flight mass spectrometry, the high mass resolution measurements are made by timing ions reflected in a linear electric field (LEF), where the restoring force that an ion experiences is proportional to the depth it travels into the LEF region. Consequently, the ion's equation of motion in that dimension is that of a simple harmonic oscillator and its travel time is simply proportional to the square root of the ion's mass/charge (m/q). While in an ideal LEF, the m/q resolution can be arbitrarily high, in a real device the resolution is limited by the field linearity which can be achieved. In this paper we describe how a nearly linear field can be produced and discuss how the design can be optimized for various different plasma regimes and spacecraft configurations
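    The harmonic-oscillator timing argument is easy to verify numerically; the field constant below is an arbitrary illustrative value, not an instrument parameter from the paper:

```python
import math

# In an ideal linear electric field the restoring force on an ion at depth z
# is F = -q*k*z, so the ion executes half a harmonic oscillation and returns
# after t = pi * sqrt(m / (q*k)), i.e. proportional to sqrt(m/q).
k = 1.0e6           # field constant in V/m^2 (illustrative value)
e = 1.602e-19       # elementary charge (C)
amu = 1.6605e-27    # atomic mass unit (kg)

def reflect_time(mass_amu, charge=1):
    """Reflection time of an ion in the linear electric field region."""
    return math.pi * math.sqrt(mass_amu * amu / (charge * e * k))

t_H, t_O = reflect_time(1), reflect_time(16)
print(round(t_O / t_H, 3))   # -> 4.0, since sqrt(16/1) = 4
```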

  16. A scale space approach for unsupervised feature selection in mass spectra classification for ovarian cancer detection.

    Ceccarelli, Michele; d'Acierno, Antonio; Facchiano, Angelo


    Mass spectrometry spectra, widely used in proteomics studies as a screening tool for protein profiling and to detect discriminatory signals, are high dimensional data. A large number of local maxima (a.k.a. peaks) have to be analyzed as part of computational pipelines aimed at the realization of efficient predictive and screening protocols. With this kind of data dimensions and samples size the risk of over-fitting and selection bias is pervasive. Therefore the development of bio-informatics methods based on unsupervised feature extraction can lead to general tools which can be applied to several fields of predictive proteomics. We propose a method for feature selection and extraction grounded on the theory of multi-scale spaces for high resolution spectra derived from analysis of serum. Then we use support vector machines for classification. In particular we use a database containing 216 samples spectra divided in 115 cancer and 91 control samples. The overall accuracy averaged over a large cross-validation study is 98.18%. The area under the ROC curve of the best selected model is 0.9962. We improved previous known results on the problem on the same data, with the advantage that the proposed method has an unsupervised feature selection phase. All the developed code, as MATLAB scripts, can be downloaded from
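    The benefit of analyzing spectra at coarser scales can be sketched with Gaussian smoothing of a synthetic spectrum; the peak shapes, noise level, and threshold below are assumptions, a toy stand-in for the serum profiles:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 1000)
# Toy spectrum: two true peaks plus additive noise
spec = (np.exp(-(x - 3) ** 2 / 0.01)
        + 0.6 * np.exp(-(x - 7) ** 2 / 0.01)
        + 0.05 * rng.normal(size=x.size))

def smooth(sig, sigma):
    """Gaussian smoothing: one level of the scale space."""
    t = np.arange(-4 * sigma, 4 * sigma + 1)
    g = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    return np.convolve(sig, g / g.sum(), mode='same')

def count_peaks(sig, thresh=0.2):
    """Number of contiguous regions above the threshold."""
    above = sig > thresh
    return int(above[0]) + int(np.sum(above[1:] & ~above[:-1]))

# At a coarse scale the noise is suppressed and the two true peaks remain
print(count_peaks(smooth(spec, 8)))   # -> 2
```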

  17. Strain profiles in ion implanted ceramic polycrystals: An approach based on reciprocal-space crystal selection

    Palancher, H.; Martin, G.; Fouet, J. [CEA, DEN, DEC, F-13108 Saint Paul lez Durance (France); Goudeau, P. [Institut Pprime, CNRS-Université de Poitiers–ENSMA, SP2MI, F-86360 Chasseneuil (France); Boulle, A. [Science des Procédés Céramiques et Traitements de Surface (SPCTS), CNRS UMR 7315, Centre Européen de la Céramique, 12 rue Atlantis, 87068 Limoges (France); Rieutord, F. [CEA, DSM, INAC, F-38054 Grenoble Cedex 9 (France); Favre-Nicolin, V. [Université Grenoble-Alpes, F-38041 Grenoble, France, Institut Universitaire de France, F-75005 Paris (France); Blanc, N. [Institut NEEL, CNRS-Univ Grenoble Alpes, F-38042 Grenoble (France); Onofri, C. [CEA, DEN, DEC, F-13108 Saint Paul lez Durance (France); CEMES, CNRS UPR 8011, 29 rue Jeanne Marvig, BP 94347, 31055 Toulouse Cedex 4 (France)


    The determination of the state of strain in implanted materials is a key issue in the study of their mechanical stability. Whereas this question is nowadays relatively easily solved in the case of single crystals, it remains a challenging task in the case of polycrystalline materials. In this paper, we take advantage of the intense and parallel beams provided by third-generation synchrotron sources, combined with a two-dimensional detection system, to analyze individual grains in polycrystals, hence obtaining “single crystal-like” data. The feasibility of the approach is demonstrated with implanted UO2 polycrystals, where the in-depth strain profile is extracted for individual grains using numerical simulations of the diffracted signal. The influence of the implantation dose is precisely analyzed for several diffracting planes and grains. This work suggests that, at low fluences, the development of strain is mainly due to ballistic effects with little effect from He ions, independently of the crystallographic orientation. At higher fluences, the evolution of the strain profiles suggests a partial and anisotropic plastic relaxation. With the present approach, robust and reliable structural information can be obtained, even from complex polycrystalline ceramic materials.


    The U.S. contingent of the U.S.-German Bilateral Working Group is developing Sustainable Management Approaches and Revitalization Tools-electronic (SMARTe). SMARTe is a web-based, decision support system designed to assist stakeholders in developing and evaluating alternative reu...

  19. A new approach to performance assessment of barriers in a repository. Executive summary, draft, technical appendices. Final report

    Mueller-Hoeppe, N.; Krone, J.; Niehues, N.; Poehler, M.; Raitz von Frentz, R.; Gauglitz, R.


    Multi-barrier systems are accepted as the basic approach for the long-term, environmentally safe isolation of radioactive waste in geological repositories. Assessing the performance of natural and engineered barriers is one of the major difficulties in producing evidence of environmental safety for any radioactive waste disposal facility, owing to the enormous complexity of the scenarios and uncertainties to be considered. This report outlines a new methodological approach originally developed for a repository in salt, but transferable with minor modifications to any other host rock formation. The approach is based on the integration of the following elements: (1) implementation of a simple method and efficient criteria to assess and prove the tightness of geological and engineered barriers; (2) use of the method of Partial Safety Factors in order to assess barrier performance at a certain reasonable level of confidence; (3) integration of a diverse geochemical barrier in the near field of waste emplacement, systematically limiting the radiological consequences of any radionuclide release in safety investigations; and (4) a risk-based approach for the assessment of radionuclide releases. Indicative calculations performed with extremely conservative assumptions make it possible to exclude any radiological health consequences from a HLW repository in salt to a reference person with a safety level of 99.9999% per year. (orig.)

  20. Final Verification Success Story Using the Triad Approach at the Oak Ridge National Laboratory's Melton Valley Soils and Sediment Project

    King, D.A.; Haas, D.A.; Cange, J.B.


    The United States Environmental Protection Agency recently published guidance on the Triad approach, which supports the use of smarter, faster, and better technologies and work strategies during environmental site assessment, characterization, and cleanup. The Melton Valley Soils and Sediment Project (Project) at the Oak Ridge National Laboratory embraced this three-pronged approach to characterize contaminants in soil/sediment across the 1000-acre Melton Valley Watershed. Systematic Project Planning is the first of three prongs in the Triad approach. Management initiated Project activities by identifying key technical personnel, included regulators early in the planning phase, researched technologies, and identified available resources necessary to meet Project objectives. Dynamic Work Strategies is the second prong of the Triad approach. Core Team members, including State and Federal regulators, helped develop a Sampling and Analysis Plan that allowed experienced field managers to make real-time, in-the-field decisions and, thus, to adjust to conditions unanticipated during the planning phase. Real-time Measurement Technologies is the third and last prong of the Triad approach. To expedite decision-making, the Project incorporated multiple in-field technologies, including global positioning system equipment integrated with field screening instrumentation, magnetometers for utility clearance, and an on-site gamma spectrometer (spec) for rapid contaminant speciation and quantification. As a result of a relatively complex but highly efficient program, a Project field staff of eight collected approximately 1900 soil samples for on-site gamma spec analysis (twenty percent were also shipped for off-site analyses), 4.7 million gamma radiation measurements, 1000 systematic beta radiation measurements, and 3600 systematic dose rate measurements between July 1, 2004, and October 31, 2005. 
The site database previously contained results for less than 500 soil samples dating

  1. Exercise in space: the European Space Agency approach to in-flight exercise countermeasures for long-duration missions on ISS.

    Petersen, Nora; Jaekel, Patrick; Rosenberger, Andre; Weber, Tobias; Scott, Jonathan; Castrucci, Filippo; Lambrecht, Gunda; Ploutz-Snyder, Lori; Damann, Volker; Kozlovskaya, Inessa; Mester, Joachim


    To counteract microgravity (µG)-induced adaptation, European Space Agency (ESA) astronauts on long-duration missions (LDMs) to the International Space Station (ISS) perform a daily physical exercise countermeasure program. Since the first ESA crewmember completed an LDM in 2006, the ESA countermeasure program has striven to provide efficient protection against decreases in body mass, muscle strength, bone mass, and aerobic capacity within the operational constraints of the ISS environment and the changing availability of on-board exercise devices. The purpose of this paper is to provide a description of ESA's individualised approach to in-flight exercise countermeasures and an up-to-date picture of how exercise is used to counteract physiological changes resulting from µG-induced adaptation. Changes in the absolute workload for resistive exercise, treadmill running and cycle ergometry throughout ESA's eight LDMs are also presented, and aspects of pre-flight physical preparation and post-flight reconditioning outlined. With the introduction of the advanced resistive exercise device (ARED) in 2009, the relative contribution of resistance exercise to total in-flight exercise increased (33-46 %), whilst treadmill running (42-33 %) and cycle ergometry (26-20 %) decreased. All eight ESA crewmembers increased their in-flight absolute workload during their LDMs for resistance exercise and treadmill running (running speed and vertical loading through the harness), while cycle ergometer workload was unchanged across missions. Increased or unchanged absolute exercise workloads in-flight would appear contradictory to typical post-flight reductions in muscle mass and strength, and cardiovascular capacity following LDMs. However, increased absolute in-flight workloads are not directly linked to changes in exercise capacity as they likely also reflect the planned, conservative loading early in the mission to allow adaptation to µG exercise, including personal comfort issues

  2. A New Solution Assessment Approach and Its Application to Space Geodesy Data Analysis

    Hu, X.; Huang, C.; Liao, X.


    The statistics of the residuals are used in this paper to perform a quality assessment of the solutions from space geodesy data analysis. With stochastic estimation and relatively arbitrary empirical parameters being employed to absorb unmodelled errors, it has long been noticed that different estimate combinations or analysis strategies may achieve the same level of fitting yet result in significantly different solutions. Based on the postulate that no conceivable signals should remain in the residuals, solutions of the same level of root mean square error (RMS) and variance-covariance may be differentiated in the sense that, for reasonable solutions, the residuals are virtually identical with noise. While it is possible to develop complex noise models, the Gaussian white noise model simplifies the solution interpretation and implies that the unmodelled errors have been smoothed out. Statistical moments of the residuals as well as the Pearson chi-square are computed in this paper to measure the discrepancies between the residuals and Gaussian white noise. Applying the approach to both satellite laser ranging (SLR) and global positioning system (GPS) data analysis, we evaluate different parameter estimate combinations and/or different strategies that would hardly be discriminated by the level of fitting. Unlike most solution assessment methods, broadly termed external comparison, no information independent of the data analyzed is required. This makes immediate solution assessment possible and easy to carry out. While external comparison is the best and most convincing quality assessment of a solution, the statistics of the residuals provide important information on the solutions and, in some cases as discussed in this paper, can be supported with external comparison.
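The residual checks described above can be sketched as follows; the equal-probability binning and the sample-size choices are generic illustrations, not the paper's exact procedure:

```python
import math
import random
from statistics import NormalDist

def residual_whiteness_stats(residuals, bins=10):
    """Sample skewness, kurtosis and a Pearson chi-square statistic
    comparing residuals against a fitted Gaussian; for white Gaussian
    noise, skewness ~ 0, kurtosis ~ 3 and the chi-square stays small."""
    n = len(residuals)
    mean = sum(residuals) / n
    std = math.sqrt(sum((r - mean) ** 2 for r in residuals) / n)
    z = [(r - mean) / std for r in residuals]
    skew = sum(v ** 3 for v in z) / n
    kurt = sum(v ** 4 for v in z) / n
    # Equal-probability bins under the fitted Gaussian: each bin then
    # expects n/bins residuals, and a large chi-square flags
    # conceivable signals remaining in the residuals.
    nd = NormalDist(mean, std)
    edges = [nd.inv_cdf(i / bins) for i in range(1, bins)]
    counts = [0] * bins
    for r in residuals:
        counts[sum(1 for e in edges if r > e)] += 1
    expected = n / bins
    chi2 = sum((c - expected) ** 2 / expected for c in counts)
    return skew, kurt, chi2

random.seed(0)
skew, kurt, chi2 = residual_whiteness_stats(
    [random.gauss(0.0, 1.0) for _ in range(5000)])
```

For genuinely white residuals the three statistics stay near 0, 3 and the chi-square's degrees of freedom; residuals still carrying signal inflate the chi-square.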

  3. Environmental health of Spanish parks: An approach to the allergenic potential of urban green spaces

    Paloma Cariñanos


    Urban parks are green infrastructure elements that should contribute to improving the quality of life and well-being of citizens. This work presents the results of applying a new index to estimate the potential allergenicity of parks located in 20 Spanish cities. This index, which considers intrinsic biological and biometric parameters of the plant species existing in parks, allows their allergenic risk to be calculated on a scale ranging from 0 to 1, depending on whether the park's allergenicity is zero or poses a high risk to the population. The parks selected for this study have different typologies, sizes, species richness and biodiversity, which has yielded highly variable index values. Almost half of the analysed parks have an index value higher than 0.30, a threshold considered to carry a moderate to high risk and, therefore, to be enough to cause allergy symptoms in the population. The remaining parks had an index value below this threshold, so the risk of suffering allergies there is low or very low. The formula also identifies the species that contribute most to the resulting allergenicity value: those with an anemophilous pollination strategy, extended flowering periods, and a referenced high allergenicity. These requirements are met by all species of the Betulaceae, Cupressaceae and Moraceae families, and to a lesser extent by Oleaceae and Platanaceae. It can be concluded that the development of an index to estimate the allergenicity of urban green spaces constitutes a useful tool to minimize the impact of pollen allergy on the population.
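A minimal sketch of such a 0-1 index follows; the attribute names, scoring scheme and surface weighting are hypothetical stand-ins for the intrinsic biological and biometric parameters the published index combines:

```python
def park_allergenicity_index(species):
    """Surface-weighted mean of per-species allergenic potential,
    normalised by the worst case so the result lies in [0, 1].
    Each species record carries attributes already scaled to [0, 1]."""
    weighted = 0.0
    worst = 0.0
    for sp in species:
        # Potential grows with intrinsic allergenicity, wind
        # pollination (anemophily) and length of the flowering period.
        potential = sp["allergenicity"] * sp["anemophily"] * sp["flowering"]
        weighted += potential * sp["cover_m2"]
        worst += 1.0 * sp["cover_m2"]
    return weighted / worst if worst else 0.0

# A wind-pollinated, highly allergenic stand dominates the score.
index = park_allergenicity_index([
    {"allergenicity": 1.0, "anemophily": 1.0, "flowering": 1.0, "cover_m2": 100.0},
    {"allergenicity": 0.5, "anemophily": 0.5, "flowering": 0.5, "cover_m2": 100.0},
])
```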

  4. Magnetized target fusion: An ultra high energy approach in an unexplored parameter space

    Lindemuth, I.R.


    Magnetized target fusion is a concept that may lead to practical fusion applications in a variety of settings. However, the crucial first step is to demonstrate that it works as advertised. Among the possibilities for doing this is an ultrahigh energy approach to magnetized target fusion, one powered by explosive pulsed power generators that have become available for application to thermonuclear fusion research. In a collaborative effort between Los Alamos and the All-Russian Scientific Institute for Experimental Physics (VNIIEF), a very powerful helical generator with explosive power switching has been used to produce an energetic magnetized plasma. Several diagnostics have been fielded to ascertain the properties of this plasma. We are intensively studying the experimental results and computationally analyzing the performance of this experiment

  5. An Integrated Approach to Thermal Management of International Space Station Logistics Flights, Improving the Efficiency

    Holladay, Jon; Day, Greg; Roberts, Barry; Leahy, Frank


    The efficiency of reusable aerospace systems requires a focus on the total operations process rather than just orbital performance. For the Multi-Purpose Logistics Module (MPLM), this activity included special attention to terrestrial conditions, both pre-launch and post-landing, and how they inter-relate to the mission profile. Several of the efficiencies implemented for MPLM Mission Engineering were NASA firsts, and all served to improve the overall operations activities. This paper will provide an explanation of how various issues were addressed and the resulting solutions. Topics range from statistical analysis of over 30 years of atmospheric data at the launch and landing sites to a new approach for operations with the Shuttle Carrier Aircraft. In each situation the goal was to "tune" the thermal management of the overall flight system, minimizing requirement risk while optimizing power and energy performance.

  6. Interpolation of quasi-Banach spaces

    Tabacco Vignati, A.M.


    This dissertation presents a method of complex interpolation for families of quasi-Banach spaces. This method generalizes the theory for families of Banach spaces introduced by others. Intermediate spaces in several particular cases are characterized using different approaches. The situation when all the spaces have finite dimensions is studied first. The second chapter contains the definitions and main properties of the new interpolation spaces, and an example concerning the Schatten ideals associated with a separable Hilbert space. The case of L^p spaces follows from the maximal operator theory contained in Chapter III. Also introduced is a different method of interpolation for quasi-Banach lattices of functions, and conditions are given to guarantee that the two techniques yield the same result. Finally, the last chapter contains a different, and more direct, approach to the case of Hardy spaces

  7. Optimal Decision-Making in Fuzzy Economic Order Quantity (EOQ) Model under Restricted Space: A Non-Linear Programming Approach

    M. Pattnaik


    In this paper the concept of fuzzy non-linear programming is applied to solve an economic order quantity (EOQ) model under restricted space. Since various types of uncertainty and imprecision are inherent in real inventory problems, they are classically modeled using approaches from probability theory. However, there are uncertainties that cannot be appropriately treated by the usual probabilistic models. The questions of how to define inventory optimization tasks in such an environment, and how to interpret the optimal solutions, then arise. This paper develops a modification of the single-item EOQ model in the presence of a fuzzy decision-making process, where demand is related to the unit price and the setup cost varies with the quantity produced/purchased. It considers the modification of the objective function and storage area in the presence of imprecisely estimated parameters. The model is developed for the problem by employing different modeling approaches over an infinite planning horizon. It incorporates the concepts of a fuzzy arithmetic approach for the quantity ordered and the demand per unit, and compares both fuzzy non-linear and other models. Investigation of the properties of an optimal solution allows the development of an algorithm whose validity is illustrated through an example problem; using MATLAB (R2009a) software, two- and three-dimensional diagrams of the application are presented. Sensitivity analysis of the optimal solution is also studied with respect to changes in different parameter values, to draw managerial insights for the decision problem.
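As a crisp baseline for the fuzzy formulation, the space-constrained EOQ can be sketched as follows (parameter values are illustrative; the paper replaces these crisp inputs with fuzzy quantities):

```python
import math

def constrained_eoq(demand, setup_cost, holding_cost, max_units):
    """Classic EOQ with a storage-space cap. The annual cost
    D/Q * K + h * Q / 2 is convex in Q, so if the unconstrained
    optimum sqrt(2*D*K/h) violates the cap, the cap itself is optimal."""
    q_star = math.sqrt(2.0 * demand * setup_cost / holding_cost)
    q = min(q_star, max_units)
    cost = demand / q * setup_cost + holding_cost * q / 2.0
    return q, cost

# Unconstrained optimum ~158 units; a 120-unit store binds the solution.
q, cost = constrained_eoq(demand=1000.0, setup_cost=50.0,
                          holding_cost=4.0, max_units=120.0)
```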

  8. Novel approaches to ultrasonography of the lung and pleural space: where are we now?

    Daniel Lichtenstein


    To understand that the use of lung ultrasound, although long standardised, still needs educational efforts for its best use, a suitable machine, a suitable universal probe and an appropriate culture. To be able to use a terminology that has been fully standardised to avoid any confusion of useless wording. To understand the logic of the BLUE points, three points of interest enabling expedition of a lung ultrasound examination in acute respiratory failure. To be able to cite, in the correct hierarchy, the seven criteria of the B-line, then those of interstitial syndrome. To understand the sequential thinking when making an ultrasound diagnosis of pneumothorax. To be able to use the BLUE protocol for building profiles of pneumonia (or acute respiratory distress syndrome) and understand their limitations. To understand that lung ultrasound can be used for the direct analysis of an acute respiratory failure (the BLUE protocol), an acute circulatory failure (the FALLS protocol) and even a cardiac arrest (the SESAME protocol), following a pathophysiological approach. To understand that the first sequential target in the SESAME protocol (search first for pneumothorax in cardiac arrest) can also be used in countless more quiet settings of countless disciplines, making lung ultrasound in the critically ill cost-, time- and radiation-saving. To be able to perform a BLUE protocol in challenging patients, understanding how the best lung ultrasound can be obtained from bariatric or agitated, dyspnoeic patients.

  9. Group theoretical approach to quantum fields in de Sitter space II. The complementary and discrete series

    Joung, Euihun; Mourad, Jihad; Parentani, Renaud


    We use an algebraic approach based on representations of the de Sitter group to construct covariant quantum fields in arbitrary dimensions. We study the complementary and the discrete series, which correspond to light and massless fields and which lead to new features with respect to the massive principal series we previously studied (hep-th/0606119). When considering the complementary series, we make use of a non-trivial scalar product in order to get local expressions in the position representation. Based on these, we construct a family of covariant canonical fields parametrized by SU(1, 1)/U(1). Each of these corresponds to one of the dS invariant alpha-vacua. The behavior of the modes at asymptotic times brings another difficulty, as it is incompatible with the usual definition of the in and out vacua. We propose a generalized notion of these vacua which reduces to the usual conformal vacuum in the conformally massless limit. When considering the massless discrete series we find that no covariant field obeys the canonical commutation relations. To further analyze this singular case, we consider the massless limit of the complementary scalar fields we previously found. We obtain canonical fields with a representation deformed by zero modes. The zero modes have a dS invariant vacuum with singular norm. We propose a regularization by a compactification of the scalar field and a dS invariant definition of the vertex operators. The resulting two-point functions are dS invariant and have a universal logarithmic infrared divergence

  10. Tensor representation techniques for full configuration interaction: A Fock space approach using the canonical product format.

    Böhm, Karl-Heinz; Auer, Alexander A; Espig, Mike


    In this proof-of-principle study, we apply tensor decomposition techniques to the Full Configuration Interaction (FCI) wavefunction in order to approximate the wavefunction parameters efficiently and to reduce the overall computational effort. For this purpose, the wavefunction ansatz is formulated in an occupation number vector representation that ensures antisymmetry. If the canonical product format tensor decomposition is then applied, the Hamiltonian and the wavefunction can be cast into a multilinear product form. As a consequence, the number of wavefunction parameters does not scale to the power of the number of particles (or orbitals) but depends on the rank of the approximation and linearly on the number of particles. The degree of approximation can be controlled by a single threshold for the rank reduction procedure required in the algorithm. We demonstrate that using this approximation, the FCI Hamiltonian matrix can be stored with N^5 scaling. The error of the approximation that is introduced is below one millihartree for a threshold of ϵ = 10^-4, and no convergence problems are observed solving the FCI equations iteratively in the new format. While promising conceptually, all effort of the algorithm is shifted to the required rank reduction procedure after the contraction of the Hamiltonian with the coefficient tensor. At the current state, this crucial step is the bottleneck of our approach, and even for an optimistic estimate the algorithm scales beyond N^10, so future work has to be directed towards reduction-free algorithms.
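The storage saving of the canonical product format can be illustrated with a generic sketch of CP-format entry evaluation (not the authors' implementation):

```python
import math

def cp_entry(factors, index):
    """One entry of a tensor held in canonical product (CP) format:
    T[i1,...,id] = sum_r prod_k factors[k][i_k][r].  Storage is
    rank * sum(dims) numbers instead of the full prod(dims)."""
    rank = len(factors[0][0])
    return sum(
        math.prod(factors[k][index[k]][r] for k in range(len(factors)))
        for r in range(rank)
    )

# Rank-1 example: the outer product of [1, 2], [3, 4] and [5, 6].
factors = [[[1.0], [2.0]], [[3.0], [4.0]], [[5.0], [6.0]]]
entry = cp_entry(factors, (1, 0, 1))  # 2 * 3 * 6
```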

  11. A space and time scale-dependent nonlinear geostatistical approach for downscaling daily precipitation and temperature

    Jha, Sanjeev Kumar


    A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here, the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 km and 10 km resolution for a twenty-year period ranging from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is a daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference dataset indicates that both the seasonal average climate response and the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction, because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool for statistical downscaling to obtain local-scale estimates of precipitation and temperature from General Circulation Models.

  12. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    Siamidis, John; Yuko, Jim


    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDR radios: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS)-based facility to conduct a suite of experiments to advance the SDR and STRS standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware/software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. Daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  13. A Systems Genetic Approach to Identify Low Dose Radiation-Induced Lymphoma Susceptibility/DOE2013FinalReport

    Balmain, Allan [University of California, San Francisco; Song, Ihn Young [University of California, San Francisco


    The ultimate goal of this project is to identify the combinations of genetic variants that confer an individual's susceptibility to the effects of low dose (0.1 Gy) gamma-radiation, in particular with regard to tumor development. In contrast to the known effects of high dose radiation in cancer induction, the responses to low dose radiation (defined as 0.1 Gy or less) are much less well understood, and have been proposed to involve a protective anti-tumor effect in some in vivo scientific models. These conflicting results confound attempts to develop predictive models of the risk of exposure to low dose radiation, particularly when combined with the strong effects of inherited genetic variants on both radiation effects and cancer susceptibility. We have used a Systems Genetics approach in mice that combines genetic background analysis with responses to low and high dose radiation, in order to develop insights that will allow us to reconcile these disparate observations. Using this comprehensive approach we have analyzed normal tissue gene expression (in this case the skin and thymus), together with the changes that take place in this gene expression architecture a) in response to low or high- dose radiation and b) during tumor development. Additionally, we have demonstrated that using our expression analysis approach in our genetically heterogeneous/defined radiation-induced tumor mouse models can uniquely identify genes and pathways relevant to human T-ALL, and uncover interactions between common genetic variants of genes which may lead to tumor susceptibility.

  14. Unit operation optimization for the manufacturing of botanical injections using a design space approach: a case study of water precipitation.

    Gong, Xingchu; Chen, Huali; Chen, Teng; Qu, Haibin


    The quality by design (QbD) concept is a paradigm for the improvement of botanical injection quality control. In this work, the water precipitation process for the manufacturing of Xueshuantong injection, a botanical injection made from Notoginseng Radix et Rhizoma, was optimized using a design space approach as an example. Saponin recovery and total saponin purity (TSP) in the supernatant were identified as the critical quality attributes (CQAs) of water precipitation using a risk assessment covering all the processes of Xueshuantong injection. An Ishikawa diagram and experiments of fractional factorial design were applied to determine the critical process parameters (CPPs). Dry matter content of concentrated extract (DMCC), amount of water added (AWA), and stirring speed (SS) were identified as CPPs. Box-Behnken designed experiments were carried out to develop models between CPPs and process CQAs. Determination coefficients were higher than 0.86 for all the models. High TSP in the supernatant can be obtained when DMCC is low and SS is high. Saponin recoveries decreased as DMCC increased. Incomplete collection of the supernatant was the main reason for the loss of saponins. The design space was calculated using a Monte-Carlo simulation method with an acceptable probability of 0.90. The recommended normal operating region is located at a DMCC of 0.38-0.41 g/g, an AWA of 3.7-4.9 g/g, and an SS of 280-350 rpm, with a probability of more than 0.919 of attaining the CQA criteria. Verification experiment results showed that operating DMCC, SS, and AWA within the design space attains the CQA criteria with high probability.
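The Monte-Carlo design-space step can be sketched as follows; the response models, noise levels and acceptance limits are invented placeholders, whereas in practice the fitted Box-Behnken models would be sampled:

```python
import random

def attainment_probability(dmcc, awa, ss, n=20000, seed=1):
    """Monte-Carlo estimate of the probability that both CQAs meet
    their criteria at a given operating point. Operating points whose
    estimate exceeds the acceptable probability belong to the design
    space; all coefficients below are hypothetical."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        # Hypothetical fitted response models with Gaussian model error.
        purity = 80.0 - 25.0 * dmcc + 0.01 * ss + random.gauss(0.0, 1.0)
        recovery = 97.0 - 18.0 * dmcc + 0.3 * awa + random.gauss(0.0, 0.8)
        if purity >= 70.0 and recovery >= 90.0:
            hits += 1
    return hits / n

# Probability of attaining both CQA criteria at one candidate point.
p = attainment_probability(dmcc=0.40, awa=4.0, ss=300.0)
```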

  16. Analysis of the intellectual structure of human space exploration research using a bibliometric approach: Focus on human related factors

    Lee, Tai Sik; Lee, Yoon-Sun; Lee, Jaeho; Chang, Byung Chul


    Human space exploration (HSE) is an interdisciplinary field composed of a range of subjects that have developed dramatically over the last few decades. This paper investigates the intellectual structure of HSE research with a focus on human related factors. A bibliometric approach with quantitative analytical techniques is applied to study the development and growth of the research. This study retrieves 1921 papers on HSE related to human factors from the year 1990 to the year 2016 from Web of Science and constructs a critical citation network composed of 336 papers. Edge-betweenness-based clustering is used to classify the citation network into twelve distinct research clusters based on four research themes: "biological risks from space radiation," "health and performance during long-duration spaceflight," "program and in-situ resources for HSE missions," and "habitat and life support systems in the space environment." These research themes are also similar to the classification results of a co-occurrence analysis on keywords for a total of 1921 papers. Papers with high centrality scores are identified as important papers in terms of knowledge flow. Moreover, the intermediary role of papers in exchanging knowledge between HSE sub-areas is identified using brokerage analysis. The key-route main path highlights the theoretical development trajectories. Due to the recent dramatic increase in investment by international governments and the private sector, the theoretical development trajectories of key research themes have been expanding from furthering scientific and technical knowledge to include various social and economic issues, thus encouraging massive public participation. This study contributes to an understanding of research trends and popular issues in the field of HSE by introducing a powerful way of determining major research themes and development trajectories. 
This study will help researchers seek the underlying knowledge diffusion flow from multifaceted
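The edge-betweenness quantity behind the clustering step can be sketched with Brandes-style accounting on a toy graph (a generic illustration, not the study's toolchain):

```python
from collections import defaultdict, deque

def edge_betweenness(adj):
    """Edge betweenness for a small undirected graph given as
    {node: set_of_neighbours}: for every source, count the fraction
    of shortest paths crossing each edge (Brandes-style accumulation),
    then halve to de-duplicate unordered node pairs."""
    bet = defaultdict(float)
    for s in adj:
        dist = {s: 0}
        sigma = defaultdict(float)
        sigma[s] = 1.0
        preds = defaultdict(list)
        order = []
        queue = deque([s])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Walk the BFS order backwards, pushing path credit onto edges.
        delta = defaultdict(float)
        for w in reversed(order):
            for v in preds[w]:
                credit = sigma[v] / sigma[w] * (1.0 + delta[w])
                bet[frozenset((v, w))] += credit
                delta[v] += credit
    return {edge: b / 2.0 for edge, b in bet.items()}

# Two triangles joined by a bridge: the bridge carries all 9
# cross-cluster shortest paths, so Girvan-Newman removes it first.
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
       "d": {"c", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"}}
scores = edge_betweenness(adj)
```

Repeatedly removing the highest-scoring edge and recomputing splits the citation network into clusters, which is the edge-betweenness-based clustering the abstract refers to.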

  17. A New Approach to Reducing Search Space and Increasing Efficiency in Simulation Optimization Problems via the Fuzzy-DEA-BCC

    Rafael de Carvalho Miranda


    The development of discrete-event simulation software was one of the most successful interfaces between operational research and computing. As a result, research has focused on the development of new methods and algorithms with the purpose of increasing simulation optimization efficiency and reliability. This study aims to define optimum variation intervals for each decision variable through a proposed approach which combines data envelopment analysis with fuzzy logic (Fuzzy-DEA-BCC), seeking to improve the distinction between decision-making units in the face of uncertainty. In this study, Taguchi's orthogonal arrays were used to generate the necessary quantity of DMUs, and the output variables were generated by the simulation. Two study objects were utilized as examples of mono- and multiobjective problems. Results confirmed the reliability and applicability of the proposed method, as it enabled a significant reduction in search space and computational demand when compared to conventional simulation optimization techniques.

  18. State space approach for the vibration of nanobeams based on the nonlocal thermoelasticity theory without energy dissipation

    Zenkour, A. M.; Alnefaie, K. A.; Abu-Hamdeh, N. H.; Aljinaid, A. A.; Aifanti, E. C. [King Abdulaziz University, Jeddah (Saudi Arabia); Abouelregal, A. E. [Mansoura University, Mansoura (Egypt)


    In this article, an Euler-Bernoulli beam model based upon the nonlocal thermoelasticity theory without energy dissipation is used to study the vibration of a nanobeam subjected to ramp-type heating. Classical continuum theory is inherently size independent, while nonlocal elasticity exhibits size dependence. Among other things, this leads to a new expression for the effective nonlocal bending moment as contrasted to its classical counterpart. The thermal problem is addressed in the context of the Green-Naghdi (GN) theory of heat transport without energy dissipation. The governing partial differential equations are solved in the Laplace transform domain by the state space approach of modern control theory. Inverses of the Laplace transforms are computed numerically using Fourier expansion techniques. The effects of the nonlocality and ramping-time parameters on the lateral vibration, temperature, displacement and bending moment are discussed.
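The inversion step can be sketched with the standard Fourier-series (trapezoidal Bromwich) formula; the shift and truncation parameters are generic tuning choices, and the transform pair below is only a self-check, not the beam problem itself:

```python
import math

def invert_laplace(F, t, a=18.0, n_terms=4000):
    """Fourier-series inversion of a Laplace transform F at time t:
    f(t) ~ (e^(a/2)/t) * [F(c)/2 + sum_k (-1)^k Re F(c + i*k*pi/t)]
    with c = a/(2t); the shift a controls the discretization error and
    n_terms the truncation error of the alternating series."""
    c = a / (2.0 * t)
    total = F(complex(c, 0.0)).real / 2.0
    for k in range(1, n_terms + 1):
        total += (-1) ** k * F(complex(c, k * math.pi / t)).real
    return math.exp(a / 2.0) / t * total

# Self-check with a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t).
approx = invert_laplace(lambda s: 1.0 / (s + 1.0), 1.0)
```

With these settings the plain (unaccelerated) series recovers a few significant digits; production codes add convergence acceleration.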


    Ankita P. Dadhich


    Full Text Available The development-oriented growth and accelerated industrialization have been rapidly worsening the environmental quality of urban centers. Jaipur, capital of Rajasthan, India, and the 10th largest metropolitan city of India, is also facing increasing pressure on land due to urbanization and demographic factors. Therefore, in this study an integrated approach of remote sensing and GIS (Geographic Information System) was applied to examine the relationship among spatial variables such as impervious area, land consumption rate and air pollutant concentration within the urban context of Jaipur city. The urban sprawl over a period of five years (2008–2013) is determined by computing the impervious or built-up area using IRS (Indian Remote Sensing) Resourcesat-2 satellite data for the Jaipur urban region. Thereafter, the Land Consumption Rate (LCR) and Land Absorption Coefficient (LAC) were quantified to evaluate impervious-area growth in different wards of the Jaipur city. The temporal variations in gaseous and particulate pollutants were also investigated to ascertain the degree of association between air pollutants and impervious area. It has been observed that there was a 74.89% increase in impervious area from 2008 to 2013. The zonal distribution of impervious area clearly indicates an increase in the number of wards under the Zone 5 (80-100%) category from 2008 to 2013. The spatial distribution of LCR reveals a very high land consumption rate (>0.012) in the outskirts of the city, i.e. the Vidhyadhar Nagar, Jhotwara and Jagatpura areas. The LAC is zero in wards of the Kishanpole area and high (>0.06) for the wards of the Civil Lines, Jagatpura, and Jhotwara areas of the city. The urban air quality pattern (2008-2013) results indicate that PM10 and SPM concentrations have the greatest effects on the air environment in comparison to gaseous pollutants (SO2 and NOx). The association between particulate pollution and impervious area indicates strong correlation
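The two sprawl metrics named above have standard definitions in the remote sensing literature, which can be written down directly. The paper's exact formulation may differ, and the sample figures in the test are hypothetical:

```python
# Standard urban-sprawl metrics (as commonly defined; the study's exact
# formulation may differ):
#   LCR = A / P                 -- land consumption rate: built-up area per capita
#   LAC = (A2 - A1) / (P2 - P1) -- land absorption coefficient between two dates
def land_consumption_rate(builtup_area, population):
    """Built-up area (e.g. km^2) consumed per person."""
    return builtup_area / population

def land_absorption_coefficient(area_t1, area_t2, pop_t1, pop_t2):
    """New built-up area absorbed per unit of population change."""
    return (area_t2 - area_t1) / (pop_t2 - pop_t1)
```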

  20. An Adaptive Regulator for Space Teleoperation System in Task Space

    Chao Ge


    Full Text Available The problem that the gravity information cannot be obtained in advance for bilateral teleoperation is studied. In outer space exploration, the gravity term changes as the position of the slave manipulator changes, so it is necessary to design an adaptive regulator controller to compensate for the unknown gravity signal. Moreover, to obtain more accurate position tracking performance, the controller is designed in the task space instead of the joint space. Additionally, the time delay considered in this paper is not only time varying but also unsymmetrical. Finally, simulations are presented to show the effectiveness of the proposed approach.
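The core idea — estimating an unknown gravity term online while regulating position — can be sketched on a 1-DOF unit-mass arm. This toy simulation omits the paper's task-space formulation and asymmetric time-varying delays, and all gains are illustrative:

```python
# Adaptive regulation of a 1-DOF arm with unknown gravity coefficient:
# dynamics q'' = u - theta_true*cos(q); the controller uses an online
# estimate theta_hat in place of the true value.
import numpy as np

def simulate(theta_true=5.0, q_des=0.5, dt=1e-3, t_end=20.0,
             k=8.0, lam=1.0, gamma=5.0):
    q = qdot = theta_hat = 0.0
    for _ in range(int(t_end / dt)):
        s = qdot + lam * (q - q_des)                      # composite error
        u = -k * s - lam * qdot + theta_hat * np.cos(q)   # feedback + est. gravity
        qddot = u - theta_true * np.cos(q)                # unit-mass dynamics
        theta_hat += -gamma * s * np.cos(q) * dt          # gradient adaptation law
        qdot += qddot * dt
        q += qdot * dt
    return q, theta_hat
```

With this law, s obeys s' = -k s + (theta_hat - theta_true) cos(q), and a standard Lyapunov argument drives both the regulation error and the estimation error to zero when cos(q_des) is nonzero.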

  1. On the road to becoming a responsible leader: A simulation-based training approach for final year medical students.

    Schmidt-Huber, Marion; Netzel, Janine; Kiesewetter, Jan


    Background and objective: There is a need for young physicians to take a responsible role in clinical teams, comparable to a leadership role. However, today's medical curricula barely consider the development of leadership competencies. Acquisition of leadership skills is currently a by-product of medical education, even though it seems to be a competency relevant for physicians' success. Therefore, an innovative leadership training program for young physicians was developed and validated. The training conceptualisation was based upon findings of critical incident interviews (N=19) with relevant personnel (e.g. experienced doctors/nurses, residents) and upon evidence-based leadership contents focusing on ethical leadership behaviors. Method: The training consists of four sessions (3-4 hours each) and provided evidence-based lectures on leadership theory and effective leader behaviors, interactive training elements and a simulation-based approach with professional role players focusing on interprofessional collaboration with care staff. Training evaluation was assessed twice after completion of the program (N=37). Assessments included items from validated and approved evaluation instruments regarding diverse learning outcomes (satisfaction/reaction, learning, self-efficacy, and application/transfer) and transfer indicators. Furthermore, training success predictors were assessed based on stepwise regression analysis. In addition, long-term training effects and behavioral changes were analysed. Results: Various learning outcomes were achieved (self-reported training satisfaction, usefulness of the content and learning effects) and results show substantial transfer effects of the training contents and a strengthened awareness of the leadership role (e.g. self-confidence, ideas for dealing with work-related problems in a role as responsible physician). 
We identified competence of trainer, training of applied tools, awareness of job expectations, and the opportunity to

  2. On the road to becoming a responsible leader: A simulation-based training approach for final year medical students

    Schmidt-Huber, Marion


    Full Text Available Background and objective: There is a need for young physicians to take a responsible role in clinical teams, comparable to a leadership role. However, today's medical curricula barely consider the development of leadership competencies. Acquisition of leadership skills is currently a by-product of medical education, even though it seems to be a competency relevant for physicians' success. Therefore, an innovative leadership training program for young physicians was developed and validated. The training conceptualisation was based upon findings of critical incident interviews (N=19) with relevant personnel (e.g. experienced doctors/nurses, residents) and upon evidence-based leadership contents focusing on ethical leadership behaviors. Method: The training consists of four sessions (3-4 hours each) and provided evidence-based lectures on leadership theory and effective leader behaviors, interactive training elements and a simulation-based approach with professional role players focusing on interprofessional collaboration with care staff. Training evaluation was assessed twice after completion of the program (N=37). Assessments included items from validated and approved evaluation instruments regarding diverse learning outcomes (satisfaction/reaction, learning, self-efficacy, and application/transfer) and transfer indicators. Furthermore, training success predictors were assessed based on stepwise regression analysis. In addition, long-term training effects and behavioral changes were analysed. Results: Various learning outcomes were achieved (self-reported training satisfaction, usefulness of the content and learning effects) and results show substantial transfer effects of the training contents and a strengthened awareness of the leadership role (e.g. self-confidence, ideas for dealing with work-related problems in a role as responsible physician). We identified competence of trainer, training of applied tools, awareness of job expectations, and the opportunity to learn from experiences of other participants as predictors of training success. Additionally, we found long-term training effects and participants reported an increase in specific

  3. Charmless non-leptonic Bs decays to PP, PV and VV final states in the pQCD approach

    Ali, A.; Kramer, G.


    We calculate the CP-averaged branching ratios and CP-violating asymmetries of a number of two-body charmless hadronic decays B_s^0 → PP, PV, VV in the perturbative QCD (pQCD) approach to leading order in α_s (here P and V denote light pseudoscalar and vector mesons, respectively). The mixing-induced CP violation parameters are also calculated for these decays. We also predict the polarization fractions of B_s → VV decays and find that the transverse polarizations are enhanced in some penguin-dominated decays such as B_s^0 → K*K*, K*ρ. Some of the predictions worked out here can already be confronted with the recently available data from the CDF collaboration on the branching ratios for the decays B_s^0 → K+π-, B_s^0 → K+K- and the CP asymmetry in the decay B_s^0 → K+π-, and are found to be in agreement within the current errors. A large number of predictions for the branching ratios, CP asymmetries and vector-meson polarizations in B_s^0 decays, presented in this paper and compared with the already existing results in other theoretical frameworks, will be put to stringent experimental tests in forthcoming experiments at Fermilab, the LHC and Super-B factories. (orig.)
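The CP-averaged branching ratio and direct CP asymmetry quoted in such analyses are conventionally defined as follows (sign conventions for A_CP vary between papers):

```latex
\overline{\mathcal{B}}(B_s^0 \to f)
  = \tfrac{1}{2}\left[\mathcal{B}(B_s^0 \to f) + \mathcal{B}(\bar{B}_s^0 \to \bar{f})\right],
\qquad
A_{CP} = \frac{\Gamma(\bar{B}_s^0 \to \bar{f}) - \Gamma(B_s^0 \to f)}
              {\Gamma(\bar{B}_s^0 \to \bar{f}) + \Gamma(B_s^0 \to f)}.
```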

  4. Can a multisensory teaching approach impart the necessary knowledge, skills, and confidence in final year medical students to manage epistaxis?


    Objective The purpose of this study is to evaluate the efficacy of a multisensory teaching approach in imparting the knowledge, skills, and confidence to manage epistaxis in a cohort of fourth year medical students. Methods One hundred and thirty-four fourth year medical students were recruited into the study from August 2011 to February 2012 in four groups. Students listened to an audio presentation (PODcast) about epistaxis and viewed a video presentation on the technical skills (VODcast). Following this, students completed a 5-minute Individual Readiness Assessment Test (IRAT) to test knowledge accrued from the PODcast and VODcast. Next, students observed a 10-minute expert demonstration of the technical skills on a human cadaver and spent half an hour practicing these techniques on cadaver simulators with expert guidance. The students' confidence was assessed with Confidence Level Questionnaires (CLQs) before and after their laboratory session. The skill level of a subset of students was also assessed with a pre- and post-laboratory Objective Structured Assessment of Technical Skills (OSATS). Results Eighty-two percent of the participants achieved a score of at least 80% on the IRAT. The CLQ instrument was validated in the study. There was a statistically significant improvement between the pre- and post-laboratory CLQ scores. PMID:24479815

  5. Combinatorial Approach for the Discovery of New Scintillating Materials SBIR Phase I Final Report DOE/ER/84310

    Cronin, J.P.; Agrawal, A.; Tonazzi, J.C.


    The combinatorial approach for the discovery of new scintillating materials has been investigated using wet-chemical (sol-gel) synthesis methods. Known scintillating compounds Lu2SiO5 (LSO) and (LuAl)O3 (LAO) and solid solutions in the systems Lu2O3-Y2O3-SiO2 (CeO2-doped) (LYSO) and Lu2O3-Y2O3-Al2O3 (CeO2-doped) (LYAO) were synthesized from sol-gel precursors. Sol-gel precursors were formulated from alkoxides and from nitrates and acetates of the cations. Sol-gel solution precursors were formulated for the printing of microdot arrays of different compositions in the above oxide systems. Microdot arrays were successfully printed on C-cut and R-cut sapphire substrates using a Biodot printer at Los Alamos National Laboratory (LANL). The microdot arrays were adherent and stable after heat treatment at 1665 °C and had an average thickness of around 2 µm. X-ray fluorescence elemental mapping showed the arrays to be of the correct chemical composition. Sintered microdots were found to be highly crystalline by microscopic observation and X-ray diffraction. Scintillation was not clearly detectable by visual observation under UV illumination or by video observation under the scanning electron beam of an SEM. The microdots were either poorly scintillating or not scintillating under the present synthesis and testing conditions. Further improvements in the synthesis and processing of the microdot arrays as well as extensive scintillation testing are needed.

  6. Filling Knowledge Gaps in Biological Networks: integrating global approaches to understand H2 metabolism in Chlamydomonas reinhardtii - Final Report

    Posewitz, Matthew C


    The green alga Chlamydomonas reinhardtii (Chlamydomonas) has numerous genes encoding enzymes that function in fermentative pathways. Among these genes, are the [FeFe]-hydrogenases, pyruvate formate lyase, pyruvate ferredoxin oxidoreductase, acetate kinase, and phosphotransacetylase. We have systematically undertaken a series of targeted mutagenesis approaches to disrupt each of these key genes and omics techniques to characterize alterations in metabolic flux. Funds from DE-FG02-07ER64423 were specifically leveraged to generate mutants with disruptions in the genes encoding the [FeFe]-hydrogenases HYDA1 and HYDA2, pyruvate formate lyase (PFL1), and in bifunctional alcohol/aldehyde alcohol dehydrogenase (ADH1). Additionally funds were used to conduct global transcript profiling experiments of wildtype Chlamydomonas cells, as well as of the hydEF-1 mutant, which is unable to make H2 due to a lesion in the [FeFe]-hydrogenase biosynthetic pathway. In the wildtype cells, formate, acetate and ethanol are the dominant fermentation products with traces of CO2 and H2 also being produced. In the hydEF-1 mutant, succinate production is increased to offset the loss of protons as a terminal electron acceptor. In the pfl-1 mutant, lactate offsets the loss of formate production, and in the adh1-1 mutant glycerol is made instead of ethanol. To further probe the system, we generated a double mutant (pfl1-1 adh1) that is unable to synthesize both formate and ethanol. This strain, like the pfl1 mutants, secreted lactate, but also exhibited a significant increase in the levels of extracellular glycerol, acetate, and intracellular reduced sugars, and a decline in dark, fermentative H2 production. Whereas wild-type Chlamydomonas fermentation primarily produces formate and ethanol, the double mutant performs a complete rerouting of the glycolytic carbon to lactate and glycerol. Lastly, transcriptome data have been analysed for both the wildtype and hydEF-1, that correlate with our

  7. Final Project Report: Development of Micro-Structural Mitigation Strategies for PEM Fuel Cells: Morphological Simulations and Experimental Approaches

    Wessel, Silvia [Ballard Materials Products; Harvey, David [Ballard Materials Products


    The durability of PEM fuel cells is a primary requirement for large scale commercialization of these power systems in transportation and stationary market applications that target operational lifetimes of 5,000 hours and 40,000 hours by 2015, respectively. Key degradation modes contributing to fuel cell lifetime limitations have been largely associated with the platinum-based cathode catalyst layer. Furthermore, as fuel cells are driven to low cost materials and lower catalyst loadings in order to meet the cost targets for commercialization, the catalyst durability has become even more important. While over the past few years significant progress has been made in identifying the underlying causes of fuel cell degradation and key parameters that greatly influence the degradation rates, many gaps with respect to knowledge of the driving mechanisms still exist; in particular, the acceleration of the mechanisms due to different structural compositions and under different fuel cell conditions remains an area not well understood. The focus of this project was to address catalyst durability by using a dual path approach that coupled an extensive range of experimental analysis and testing with a multi-scale modeling approach. With this, the major technical areas/issues of catalyst and catalyst layer performance and durability that were addressed are: 1. Catalyst and catalyst layer degradation mechanisms (Pt dissolution, agglomeration, Pt loss, e.g. Pt in the membrane, carbon oxidation and/or corrosion). a. Driving force for the different degradation mechanisms. b. Relationships between MEA performance, catalyst and catalyst layer degradation and operational conditions, catalyst layer composition, and structure. 2. Materials properties a. Changes in catalyst, catalyst layer, and MEA materials properties due to degradation. 3. Catalyst performance a. Relationships between catalyst structural changes and performance. b. 
Stability of the three-phase boundary and its effect on

  8. Fast Pyrolysis Oil Stabilization: An Integrated Catalytic and Membrane Approach for Improved Bio-oils. Final Report

    Huber, George W.; Upadhye, Aniruddha A.; Ford, David M.; Bhatia, Surita R.; Badger, Phillip C.


    This University of Massachusetts, Amherst project, "Fast Pyrolysis Oil Stabilization: An Integrated Catalytic and Membrane Approach for Improved Bio-oils", started on 1 February 2009 and finished on 31 August 2011. The project consisted of the following tasks: Task 1.0: Char Removal by Membrane Separation Technology. The presence of char particles in the bio-oil causes problems in storage and end-use. Currently there is no well-established technology to remove char particles less than 10 microns in size. This study focused on the application of a liquid-phase microfiltration process to remove char particles from bio-oil down to slightly sub-micron levels. Tubular ceramic membranes of nominal pore sizes 0.5 and 0.8 µm were employed to carry out the microfiltration, which was conducted in cross-flow mode at temperatures ranging from 38 to 45 °C and at three different trans-membrane pressures varying from 1 to 3 bar. The results demonstrated the removal of the major quantity of char particles with a significant reduction in the overall ash content of the bio-oil. The results clearly showed that the cake-formation mechanism of fouling is predominant in this process. Task 2.0: Acid Removal by Membrane Separation Technology. The feasibility of removing small organic acids from the aqueous fraction of fast pyrolysis bio-oils using nanofiltration (NF) and reverse osmosis (RO) membranes was studied. Experiments were carried out with single-solute solutions of acetic acid and glucose, binary-solute solutions containing both acetic acid and glucose, and a model aqueous fraction of bio-oil (AFBO). Retention factors above 90% for glucose and below 0% for acetic acid were observed at feed pressures near 40 bar for single and binary solutions, so that their separation in the model AFBO was expected to be feasible. However, all of the membranes were irreversibly damaged when experiments were conducted with the model AFBO due to the presence of guaiacol in the feed solution. Experiments
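The retention factors quoted above follow the usual observed-rejection definition, which is simple to state (a sketch; the report may use a slightly different convention):

```python
# Observed retention (rejection) factor for a membrane separation:
# R = 1 - C_permeate / C_feed. R near 1 means the solute is held back;
# R < 0 (as reported for acetic acid) means it is enriched in the permeate.
def retention(c_permeate, c_feed):
    return 1.0 - c_permeate / c_feed
```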

  9. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Daniel Durstewitz


    Full Text Available The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover
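The generative model behind this state space approach can be sketched in a few lines: a latent PLRNN with a linear (memory) term, a ReLU-coupled piecewise-linear term, and a linear observation model. All parameter values below are toy choices for illustration:

```python
# Toy forward simulation of a PLRNN state-space model:
#   z_t = A z_{t-1} + W relu(z_{t-1}) + h + process noise
#   x_t = B z_t                      (linear observation)
import numpy as np

rng = np.random.default_rng(0)
M, T = 3, 100                        # latent dimension, number of time steps
A = 0.8 * np.eye(M)                  # diagonal linear "memory" term
W = 0.1 * rng.standard_normal((M, M))
np.fill_diagonal(W, 0.0)             # off-diagonal piecewise-linear coupling
h = 0.05 * np.ones(M)                # constant drive

z = np.zeros((T, M))
for t in range(1, T):
    z[t] = (A @ z[t - 1] + W @ np.maximum(z[t - 1], 0.0) + h
            + 0.01 * rng.standard_normal(M))   # latent process noise
B = rng.standard_normal((2, M))      # observation matrix
x = z @ B.T                          # observed series, shape (T, 2)
```

The estimation problem the abstract addresses is the inverse of this simulation: infer z, A, W, h, B from x alone.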

  10. VNI version 4.1. Simulation of high-energy particle collisions in QCD: Space-time evolution of e+e-... A + B collisions with parton-cascades, cluster-hadronization, final-state hadron cascades

    Geiger, K.; Longacre, R.


    VNI is a general-purpose Monte-Carlo event-generator, which includes the simulation of lepton-lepton, lepton-hadron, lepton-nucleus, hadron-hadron, hadron-nucleus, and nucleus-nucleus collisions. It uses the real-time evolution of parton cascades in conjunction with a self-consistent hadronization scheme, as well as the development of hadron cascades after hadronization. The causal evolution from a specific initial state (determined by the colliding beam particles) is followed by the time-development of the phase-space densities of partons, pre-hadronic parton clusters, and final-state hadrons, in position-space, momentum-space and color-space. The parton-evolution is described in terms of a space-time generalization of the familiar momentum-space description of multiple (semi)hard interactions in QCD, involving 2 → 2 parton collisions, 2 → 1 parton fusion processes, and 1 → 2 radiation processes. The formation of color-singlet pre-hadronic clusters and their decays into hadrons, on the other hand, is treated by using a spatial criterion motivated by confinement and a non-perturbative model for hadronization. Finally, the cascading of produced prehadronic clusters and of hadrons includes a multitude of 2 → n processes, and is modeled in parallel to the parton cascade description. This paper gives a brief review of the physics underlying VNI, as well as a detailed description of the program itself. The latter program description emphasizes easy-to-use pragmatism and explains how to use the program (including simple examples), annotates input and control parameters, and discusses output data provided by it

  11. Helios1A EoL: A Success. For the first Time a Long Final Thrust Scenario, Respecting the French Law on Space Operations

    Guerry, Agnes; Moussi, Aurelie; Sartine, Christian; Beaumet, Gregory


    HELIOS1A End Of Life (EOL) operations occurred in early 2012. Through this EOL operation, CNES wanted to set an example of French Space Act compliance. Because the satellite was not natively designed for such an EOL phase, the operation was delicate and risky. It was organized as a full project in order to assess every scenario detail with a dedicated mission analysis, to secure the operations through a detailed risk analysis at system level, and to consider the major failures that could occur during the EOL. A short scenario allowing several objectives to be reached with benefits was eventually selected. The main objective of this project was to preserve the space environment. The operations were led on a "best effort" basis. The French Space Operations Act (FSOA) requirements were met: HELIOS-1A EOL operations were completed successfully.

  12. Path integral approach for superintegrable potentials on spaces of non-constant curvature. Pt. 1. Darboux spaces D_I and D_II

    Grosche, C. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik]; Pogosyan, G.S. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics; Guadalajara Univ., Jalisco (Mexico). Dept. de Matematicas CUCEI]; Sissakian, A.N. [Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics]


    In this paper the Feynman path integral technique is applied to superintegrable potentials on two-dimensional spaces of non-constant curvature: these spaces are the Darboux spaces D_I and D_II, respectively. On D_I there are three and on D_II four such potentials. We are able to evaluate the path integral in most of the separating coordinate systems, leading to expressions for the Green functions, the discrete and continuous wave-functions, and the discrete energy spectra. In some cases, however, the discrete spectrum cannot be stated explicitly, because it is either determined by a transcendental equation involving parabolic cylinder functions (Darboux space I), or by a higher-order polynomial equation. The solutions on D_I in particular show that superintegrable systems are not necessarily degenerate. We can also show how the limiting cases of flat space (constant curvature zero) and the two-dimensional hyperboloid (constant negative curvature) emerge. (orig.)

  13. Cell genetic processes under space flight conditions: Analysis of two-factor crosses between spore color mutants of Sordaria macrospora. Final report

    Hock, B.; Hahn, A.


    The purpose of the FUNGUS experiment on S/MM05 was to examine the effects of space flight conditions on the hereditary transmission of the spore color genes. The controls consisted of one further experiment in space with a centrifuge and 1 x g acceleration, and a gravitational reference experiment. A statistical analysis revealed no significant differences attributable to the absence of gravitational effects. A significant increase, however, was observed in the recombination frequencies, due to the fraction of HZE particles in the cosmic radiation. Gravitational reference experiments showed a dose-dependent effect of heavy-ion particle radiation on the post-reduction frequency and thus on the calculated distances between the genes, higher radiation doses increasing the post-reduction frequency. It was possible to derive dose-response curves for comparison with X-radiation and determination of the RBE of the heavy-ion radiation with respect to the calculated distances between the genes lu and r2. The mycelium of the fungi of the space flight experiment was examined for DNA strand breaks at the molecular level by means of a single-cell gel electrophoresis assay. No genetic damage could be detected in the specimens of the experiment in space. Attempts at DNA repair in S. macrospora reveal that most of the damage is healed within a few hours. It was possible to determine the maximum doses of ionizing and non-ionizing radiation up to which DNA repair is possible. (orig./CB)
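The link between the post-reduction frequency and the "calculated distance" mentioned above is the standard tetrad-analysis formula for gene-centromere distance, sketched here (the study may apply corrections for multiple crossovers):

```python
# Standard tetrad-analysis mapping: gene-centromere distance in centimorgans
# is half the percentage of asci showing second-division (post-reduction)
# segregation, because only two of the four chromatids are recombinant.
def gene_centromere_distance(post_reduction_freq):
    """post_reduction_freq as a fraction (0..1); returns map distance in cM."""
    return 0.5 * post_reduction_freq * 100.0
```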


    Crowe, Benjamin J. III


    Nucleon-deuteron (Nd) breakup is an important tool for obtaining a better understanding of three-nucleon (3N) dynamics and for developing meson exchange descriptions of nuclear systems. The kinematics of the nd breakup reaction enable observables to be studied in a variety of exit-channel configurations that show sensitivity to realistic nucleon-nucleon (NN) potential models and three-nucleon force (3NF) models. Rigorous 3N calculations give very good descriptions of most 3N reaction data. However, there are still some serious discrepancies between data and theory. The largest discrepancy observed between theory and data for nd breakup is for the cross section for the space-star configuration. This discrepancy is known as the 'Space Star Anomaly'. Several experimental groups have obtained results consistent with the 'Space Star Anomaly', but it is important to note that they all used essentially the same experimental setup and so their experimental results are subject to the same systematic errors. We propose to measure the space-star cross-section at the Triangle Universities Nuclear Laboratory (TUNL) using an experimental technique that is significantly different from the one used in previous breakup experiments. This technique has been used by a research group from the University of Bonn to measure the neutron-neutron scattering length. There are three possible scenarios for the outcome of this work: (1) the new data are consistent with previous measurements; (2) the new data are not in agreement with previous measurements, but are in agreement with theory; and (3) the new data are not in agreement with either theory or previous measurements. Any one of the three scenarios will provide valuable insight on the Space Star Anomaly.

  15. Knowledge-based approach for functional MRI analysis by SOM neural network using prior labels from Talairach stereotaxic space

    Erberich, Stephan G.; Willmes, Klaus; Thron, Armin; Oberschelp, Walter; Huang, H. K.


    Among the methods proposed for the analysis of functional MR we have previously introduced a model-independent analysis based on the self-organizing map (SOM) neural network technique. The SOM neural network can be trained to identify the temporal patterns in voxel time-series of individual functional MRI (fMRI) experiments. The separated classes consist of activation, deactivation and baseline patterns corresponding to the task-paradigm. While the classification capability of the SOM is not only based on the distinctness of the patterns themselves but also on their frequency of occurrence in the training set, a weighting or selection of voxels of interest should be considered prior to the training of the neural network to improve pattern learning. Weighting of interesting voxels by means of autocorrelation or F-test significance levels has been used successfully, but still a large number of baseline voxels is included in the training. The purpose of this approach is to avoid the inclusion of these voxels by using three different levels of segmentation and mapping from Talairach space: (1) voxel partitions at the lobe level, (2) voxel partitions at the gyrus level and (3) voxel partitions at the cell level (Brodmann areas). The results of the SOM classification based on these mapping levels in comparison to training with all brain voxels are presented in this paper.
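The SOM training step at the heart of this classification can be sketched as a minimal 1-D map over pattern vectors. This is a generic SOM with toy data and illustrative decay schedules, not the authors' implementation; real use would feed voxel time courses restricted to the Talairach partitions described above:

```python
# Minimal 1-D self-organizing map: each unit's weight vector is pulled
# toward presented samples, with a neighborhood that shrinks over training.
import numpy as np

def train_som(data, n_units=6, n_iter=500, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    w = 0.1 * rng.standard_normal((n_units, data.shape[1]))
    for it in range(n_iter):
        x = data[rng.integers(len(data))]               # random training sample
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
        frac = it / n_iter
        lr = lr0 * (1.0 - frac)                         # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5             # shrinking neighborhood
        d = np.arange(n_units) - bmu
        nh = np.exp(-d**2 / (2.0 * sigma**2))           # Gaussian neighborhood
        w += lr * nh[:, None] * (x - w)
    return w
```

After training, distinct temporal patterns (e.g. activation vs. baseline) map to different best-matching units.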

  16. Development of An Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used by Non-EPA Decision Makers (Final Contractor Report)

    EPA announced the availability of the final contractor report entitled, Development of an Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used By Non EPA Decision Makers. This contractor report analyzed how ...

  17. An Accelerated Development, Reduced Cost Approach to Lunar/Mars Exploration Using a Modular NTR-Based Space Transportation System

    Borowski, S.; Clark, J.; Sefcik, R.; Corban, R.; Alexander, S.


    The results of integrated systems and mission studies are presented which quantify the benefits and rationale for developing a common, modular lunar/Mars space transportation system (STS) based on nuclear thermal rocket (NTR) technology. At present NASA's Exploration Program Office (ExPO) is considering chemical propulsion for an 'early return to the Moon' and NTR propulsion for the more demanding Mars missions to follow. The time and cost to develop these multiple systems are expected to be significant. The Nuclear Propulsion Office (NPO) has examined a variety of lunar and Mars missions and heavy lift launch vehicle (HLLV) options in an effort to determine a 'standardized' set of engine and stage components capable of satisfying a wide range of Space Exploration Initiative (SEI) missions. By using these components in a 'building block' fashion, a variety of single and multi-engine lunar and Mars vehicles can be configured. For NASA's 'First Lunar Outpost' (FLO) mission, an expendable NTR stage powered by two 50 klbf engines can deliver approximately 96 metric tons (t) to translunar injection (TLI) conditions for an initial mass in low earth orbit (IMLEO) of approximately 198 t compared to 250 t for a cryogenic chemical TLI stage. The NTR stage liquid hydrogen (LH2) tank has a 10 m diameter, 14.5 m length, and 66 t LH2 capacity. The NTR utilizes a UC-ZrC-graphite 'composite' fuel with a specific impulse (Isp) capability of approximately 900 s and an engine thrust-to-weight ratio of approximately 4.3. By extending the size and LH2 capacity of the lunar NTR stage to approximately 20 m and 96 t, respectively, a single launch Mars cargo vehicle capable of delivering approximately 50 t of surface payload is possible. Three 50 klbf NTR engines and the two standardized LH2 tank sizes developed for lunar and Mars cargo vehicle applications would be used to configure the Mars piloted vehicle for a mission as early as 2010. The paper describes the features of the 'common
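The quoted mass figures can be roughly sanity-checked with the ideal rocket equation and the stated Isp of about 900 s. The TLI delta-v of ~3.1 km/s used below is an assumed, typical LEO-to-TLI value, not a number from the text:

```python
# Ideal rocket equation mass ratio: m_initial / m_final = exp(dv / (Isp * g0)).
import math

def mass_ratio(delta_v, isp, g0=9.80665):
    """delta_v in m/s, isp in seconds; returns initial/final mass ratio."""
    return math.exp(delta_v / (isp * g0))

r = mass_ratio(3100.0, 900.0)          # assumed TLI burn at Isp = 900 s
prop_fraction = 1.0 - 1.0 / r          # propellant share of initial mass
```

This gives a mass ratio near 1.42, i.e. roughly 30% of the ~198 t IMLEO burned as propellant (about 59 t), consistent in rough magnitude with the 66 t LH2 load once gravity and boil-off losses are allowed for.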

  18. Advancing Space Sciences through Undergraduate Research Experiences at UC Berkeley's Space Sciences Laboratory - a novel approach to undergraduate internships for first generation community college students

    Raftery, C. L.; Davis, H. B.; Peticolas, L. M.; Paglierani, R.


    The Space Sciences Laboratory at UC Berkeley launched an NSF-funded Research Experience for Undergraduates (REU) program in the summer of 2015. The "Advancing Space Sciences through Undergraduate Research Experiences" (ASSURE) program recruited heavily from local community colleges and universities, and provided a multi-tiered mentorship program for students in the fields of space science and engineering. The program focused on providing a supportive environment for 2nd- and 3rd-year undergraduates, many of whom were first-generation and underrepresented students. The model provided three levels of mentorship support for the participating interns: 1) the primary research advisor provided academic and professional support; 2) the program coordinator, who met with the interns multiple times per week, provided personal support and helped the interns assimilate into the highly competitive environment of the research laboratory; 3) returning undergraduate interns provided peer support and guidance to the new cohort of students. The impacts of this program on the first-generation students and the research mentors, as well as the lessons learned, will be discussed.

  19. Design spaces


    Digital technologies and media are becoming increasingly embodied and entangled in the spaces and places of work and home. However, our material environment is more than a geometric abstraction of space: it contains familiar places, social arenas for human action. For designers, the integration of digital technology with space poses new challenges that call for new approaches. Creative alternatives to traditional systems methodologies are called for when designers use digital media to create new possibilities for action in space. Design Spaces explores how design and media art can provide creative alternatives for integrating digital technology with space. Connecting practical design work with conceptual development and theorizing, art with technology, and user-centered methods with the social sciences, Design Spaces provides a useful research paradigm for designing ubiquitous computing. This book...

  20. An approach to developing the market for space shuttle payloads: Business/public policy issues and international marketing considerations

    Krebs, W. A. W.


    The business and public policies were assessed that were determined to be important for NASA to consider in the design of a program for stimulating use of the space transportation system (STS) among potential users in the U.S. private sector and in foreign countries, in preparation for operations of the space shuttle in the early 1980's. Salient factors related to international cooperation in space are identified for special consideration in the development of user potential of the STS.

  1. A systematic approach to the application of Automation, Robotics, and Machine Intelligence Systems /ARAMIS/ to future space projects

    Smith, D. B. S.


    The potential applications of Automation, Robotics, and Machine Intelligence Systems (ARAMIS) to space projects are investigated, through a systematic method. In this method selected space projects are broken down into space project tasks, and 69 of these tasks are selected for study. Candidate ARAMIS options are defined for each task. The relative merits of these options are evaluated according to seven indices of performance. Logical sequences of ARAMIS development are also defined. Based on this data, promising applications of ARAMIS are identified for space project tasks.

  2. Laboratory Evaluation of Gas-Fired Tankless and Storage Water Heater Approaches to Combination Water and Space Heating

    Kingston, T. [Gas Technology Inst., Des Plaines, IL (United States); Scott, S. [Gas Technology Inst., Des Plaines, IL (United States)


    Homebuilders are exploring more cost-effective combined space and water heating systems (combo systems) with major water heater manufacturers that offer pre-engineered forced-air space heating combo systems. In this project, unlike standardized tests, laboratory tests subjected condensing tankless and storage water heater based combo systems to realistic, coincident space and domestic hot water loads. Among other key findings, the tankless combo system maintained more stable DHW and space heating temperatures than the storage combo system.

  3. Systematic approach to the application of automation, robotics, and machine intelligence systems (aramis) to future space projects

    Smith, D B.S.


    The potential applications of automation, robotics and machine intelligence systems (ARAMIS) to space projects are investigated, through a systematic method. In this method selected space projects are broken down into space project tasks, and 69 of these tasks are selected for study. Candidate ARAMIS options are defined for each task. The relative merits of these options are evaluated according to seven indices of performance. Logical sequences of ARAMIS development are also defined. Based on this data, promising applications of ARAMIS are identified for space project tasks. General conclusions and recommendations for further study are also presented. 6 references.

  4. A Nanotechnology Approach to Lightweight Multifunctional Polyethylene Composite Materials for Use Against the Space Environment, Phase I

    National Aeronautics and Space Administration — Polyethylene-based composite materials are under consideration as multifunctional structural materials, with the expectation that they can provide radiation...

  5. A Novel Approach for Microgrid Protection Based upon Combined ANFIS and Hilbert Space-Based Power Setting

    Ali Hadi Abdulwahid


    Nowadays, the use of distributed generation (DG) has increased because of benefits such as increased reliability, reduced losses, improved line capacity, and less environmental pollution. The protection of microgrids, which consist of generation sources, is one of the most crucial concerns of distribution operators. A key issue in this field is the protection of microgrids against permanent and temporary failures by improving the safety and reliability of the network; the traditional method has a number of disadvantages. The reliability and stability of a power system in a microgrid depend to a great extent on the efficiency of the protection scheme. Artificial Intelligence approaches were recently introduced into the protection of distribution networks. The fault detection method depends on a differential relay based on Hilbert Space-Based Power (HSBP) theory to achieve the fastest primary protection. It is backed up by a total harmonic distortion (THD) detection method that takes over in case of a failure in the primary method; the backup protection is completely independent of the main protection, which is rarely attained in practice. This paper proposes a new algorithm to improve protection performance by means of an adaptive network-based fuzzy inference system (ANFIS). One advantage of this algorithm is that the protection system operates in fewer than two cycles after the occurrence of the fault. Another is that fault detection does not depend on the selection of threshold values, and all types of internal fault can be identified; the algorithm operates correctly for all fault types while preventing unwanted tripping, even if the data are distorted by current transformer (CT) saturation or by data mismatches. The simulation results show that the proposed circuit can identify the faulty phase in the microgrid quickly and
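
    The THD backup criterion mentioned in this record is straightforward to compute from a sampled waveform: the RMS of the harmonic content divided by the RMS of the fundamental. The sketch below is a generic illustration of that quantity, not the paper's relay logic; the sample rate, grid frequency, injected harmonic, and trip threshold are all assumed values.

    ```python
    import numpy as np

    def thd(signal, fs, f0):
        """Total harmonic distortion of a sampled waveform.

        Ratio of the RMS of harmonics 2..N to the RMS of the fundamental,
        estimated from an FFT. Assumes an integer number of fundamental
        cycles in the window.
        """
        spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
        df = fs / len(signal)                          # frequency resolution, Hz
        fund_bin = int(round(f0 / df))
        fundamental = spectrum[fund_bin]
        harmonics = spectrum[2 * fund_bin::fund_bin]   # bins at 2*f0, 3*f0, ...
        return np.sqrt(np.sum(harmonics ** 2)) / fundamental

    fs, f0 = 3200.0, 50.0                 # sample rate and grid frequency (assumed)
    t = np.arange(0, 0.2, 1 / fs)         # ten full 50 Hz cycles
    clean = np.sin(2 * np.pi * f0 * t)
    faulty = clean + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)  # injected 3rd harmonic

    THRESHOLD = 0.05                      # illustrative trip level, not from the paper
    print(f"healthy THD: {thd(clean, fs, f0):.3f}")
    print(f"faulty  THD: {thd(faulty, fs, f0):.3f}  trip: {thd(faulty, fs, f0) > THRESHOLD}")
    ```

    A distorted waveform with a 20% third harmonic yields a THD near 0.2, well above the illustrative threshold, while the healthy waveform stays near zero.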

  6. Economic and industrial development. EID - EMPLOY. Final report. Task 1. Review of approaches for employment impact assessment of renewable energy deployment

    Breitschopf, Barbara [Fraunhofer-Institut fuer System- und Innovationsforschung (ISI), Karlsruhe (Germany); Nathani, Carsten; Resch, Gustav


    full picture of the impacts of RE deployment on the total economy - covering all economic activities such as production, services and consumption (industries, households). To get the number of additional jobs caused by RE deployment, these studies compare a situation without RE (the baseline or counterfactual) to a situation with strong RE deployment. In a second step, we characterize the studies, inter alia, by their scope, activities and impacts, and show the relevant positive and negative effects that are included in gross or net impact assessment studies. The effects are briefly described in Table 0-1. While gross studies mainly include the positive effects listed here, net studies in general include positive and negative effects. Third, we distinguish between methodological approaches for assessing impacts. We observe that the more effects are incorporated in an approach, the more data are needed, the more complex and demanding the methodological approach becomes, and the more the impacts capture effects of and in the whole economy - representing net impacts. A simple approach requires little data and allows answering simple questions concerning the impact on the RE industry - representing gross impacts. We identify six main approaches, three for gross and three for net impacts. They are depicted in Figure 0-2. The methodological approaches are characterized by the effects they capture, the complexity of the model and the additional data requirements (besides data on RE investments, capacities and generation), as well as by their depicted impacts reflecting economic comprehensiveness. A detailed overview of the diverse studies in table form is given in the Annex to this report. Finally, we suggest elaborating guidelines for the simple EF-approach, the gross IO-modelling approach and the net IO-modelling approach. The first enables policy makers to do a quick assessment of gross effects, while the second is a more sophisticated approach for gross effects.
    The third approach builds on the gross IO
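
    The "simple EF-approach" referred to in this record multiplies deployed RE capacity (or generation) by technology-specific employment factors to obtain gross job numbers. The sketch below illustrates that bookkeeping; all factors and capacities are hypothetical placeholders, not values from the report.

    ```python
    # Gross employment via the simple employment-factor (EF) approach:
    # jobs = installed capacity (or annual generation) x technology-specific factor.
    # All numbers below are illustrative placeholders, not values from the report.

    employment_factors = {   # O&M jobs per MW installed, hypothetical
        "wind_onshore": 0.3,
        "solar_pv": 0.4,
        "biomass": 1.5,
    }
    installed_mw = {"wind_onshore": 5000, "solar_pv": 3000, "biomass": 800}

    gross_jobs = {
        tech: installed_mw[tech] * ef
        for tech, ef in employment_factors.items()
    }
    print(gross_jobs)
    print("total gross O&M jobs:", sum(gross_jobs.values()))
    ```

    As the record notes, this approach answers only the "gross" question about the RE industry itself; displaced jobs elsewhere in the economy require the more demanding IO-modelling approaches.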

  7. Uncommon primary hydatid cyst occupying the adrenal gland space, treated with laparoscopic surgical approach in an old patient

    Aprea Giovanni


    Hydatid disease (HD) is caused by Echinococcus granulosus (EG), a larval parasite endemic in many underdeveloped areas. The most common target is the liver (59%-75%). The retroperitoneal space is considered a rare localization. We report an uncommon case of HD located in the adrenal gland space.

  8. Remote control systems for space heating. Product overview 2010 and recommendations - Final report; Fernsteuerungen fuer Raumheizungen. Produktuebersicht 2010 und Empfehlungen - Schlussbericht

    Geilinger, E.; Bush, E. [Bush Energie GmbH, Felsberg (Switzerland); Venzin, T. [Hochschule fuer Technik und Wirtschaft (HTW) Chur, Chur (Switzerland); Nipkow, J. [Arena, Zuerich (Switzerland)


    Saving space heating energy by remote control: A remote-controlled space heating system allows a person to lower the room temperature in homes that go unoccupied for periods of time to the lowest temperature that's safe to keep the pipes from freezing while they're away. Comfort is guaranteed because the desired room temperature or mode can be activated in time before the guests arrive, via text message, phone or the internet. As most people simply leave unoccupied homes heated, the remote-controlled system saves up to 70% of heating energy when used actively. Market overview and product features: This report presents remote control devices that are currently available on the market. Their advantages and disadvantages are discussed as well as their technical features and function. Most of them are universal remote controls that have various uses, including temperature control. The report also discusses requirements that not all the examined products meet. Some lack an emergency power supply, the possibility for manual control or the ability to check the current temperature of the home from a remote location. Better planning for remote control: The critical issue proved not to be the remote control device itself, but the heating systems. Unfortunately, they often don't provide an option to be extended by remote control. We therefore call on the manufacturers to equip all new heating systems with options for remote control. It would also be helpful and desirable to provide information on the internet or in the technical documentation on how to connect a remote control device and which products are suitable - both for existing and new heating systems. If the system cannot be retrofitted, it should be described whether and how a central remote control with room thermostat can be installed. Improving communication: In this study, remote control and heating suppliers were interviewed as well as planners, installers and users of remote-controlled heating. 
Their responses

  10. A state-space modeling approach to estimating canopy conductance and associated uncertainties from sap flux density data.

    Bell, David M; Ward, Eric J; Oishi, A Christopher; Oren, Ram; Flikkema, Paul G; Clark, James S


    Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as canopy conductance and transpiration. To address this need, we developed a hierarchical Bayesian State-Space Canopy Conductance (StaCC) model linking canopy conductance and transpiration to tree sap flux density from a 4-year experiment in the North Carolina Piedmont, USA. Our model builds on existing ecophysiological knowledge, but explicitly incorporates uncertainty in canopy conductance, internal tree hydraulics and observation error to improve estimation of canopy conductance responses to atmospheric drought (i.e., vapor pressure deficit), soil drought (i.e., soil moisture) and above canopy light. Our statistical framework not only predicted sap flux observations well, but it also allowed us to simultaneously gap-fill missing data as we made inference on canopy processes, marking a substantial advance over traditional methods. The predicted and observed sap flux data were highly correlated (mean sensor-level Pearson correlation coefficient = 0.88). Variations in canopy conductance and transpiration associated with environmental variation across days to years were many times greater than the variation associated with model uncertainties. Because some variables, such as vapor pressure deficit and soil moisture, were correlated at the scale of days to weeks, canopy conductance responses to individual environmental variables were difficult to interpret in isolation. Still, our results highlight the importance of accounting for uncertainty in models of ecophysiological and ecosystem function where the process of interest, canopy conductance in this case, is not observed directly. The StaCC modeling
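
    The StaCC model described here is a full hierarchical Bayesian framework; as a much simpler illustration of how a state-space model simultaneously smooths noisy observations and gap-fills missing ones, the sketch below runs a scalar random-walk Kalman filter over a signal with a missing stretch. This is a generic textbook mechanism, not the authors' model; the noise variances and the test signal are assumed.

    ```python
    import numpy as np

    def kalman_gapfill(obs, q=0.05, r=0.1):
        """Scalar random-walk Kalman filter that skips missing (NaN) observations.

        State model:  x_t = x_{t-1} + w,  w ~ N(0, q)
        Observation:  y_t = x_t + v,      v ~ N(0, r)
        Missing y_t are handled by propagating the prediction only,
        which is how state-space models gap-fill data.
        """
        x, p = obs[~np.isnan(obs)][0], 1.0   # initialise at first valid point
        estimates = []
        for y in obs:
            p = p + q                        # predict step: uncertainty grows
            if not np.isnan(y):              # update only when data exist
                k = p / (p + r)              # Kalman gain
                x = x + k * (y - x)
                p = (1 - k) * p
            estimates.append(x)
        return np.array(estimates)

    rng = np.random.default_rng(0)
    truth = np.sin(np.linspace(0, 2 * np.pi, 50))    # stand-in for a diurnal signal
    obs = truth + rng.normal(0, 0.1, 50)             # noisy "sensor" readings
    obs[20:25] = np.nan                              # a gap to fill
    est = kalman_gapfill(obs)
    print("gap estimates:", np.round(est[20:25], 2))
    ```

    The hierarchical Bayesian version additionally propagates parameter uncertainty into the filled values, which is what lets StaCC report credible intervals on canopy conductance rather than point estimates.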

  11. Robust Constrained Optimization Approach to Control Design for International Space Station Centrifuge Rotor Auto Balancing Control System

    Postma, Barry D


    ...) for a centrifuge rotor to be implemented on the International Space Station. The design goal is to minimize a performance objective of the system, while guaranteeing stability and proper performance for a range of uncertain plants...

  12. Computational Approaches for Developing Active Radiation Dosimeters for Space Applications Based on New Paradigms for Risk Assessment

    National Aeronautics and Space Administration — Exposure to ionizing radiation can cause acute injury or sickness in humans under circumstances of very large doses and it presents the possibility of causing cancer...

  13. Qubits in phase space: Wigner-function approach to quantum-error correction and the mean-king problem

    Paz, Juan Pablo; Roncaglia, Augusto Jose; Saraceno, Marcos


    We analyze and further develop a method to represent the quantum state of a system of n qubits in a phase-space grid of N×N points (where N = 2^n). The method, which was recently proposed by Wootters and co-workers (Gibbons et al., Phys. Rev. A 70, 062101 (2004)), is based on the use of the elements of the finite field GF(2^n) to label the phase-space axes. We present a self-contained overview of the method, give insights into some of its features, and apply it to investigate problems of interest for quantum-information theory: we analyze the phase-space representation of stabilizer states and quantum error-correction codes and present a phase-space solution to the so-called mean-king problem.
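
    The phase-space labeling described here rests on arithmetic in the finite field GF(2^n). As a minimal illustration for n = 2 (a two-qubit, 4×4 grid), the sketch below implements multiplication in GF(4) as carry-less polynomial multiplication reduced modulo the irreducible polynomial x^2 + x + 1; this is standard finite-field arithmetic, not code from the paper.

    ```python
    def gf_mul(a, b, irreducible=0b111, n=2):
        """Multiply two elements of GF(2^n), here GF(4) with x^2 + x + 1.

        Elements are bit patterns of polynomials over GF(2); addition is XOR.
        """
        result = 0
        for i in range(n):                   # carry-less (XOR) multiplication
            if (b >> i) & 1:
                result ^= a << i
        for shift in range(n - 1, -1, -1):   # reduce modulo the irreducible polynomial
            if result & (1 << (n + shift)):
                result ^= irreducible << shift
        return result

    # multiplication table of GF(4): elements 0, 1, x (=2), x+1 (=3)
    for a in range(4):
        print([gf_mul(a, b) for b in range(4)])
    ```

    Every nonzero element has a multiplicative inverse, which is the property that lets the field elements serve as consistent labels for the lines ("striations") of the phase-space grid.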

  14. Towards a results-based management approach for capacity-building in space science, technology and applications to support the implementation of the 2030 agenda for sustainable development

    Balogh, Werner R.; St-Pierre, Luc; Di Pippo, Simonetta


    The United Nations Office for Outer Space Affairs (UNOOSA) has the mandate to assist Member States with building capacity in using space science, technology and their applications in support of sustainable economic, social and environmental development. From 20 to 21 June 2018 the international community will gather in Vienna for UNISPACE+50, a special segment of the 61st session of the Committee on the Peaceful Uses of Outer Space (COPUOS), to celebrate the 50th anniversary of the first UNISPACE conference and to reach consensus on a global space agenda for the next two decades. "Capacity-building for the twenty-first century" is one of the seven thematic priorities of UNISPACE+50, identified and agreed upon by COPUOS. The Committee has tasked UNOOSA with undertaking the work under this thematic priority and with reporting regularly to the Committee and its Subcommittees on the progress of its work. It is therefore appropriate, in this context, to take stock of the achievements of the capacity-building activities of the Office, to review the relevant mandates and activities, and to consider the necessity of strengthening and better aligning them with future needs worldwide, in particular with the 2030 Agenda for Sustainable Development. This paper describes the efforts on-going at UNOOSA, building on its experiences with implementing the United Nations Programme on Space Applications and the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER) and working with Member States and other United Nations entities, to develop a results-based management approach, based on an indicator framework and a database of space solutions, for promoting the use of space-based solutions to help Member States achieve the Sustainable Development Goals (SDGs) and successfully implement the 2030 Agenda for Sustainable Development.

  15. Signal Region Optimisation Studies Based on BDT and Multi-Bin Approaches in the Context of Supersymmetry Searches in Hadronic Final States with the ATLAS Detector

    AUTHOR|(CDS)2097636; Makovec, Nikola; Rúriková, Zuzana

    The searches for supersymmetric phenomena are mostly based on simple Cut & Count methods. One example is the search for squarks and gluinos in final states with multiple jets, missing transverse momentum and no leptons. This analysis, based on $36.1\,\text{fb}^{-1}$ of $pp$ collision data at $\sqrt{s}$ = 13 TeV recorded with the ATLAS detector, uses Cut & Count based methods in the signal regions. In order to improve the analysis sensitivity, the use of sophisticated techniques, such as boosted decision trees (BDT) and Multi-Bin fits, is investigated in this thesis. The focus of the study lies on squark and gluino searches. These techniques are evaluated using Monte Carlo simulation. The goal is to find a new approach which is simple yet allows for a significant improvement. A gain of up to approximately 200 GeV in the neutralino mass and an enhancement of about 200 GeV in the squark and gluino mass are achieved with these new techniques.

  16. An approach for generating trajectory-based dynamics which conserves the canonical distribution in the phase space formulation of quantum mechanics. II. Thermal correlation functions.

    Liu, Jian; Miller, William H


    We show the exact expression of the quantum mechanical time correlation function in the phase space formulation of quantum mechanics. The trajectory-based dynamics that conserves the quantum canonical distribution - equilibrium Liouville dynamics (ELD), proposed in Paper I - is then used to approximately evaluate the exact expression. It gives exact thermal correlation functions (even of nonlinear operators, i.e., nonlinear functions of position or momentum operators) in the classical, high temperature, and harmonic limits. Various methods have been presented for the implementation of ELD. Numerical tests of the ELD approach in the Wigner or Husimi phase space have been made for a harmonic oscillator and two strongly anharmonic model problems; for each potential, autocorrelation functions of both linear and nonlinear operators have been calculated. This suggests that ELD can be a potentially useful approach for describing quantum effects for complex systems in the condensed phase.

  17. Acoustic Separation Technology - Final Report

    Fred Ahrens; Tim Patterson


    Today's restrictive environmental regulations encourage paper mills to close their water systems. Closed water systems increase the level of contaminants significantly. Accumulations of suspended solids are detrimental to both the papermaking process and the final products. To remove these solids, technologies such as dissolved air flotation (DAF), centrifuging, and screening have been developed. Dissolved air flotation systems are commonly used to clarify whitewater. These passive systems use high pressure to dissolve air into whitewater. When the pressure is released, air micro-bubbles form and attach themselves to fibers and particles, which then float to the surface where they are mechanically skimmed off. There is an economic incentive to explore alternatives to the DAF technology to drive down the cost of whitewater processing and minimize the use of chemicals: the installed capital cost for a DAF system is significant, and a typical DAF system takes up considerable space. An alternative approach, which is the subject of this project, involves a dual method combining the advantages of chemical flocculation and in-line ultrasonic clarification to efficiently remove flocculated contaminants from a water stream.

  18. Parametric analysis as a methodical approach that facilitates the exploration of the creative space in low-energy and zero-energy design projects

    Hansen, Hanne Tine Ring; Knudstrup, Mary-Ann


    Most, if not all, methodical approaches found in the literature that describe the creation of environmentally sustainable architecture agree on the importance of inter-disciplinarity, the early integration of building system strategies, and the comfort of the user. The paper discusses professional differences between two of the main actors involved in integrated design processes, engineers and architects, as well as a methodical approach to the investigation and delimitation of the creative space of a specific low-energy or zero-energy building design project.

  19. Detecting space-time disease clusters with arbitrary shapes and sizes using a co-clustering approach

    Sami Ullah


    The ability to detect potential space-time clusters in spatio-temporal data on disease occurrences is necessary for conducting surveillance and implementing disease prevention policies. Most existing techniques use geometrically shaped (circular, elliptical or square) scanning windows to discover disease clusters. In certain situations, where the disease occurrences tend to cluster in very irregularly shaped areas, these algorithms are not feasible in practice for the detection of space-time clusters. To address this problem, a new algorithm is proposed, which uses a co-clustering strategy to detect prospective and retrospective space-time disease clusters with no restriction on shape and size. The proposed method detects space-time disease clusters by tracking the changes in the space-time occurrence structure instead of an in-depth search over space. This method was used to detect potential clusters in the annual and monthly malaria data of Khyber Pakhtunkhwa Province, Pakistan from 2012 to 2016, visualising the results on a heat map. The results of the annual data analysis showed that the most likely hotspot emerged in three sub-regions in the years 2013-2014. The most likely hotspots in the monthly data appeared in the months of July to October in each year and showed a strong periodic trend.

  20. Laboratory Evaluation of Gas-Fired Tankless and Storage Water Heater Approaches to Combination Water and Space Heating

    Kingston, T.; Scott, S.


    Homebuilders are exploring more cost-effective combined space and water heating systems (combo systems) with major water heater manufacturers that offer pre-engineered forced-air space heating combo systems. In this project, unlike standardized tests, laboratory tests subjected condensing tankless and storage water heater based combo systems to realistic, coincident space and domestic hot water loads, with the following key findings: 1) The tankless combo system maintained more stable DHW and space heating temperatures than the storage combo system. 2) The tankless combo system consistently achieved better daily efficiencies (84%-93%) than the storage combo system (81%-91%) when the air handler was sized adequately and adjusted properly to achieve significant condensing operation. When condensing operation was not achieved, both systems performed with lower (75%-88%), but similar, efficiencies. 3) Air handlers currently packaged with combo systems are not designed to optimize condensing operation. More research is needed to develop air handlers specifically designed for condensing water heaters. 4) System efficiencies greater than 90% were achieved only on days with continual and steady space heating loads and significant condensing operation. On days when heating was more intermittent, the system efficiencies fell below 90%.
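
    The daily efficiencies quoted in this record are simply the useful heat delivered (DHW plus space heating) divided by the gas energy consumed over the day. The sketch below shows that bookkeeping; the load and input figures are made-up illustrations, not measurements from the study.

    ```python
    def daily_efficiency(dhw_kwh, space_kwh, gas_input_kwh):
        """Daily combo-system efficiency: useful heat delivered / gas energy in."""
        return (dhw_kwh + space_kwh) / gas_input_kwh

    # Hypothetical days: steady condensing operation vs. intermittent heating.
    steady = daily_efficiency(dhw_kwh=12.0, space_kwh=55.0, gas_input_kwh=73.0)
    intermittent = daily_efficiency(dhw_kwh=12.0, space_kwh=18.0, gas_input_kwh=35.0)
    print(f"steady day:       {steady:.1%}")
    print(f"intermittent day: {intermittent:.1%}")
    ```

    A steady heating day keeps return temperatures low enough for the heat exchanger to condense, which is why the study only saw efficiencies above 90% under continual space heating loads.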