WorldWideScience

Sample records for rapid end-to-end mission

  1. VisualCommander for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the development of a highly extensible and user-configurable software application for end-to-end mission simulation and design. We will leverage...

  2. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF), a model-based software framework that shall enable seamless continuity of mission design and...

  3. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    NARCIS (Netherlands)

    Lehner, B.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-01-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role for successful future human space missions. This paper details a conceptual end-to-end architecture for an

  4. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    Science.gov (United States)

    Lehner, B. A. E.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-11-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an exploration mission in cis-lunar space with a focus on human-robot interactions, called Human Assisted Robotic Vehicle Studies (HARVeSt). HARVeSt will build on knowledge of plant growth in space gained from experiments on-board the ISS and test the first growth of plants on the Moon. A planned deep space habitat will be utilised as the base of operations for human-robotic elements of the mission. The mission will serve as a technology demonstrator not only for autonomous tele-operations in cis-lunar space but also for key enabling technologies for future human surface missions. This mission will build on the successful international cooperation approach of the ISS. Mission assets such as a modular rover will allow the mission to be extended and the area to be scouted and prepared for the start of an international Moon Village.

  5. Portable end-to-end ground system for low-cost mission support

    Science.gov (United States)

    Lam, Barbara

    1996-11-01

    This paper presents a revolutionary architecture of the end-to-end ground system to reduce overall mission support costs. The present ground system of the Jet Propulsion Laboratory (JPL) is costly to operate, maintain, deploy, reproduce, and document. In the present climate of shrinking NASA budgets, this proposed architecture takes on added importance as it should dramatically reduce all of the above costs. Currently, the ground support functions (i.e., receiver, tracking, ranging, telemetry, command, monitor and control) are distributed among several subsystems that are housed in individual rack-mounted chassis. These subsystems can be integrated into one portable laptop system using established Multi Chip Module (MCM) packaging technology and object-based software libraries. The large scale integration of subsystems into a small portable system connected to the World Wide Web (WWW) will greatly reduce operations, maintenance and reproduction costs. Several of the subsystems can be implemented using Commercial Off-The-Shelf (COTS) products further decreasing non-recurring engineering costs. The inherent portability of the system will open up new ways for using the ground system at the "point-of-use" site as opposed to maintaining several large centralized stations. This eliminates the propagation delay of the data to the Principal Investigator (PI), enabling the capture of data in real-time and performing multiple tasks concurrently from any location in the world. Sample applications are to use the portable ground system in remote areas or mobile vessels for real-time correlation of satellite data with earth-bound instruments; thus, allowing near real-time feedback and control of scientific instruments. This end-to-end portable ground system will undoubtedly create opportunities for better scientific observation and data acquisition.

  6. End-To-End Performance of the future MOMA instrument aboard the ExoMars Mission

    Science.gov (United States)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Danell, R.; van Amerom, F. H. W.; Freissinet, C.; Glavin, D. P.; Stalport, F.; Arevalo, R. D., Jr.; Coll, P. J.; Steininger, H.; Raulin, F.; Goesmann, F.; Mahaffy, P. R.; Brinckerhoff, W. B.

    2016-12-01

    After the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the future ExoMars mission will continue the search for the organic composition of the Mars surface, with the advantage that the sample will be extracted as deep as 2 meters below the martian surface to minimize the effects of radiation and oxidation on organic materials. To analyse the wide range of organic compounds (volatile and non-volatile) in the martian soil, MOMA combines UV laser desorption/ionization (LDI) with pyrolysis gas chromatography ion trap mass spectrometry (pyr-GC-ITMS). In order to analyse refractory organic compounds and chirality, samples undergoing GC-ITMS analysis may be submitted to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). To optimize and test the performance of the GC-ITMS instrument we have performed several coupling test campaigns between the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA, GSFC). The last campaign was carried out with the ITU model, which is similar to the flight model and which includes the oven and the tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J. Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J. Chrom. A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459. Acknowledgements: Funding provided by the Mars Exploration Program (point of contact, George Tahu, NASA/HQ). MOMA is a collaboration between NASA and ESA (PI Goesmann, MPS). The MOMA-GC team acknowledges support from the French Space Agency (CNES), the French National Programme of Planetology (PNP), the National French Council (CNRS), and the Pierre Simon Laplace Institute.

  7. The Mars Pathfinder end-to-end information system: A pathfinder for the development of future NASA planetary missions

    Science.gov (United States)

    Cook, Richard A.; Kazz, Greg J.; Tai, Wallace S.

    1996-01-01

    The development of the Mars Pathfinder is considered with emphasis on the End-to-End Information System (EEIS) development approach. The primary mission objective is to successfully develop and deliver a single flight system to the Martian surface, demonstrating entry, descent and landing. The EEIS is a set of functions distributed throughout the flight, ground and Mission Operation Systems (MOS) that inter-operate in order to control, collect, transport, process, store and analyze the uplink and downlink information flows of the mission. Coherence between the mission systems is achieved through the EEIS architecture. The key characteristics of the system are: a concurrent engineering approach for the development of flight, ground and mission operation systems; the fundamental EEIS architectural heuristics; a phased incremental EEIS development and test approach; and an EEIS design deploying flight, ground and MOS operability features, including integrated ground and flight-based toolsets.

  8. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  9. End to End Travel

    Data.gov (United States)

    US Agency for International Development — E2 Solutions is a web-based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  10. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    Science.gov (United States)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases to trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the Evolvable Mars Campaign (EMC) Hybrid architecture as well as the results of the higher-fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  11. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built around a model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work seamlessly, enabling the system to detect potential failures automatically and take corrective actions proactively. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  12. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    Swarm, a satellite constellation to measure Earth's magnetic field with unprecedented accuracy, has been selected by ESA for launch in 2009. The mission will provide the best-ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system by improving our understanding of the Earth's interior and climate. An end-to-end mission performance simulation was carried out during Phase A of the mission, with the aim of analyzing the key system requirements, particularly with respect to the number of Swarm satellites and their orbits ... applied to the synthetic data to analyze various aspects of field recovery in relation to different numbers of satellites, different constellations and realistic noise sources. This paper gives an overview of the study activities, describes the generation of the synthetic data, and assesses the obtained ...

  13. End-to-end verifiability

    OpenAIRE

    Ryan, Peter; Benaloh, Josh; Rivest, Ronald; Stark, Philip; Teague, Vanessa; Vora, Poorvi

    2016-01-01

    This pamphlet describes end-to-end election verifiability (E2E-V) for a nontechnical audience: election officials, public policymakers, and anyone else interested in secure, transparent, evidence-based electronic elections. This work is part of the Overseas Vote Foundation’s End-to-End Verifiable Internet Voting: Specification and Feasibility Assessment Study (E2E VIV Project), funded by the Democracy Fund.

  14. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge that is being addressed in part by focusing on the specific subsystems' handling of off-nominal missions and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  15. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  16. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  17. Arcus end-to-end simulations

    Science.gov (United States)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

    We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus Phase A study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During Phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and will ultimately be used as the Arcus calibration model.

  18. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    Energy Technology Data Exchange (ETDEWEB)

    Ferreyra, M; Salinas Aranda, F; Dodat, D; Sansogne, R; Arbiser, S [Vidt Centro Medico, Ciudad Autonoma de Buenos Aires (Argentina)]

    2016-06-15

    Purpose: To use end-to-end testing to validate a 6 MV high dose rate photon beam, configured for Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data was configured for Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm² were measured on a Varian Trilogy Linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance according to VMAT accuracy requirements. An end-to-end test was implemented planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung-equivalent inserts. TPS-calculated planar dose distributions were compared to those measured using EPID and MapCheck, as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests show results are independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst-case scenario in lung. Over 97% of the points evaluated in dose distributions passed gamma index analysis. Conclusion: Eclipse AAA algorithm configuration of the 6 MV high dose rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.
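
    For readers unfamiliar with the 2%/2 mm gamma criterion used throughout this record, the sketch below shows a generic global 1D gamma-index calculation in the spirit of Low et al. It is only an illustration of the metric, not the authors' Eclipse/EPID/MapCheck workflow; the function name, arguments and defaults are placeholders.

      # Illustrative 1D global gamma index (NumPy assumed); a sketch of the
      # 2%/2 mm criterion, not the commercial tooling used in the abstract.
      import numpy as np

      def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dose_tol=0.02, dta_mm=2.0):
          """Return the gamma value for each evaluated point (global normalization)."""
          dose_norm = dose_tol * ref_dose.max()              # 2% of the reference maximum
          gammas = np.empty(len(eval_dose), dtype=float)
          for i, (x, d) in enumerate(zip(eval_pos, eval_dose)):
              dist_term = ((ref_pos - x) / dta_mm) ** 2      # distance-to-agreement term
              dose_term = ((ref_dose - d) / dose_norm) ** 2  # dose-difference term
              gammas[i] = np.sqrt((dist_term + dose_term).min())
          return gammas

      # A point passes when gamma <= 1; e.g. a 95% acceptance threshold:
      # pass_rate = np.mean(gamma_1d(x_ref, d_ref, x_meas, d_meas) <= 1.0)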

  19. TROPOMI end-to-end performance studies

    Science.gov (United States)

    Voors, Robert; de Vries, Johan; Veefkind, Pepijn; Gloudemans, Annemieke; Mika, Àgnes; Levelt, Pieternel

    2008-10-01

    The TROPOspheric Monitoring Instrument (TROPOMI) is a UV/VIS/NIR/SWIR non-scanning nadir-viewing imaging spectrometer that combines a wide swath (110°) with high spatial resolution (8 x 8 km). Its main heritages are from the Ozone Monitoring Instrument (OMI) and from SCIAMACHY. Since its launch in 2004, OMI has been providing, on a daily basis and on a global scale, a wealth of data on ozone, NO2 and minor trace gases, aerosols and local pollution. In the framework of development programs for a follow-up mission to the successful Ozone Monitoring Instrument, we have developed the so-called TROPOMI Integrated Development Environment. This is a GRID-based software simulation tool for OMI follow-up missions. It includes scene generation, an instrument simulator, a level 0-1b processing chain, as well as several level 1b-2 processing chains. In addition it contains an error analyzer, i.e. a tool to feed the level 2 results back to the input of the scene generator. The paper gives a description of the TROPOMI instrument and focuses on design aspects as well as on the performance, as tested in the end-to-end development environment TIDE.

  20. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    Science.gov (United States)

    Sumner, Matthew Casey

    This thesis describes the construction, integration, and use of a new 230-GHz ultra-wideband heterodyne receiver, as well as the development and testing of a new sideband-deconvolution algorithm, both designed to enable rapid, sensitive molecular-line surveys. The 230-GHz receiver, known as Z-Rex, is the first of a new generation of wideband receivers to be installed at the Caltech Submillimeter Observatory (CSO). Intended as a proof-of-concept device, it boasts an ultra-wide IF output range of ~6-18 GHz, offering as much as a twelvefold increase in the spectral coverage that can be achieved with a single LO setting. A similarly wideband IF system has been designed to couple this receiver to an array of WASP2 spectrometers, allowing the full bandwidth of the receiver to be observed at low resolution, ideal for extra-galactic redshift surveys. A separate IF system feeds a high-resolution 4-GHz AOS array frequently used for performing unbiased line surveys of galactic objects, particularly star-forming regions. The design and construction of the wideband IF system are presented, as is the work done to integrate the receiver and the high-resolution spectrometers into a working system. The receiver is currently installed at the CSO where it is available for astronomers' use. In addition to demonstrating wideband design principles, the receiver also serves as a testbed for a synthesizer-driven, active LO chain that is under consideration for future receiver designs. Several lessons have been learned, including the importance of driving the final amplifier of the LO chain into saturation and the absolute necessity of including a high-Q filter to remove spurious signals from the synthesizer output. The on-telescope performance of the synthesizer-driven LO chain is compared to that of the Gunn-oscillator units currently in use at the CSO. Although the frequency agility of the synthesized LO chain gives it a significant advantage for unbiased line surveys, the cleaner
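
    As a rough check of the "twelvefold" figure quoted above, assuming the comparison baseline is a conventional IF bandwidth of roughly 1 GHz (our assumption; the abstract does not state the baseline explicitly):

      \frac{18\ \mathrm{GHz} - 6\ \mathrm{GHz}}{1\ \mathrm{GHz}} = 12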

  1. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard exists neither for spacecraft operations nor for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data, specifically the mission data products, created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.

  2. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

    Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth people participate in their own care supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges. In this

  3. Model Scaling Approach for the GOCE End to End Simulator

    Science.gov (United States)

    Catastini, G.; De Sanctis, S.; Dumontel, M.; Parisch, M.

    2007-08-01

    The Gravity field and steady-state Ocean Circulation Explorer (GOCE) is the first core Earth explorer of ESA's Earth observation programme of satellites for research in the Earth sciences. The objective of the mission is to produce high-accuracy, high-resolution, global measurements of the Earth's gravity field, leading to improved geopotential and geoid (the equipotential surface corresponding to the steady-state sea level) models for use in a wide range of geophysical applications. More precisely, the GOCE mission is designed to provide a global reconstruction of the geopotential model and geoid with high spatial resolution and accuracy (better than 0.1 cm at degree and order l = 50 and better than 1.0 cm at degree and order l = 200). Such a scientific performance scenario requires at least the computation of 200 harmonics of the gravitational field and a simulated time span covering a minimum of 60 days (corresponding to a full coverage of the Earth surface). Thales Alenia Space Italia (TAS-I) is responsible, as Prime Contractor, for the GOCE satellite. The GOCE mission objective is the high-accuracy retrieval of the Earth gravity field. The idea of an End-to-End simulator (E2E) was conceived in the early stages of the GOCE programme, as an essential tool for supporting the design and verification activities as well as for assessing the satellite system performance. The simulator in its present form has been developed at TAS-I for ESA since the beginning of Phase B and is currently used for: checking the consistency of spacecraft and payload specifications with the overall system requirements; supporting trade-off, sensitivity and worst-case analyses; supporting design and pre-validation testing of the Drag-Free and Attitude Control (DFAC) laws; preparing and testing the on-ground and in-flight gradiometer calibration concepts; and prototyping the post-processing algorithms, transforming the scientific data from Level 0 (raw telemetry format) to Level 1B (i.e. geo-located gravity

  4. Utilizing Domain Knowledge in End-to-End Audio Processing

    DEFF Research Database (Denmark)

    Tax, Tycho; Antich, Jose Luis Diez; Purwins, Hendrik

    2017-01-01

    End-to-end neural network based approaches to audio modelling are generally outperformed by models trained on high-level data representations. In this paper we present preliminary work that shows the feasibility of training the first layers of a deep convolutional neural network (CNN) model...

  5. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division]

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N^2) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
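
    As a quick illustration of the superlinear scaling claimed above, assuming each ordered pair of participating sites defines a distinct measurable path (consistent with the "more than 1,000 distinct Internet paths" figure):

      N(N-1) = 37 \times 36 = 1332 \quad \text{directed end-to-end paths for } N = 37 \text{ sites}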

  6. End-to-end plasma bubble PIC simulations on GPUs

    Science.gov (United States)

    Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava

    2017-10-01

    Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.

  7. Toward End-to-End Face Recognition Through Alignment Learning

    Science.gov (United States)

    Zhong, Yuanyi; Chen, Jiansheng; Huang, Bo

    2017-08-01

    Plenty of effective methods have been proposed for face recognition during the past decade. Although these methods differ essentially in many aspects, a common practice among them is to align the facial area based on prior knowledge of human face structure before feature extraction. In most systems, the face alignment module is implemented independently. This has actually caused difficulties in the design and training of end-to-end face recognition models. In this paper, we study the possibility of alignment learning in end-to-end face recognition, in which neither prior knowledge on facial landmarks nor artificially defined geometric transformations are required. Specifically, spatial transformer layers are inserted in front of the feature extraction layers in a Convolutional Neural Network (CNN) for face recognition. Only human identity clues are used for driving the neural network to automatically learn the most suitable geometric transformation and the most appropriate facial area for the recognition task. To ensure reproducibility, our model is trained purely on the publicly available CASIA-WebFace dataset, and is tested on the Labeled Faces in the Wild (LFW) dataset. We have achieved a verification accuracy of 99.08%, which is comparable to state-of-the-art single-model-based methods.
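
    The idea described here, spatial transformer layers placed in front of the feature-extraction layers so that alignment is learned from identity labels alone, can be sketched as follows. This is a hedged illustration in PyTorch, not the network from the paper: the 64x64 grayscale input, layer sizes, and class count are placeholder assumptions; only the structure (localization net, affine grid, grid sample, then a plain CNN classifier) follows the description.

      # Minimal spatial-transformer-in-front-of-a-CNN sketch (PyTorch assumed).
      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class STNFaceNet(nn.Module):
          def __init__(self, num_classes=10):
              super().__init__()
              # Localization network: predicts a 2x3 affine transform per image.
              self.loc = nn.Sequential(
                  nn.Conv2d(1, 8, 7), nn.MaxPool2d(2), nn.ReLU(),
                  nn.Conv2d(8, 10, 5), nn.MaxPool2d(2), nn.ReLU(),
              )
              self.loc_fc = nn.Sequential(nn.Linear(10 * 12 * 12, 32), nn.ReLU(),
                                          nn.Linear(32, 6))
              # Initialize to the identity transform so training starts stably.
              self.loc_fc[-1].weight.data.zero_()
              self.loc_fc[-1].bias.data.copy_(
                  torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))
              # Plain feature extractor and classifier acting on the aligned image.
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.classifier = nn.Linear(32 * 16 * 16, num_classes)

          def forward(self, x):                              # x: (N, 1, 64, 64)
              theta = self.loc_fc(torch.flatten(self.loc(x), 1)).view(-1, 2, 3)
              grid = F.affine_grid(theta, x.size(), align_corners=False)
              x = F.grid_sample(x, grid, align_corners=False)  # learned alignment
              return self.classifier(torch.flatten(self.features(x), 1))

      # Example: logits = STNFaceNet()(torch.randn(4, 1, 64, 64))

    Only the identity loss drives the localization network, so the transform that best helps recognition is learned implicitly, which is the point the abstract makes.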

  8. End-to-end network/application performance troubleshooting methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.

  9. End-to-End Simulation and Verification of Rendezvous and Docking/Berthing Systems using Robotics

    OpenAIRE

    Benninghoff, Heike

    2016-01-01

    The rendezvous and docking/berthing (RvD/B) phase is one of the most complex and critical parts of future on-orbit servicing missions. Especially the operations during the final approach (separation distance < 20m) have to be verified and tested in detail. Such tests involve on-board systems, communication systems and ground systems. In the framework of an end-to-end simulation of the final approach to and capture of a tumbling client satellite, the necessary components are developed and t...

  10. Probing end-to-end cyclization beyond Willemski and Fixman.

    Science.gov (United States)

    Chen, Shaohua; Duhamel, Jean; Winnik, Mitchell A

    2011-04-07

    A series of poly(ethylene oxide)s labeled at both ends with pyrene, (PEO(X)-Py(2), where X represents the number average molecular weight (M(n)) of the PEO chains and equals 2, 5, 10, and 16.5 K) was prepared together with one-pyrene-monolabeled PEO (PEO(2K)-Py). The process of end-to-end cyclization (EEC) was investigated by monitoring intramolecular excimer formation in seven organic solvents with viscosities (η) ranging from 0.32 to 1.92 mPa·s. The steady-state fluorescence spectra showed that excimer formation of PEO(X)-Py(2) decreased strongly with increasing η and M(n). The monomer and excimer time-resolved fluorescence decays were analyzed according to the traditional Birks' scheme. Birks' scheme analysis indicated that the decrease in excimer formation with increasing M(n) and η was due partly to a decrease in the rate constant of EEC, but most importantly, to a large increase in the fraction of pyrenes that did not form excimer (f(Mfree)). This result is in itself incompatible with Birks' scheme analysis which requires that f(Mfree) be the molar fraction of chains bearing a single pyrene at one chain end; in short, f(Mfree) does not depend on M(n) and η within the framework of Birks' scheme analysis. In turn, this unexpected result agrees with the framework of the fluorescence blob model (FBM) which predicts that quenching takes place inside a blob, which is the finite volume probed by an excited chromophore during its lifetime. Increasing M(n) and η results in a larger fraction of chains having a conformation where the quencher is located outside the blob resulting in an increase in f(Mfree). Equations were derived to apply the FBM analysis, originally designed to study randomly labeled polymers, to the end-labeled PEO(X)-Py(2) series. FBM analysis was found to describe satisfyingly the data obtained with the longer PEO(X)-Py(2) samples.
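
    For context on the "traditional Birks' scheme" analysis used here, the textbook scheme couples the excited-monomer population [M*] and the excimer population [E*] through excimer formation (rate constant k_1, here end-to-end cyclization) and dissociation (k_-1), with k_M and k_E the intrinsic decay rate constants (our notation, which may differ from the paper's):

      \frac{d[M^*]}{dt} = -(k_M + k_1)\,[M^*] + k_{-1}\,[E^*], \qquad
      \frac{d[E^*]}{dt} = k_1\,[M^*] - (k_E + k_{-1})\,[E^*]

    Solving this linear system gives bi-exponential monomer and excimer decays sharing the same two lifetimes, which is the functional form fitted when the time-resolved fluorescence decays are analyzed with Birks' scheme.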

  11. OGC standards for end-to-end sensor network integration

    Science.gov (United States)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

    technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service to actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively, the interpreter can retrieve the sensor’s SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly, graphical tool to generate SIDs and tools to visualize sensor data.

  12. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F Garcia; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany)]; Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)]

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP), which aims to measure the differential acceleration between two free-falling test masses with an accuracy of 3 × 10⁻¹⁴ m s⁻² Hz⁻¹/² between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend, which is a phasemeter, and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  13. Comparison of postoperative motility in hand-sewn end-to-end anastomosis and functional end-to-end anastomosis: an experimental study in conscious dogs.

    Science.gov (United States)

    Toyomasu, Yoshitaka; Mochiki, Erito; Ando, Hiroyuki; Yanai, Mitsuhiro; Ogata, Kyoichi; Tabe, Yuichi; Ohno, Tetsuro; Aihara, Ryuusuke; Kuwano, Hiroyuki

    2010-09-01

    The objective of this study is to compare the postoperative motility between hand-sewn end-to-end anastomosis and functional end-to-end anastomosis. Fifteen conscious dogs were divided into three groups: normal intact dog group, end-to-end anastomosis group (EE), and functional end-to-end anastomosis group (FEE). In the EE and FEE groups, the dogs underwent a transection of the jejunum 30 cm distal to the Treitz ligament and anastomosis in each method. To compare the gastrointestinal motility, the time to the appearance and the rate of propagation of interdigestive migrating motor contractions (IMC) across the anastomosis, as well as the motility index (MI) at the oral and anal sides of the anastomosis, were measured using strain gauge force transducers. Furthermore, the histological examination of intrinsic nerve fibers was evaluated. The time to the appearance of propagation of IMC in the EE and FEE was not significantly different. The propagation rates of IMC in the EE and FEE completely recovered within 4 weeks of the surgery. The MI in the EE and FEE was not significantly different. In addition, no continuity of intrinsic nerve fibers across the anastomosis could be identified in either group. In the present study, there are no significant differences between the EE and FEE with regard to the time of the appearance and the rate of propagation of IMC. These results suggest that the effect of functional end-to-end anastomosis on postoperative motility is not different from that of hand-sewn end-to-end anastomosis.

  14. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    Science.gov (United States)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software-defined networking (SDN) based end-to-end tunability solution is proposed for software-defined data center optical networks, and the corresponding protocol extensions and implementation procedure are designed. For the first time, a flexible-grid all-optical network with a Tbps end-to-end tunable transport and switch system, controlled by an OpenDaylight (ODL) based controller, has been demonstrated online for data center interconnection. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  15. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    Science.gov (United States)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  16. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include the need to predict the effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promise, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  17. Understanding TCP over TCP: effects of TCP tunneling on end-to-end throughput and latency

    Science.gov (United States)

    Honda, Osamu; Ohsaki, Hiroyuki; Imase, Makoto; Ishizuka, Mika; Murayama, Junichi

    2005-10-01

    A TCP tunnel is a technology that aggregates and transfers packets sent between end hosts as a single TCP connection. By using a TCP tunnel, the fairness among aggregated flows can be improved and several protocols can be transparently transmitted through a firewall. Currently, many applications such as SSH, VTun, and HTun use a TCP tunnel. However, since most applications running on end hosts generally use TCP, two TCP congestion controls (i.e., end-to-end TCP and tunnel TCP) operate simultaneously and interfere with each other. Under certain conditions, using a TCP tunnel is known to severely degrade end-to-end TCP performance; namely, it can drastically degrade end-to-end TCP throughput for some time, which is called the TCP meltdown problem. On the contrary, under other conditions, using a TCP tunnel is known to significantly improve end-to-end TCP performance. However, it is still an open issue how, when, and why a TCP tunnel is harmful to end-to-end TCP performance. In this paper, we therefore investigate the effect of TCP tunnels on end-to-end TCP performance using simulation experiments. Specifically, we quantitatively reveal the effects of several factors (e.g., the propagation delay, usage of the SACK option, TCP socket buffer size, and sender buffer size of the TCP tunnel) on the performance of end-to-end TCP and tunnel TCP.

  18. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

    A theoretical description of a simplified end-to-end software tool for simulation of data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is given, and some simulation examples of hyperspectral and panchromatic images for existing and future design instruments are also reported. High spatial/spectral resolution images with low intrinsic noise and the sensor/mission specifications are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, impact of optical design on image quality, and 3D modelling of optical performances. The simulator is conceived as a tool (during phase 0/A) for the specification and early development of new Earth observation optical instruments, whose compliance with user’s requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, as compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.

  19. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  20. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  1. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  2. Rapid Automated Mission Planning System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is an automated UAS mission planning system that will rapidly identify emergency (contingency) landing sites, manage contingency routing, and...

  3. Urban Biomining Meets Printable Electronics: End-To-End at Destination Biological Recycling and Reprinting

    Science.gov (United States)

    Rothschild, Lynn J. (Principal Investigator); Koehne, Jessica; Gandhiraman, Ram; Navarrete, Jesica; Spangle, Dylan

    2017-01-01

    Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet, metals add mass, and electronics have the additional problem of a limited lifespan. Thus, current mission architectures must compensate for replacement. In space, spent electronics are discarded; on Earth, there is some recycling but current processes are toxic and environmentally hazardous. Imagine instead an end-to-end recycling of spent electronics at low mass, low cost, room temperature, and in a non-toxic manner. Here, we propose a solution that will not only enhance mission success by decreasing upmass and providing a fresh supply of electronics, but in addition has immediate applications to a serious environmental issue on the Earth. Spent electronics will be used as feedstock to make fresh electronic components, a process we will accomplish with so-called 'urban biomining' using synthetically enhanced microbes to bind metals with elemental specificity. To create new electronics, the microbes will be used as 'bioink' to print a new IC chip, using plasma jet electronics printing. The plasma jet electronics printing technology will have the potential to use martian atmospheric gas to print and to tailor the electronic and chemical properties of the materials. Our preliminary results have suggested that this process also serves as a purification step to enhance the proportion of metals in the 'bioink'. The presence of an electric field and plasma can ensure printing in a microgravity environment while also providing material morphology and electronic structure tunability and thus optimization. Here we propose to increase the TRL of the concept by engineering microbes to dissolve the siliceous matrix in the IC, extract copper from a mixture of metals, and use the microbes as feedstock to print interconnects using Mars gas simulant. To assess the ability of this concept to influence mission architecture, we will do an analysis of the infrastructure required to execute

  4. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States); Wu, Chase Q. [Univ. of Memphis, TN (United States)

    2013-11-07

Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  5. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    Science.gov (United States)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential resources offering water and other minerals for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  6. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    Science.gov (United States)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station made it possible to consolidate the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  7. An end-to-end communications architecture for condition-based maintenance applications

    Science.gov (United States)

    Kroculick, Joseph

    2014-06-01

This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization Strategy is based on rolling out network capabilities that connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that allows data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from these available network capabilities.

  8. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

... industries. At the same time, the research gives managers of construction projects a tool with which to manage their requirements end-to-end. In order to investigate how construction companies handle requirements, a case project – a Danish construction syndicate producing sandwich elements made from High Performance Concrete and insulation materials – is used. By means of action research and interviews of case-project staff it has become evident that many elements of formalized requirements management are missing in the case project. To fill those gaps and be able to manage requirements end-to-end...

  9. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Since the majority of the potential ADR targets are large (>meters) upper stages and payloads between 800 and 1100 km altitude, they are relatively bright, with...

  10. End-to-End simulation study of a full magnetic gradiometry mission

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Olsen, Nils

    2014-01-01

In this paper, we investigate space magnetic gradiometry as a possible path for future exploration of the Earth’s magnetic field with satellites. Synthetic observations of the magnetic field vector and of six elements of the magnetic gradient tensor are calculated for times and positions of a simulated low Earth orbiting satellite. The observations are synthesized from realistic models based upon a combination of the major sources contributing to the Earth’s magnetic field. From those synthetic data, we estimate field models using either the magnetic vector field observations only or the full...

  11. Security Considerations around End-to-End Security in the IP-based Internet of Things

    NARCIS (Netherlands)

    Brachmann, M.; Garcia-Mochon, O.; Keoh, S.L.; Kumar, S.S.

    2012-01-01

The IP-based Internet of Things refers to the interconnection of smart objects in a Low-power and Lossy Network (LLN) with the Internet by means of protocols such as 6LoWPAN or CoAP. The provisioning of an end-to-end security connection is the key to ensure basic functionalities such as software

  12. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer

    2005-01-01

This paper investigates end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with a DiffServ IP network. The effort was put on QoS class mapping from DiffServ to UMTS, Access Control (AC), and buffering and scheduling optimization. The DiffServ Code Point (DSCP
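
    As a purely illustrative aside on what a DSCP-to-UMTS class mapping can look like, a minimal Python sketch follows; the mapping below is an assumption based on common practice (3GPP traffic classes), not necessarily the one used in the paper.

```python
# Illustrative DSCP -> UMTS traffic class mapping (assumed, not the paper's):
# EF for voice, AF4x for streaming media, AF1x-AF3x split between
# interactive and background, best effort for background transfers.
DSCP_TO_UMTS = {
    46: "Conversational",   # EF
    34: "Streaming",        # AF41
    26: "Interactive",      # AF31
    18: "Interactive",      # AF21
    10: "Background",       # AF11
    0:  "Background",       # Best Effort / Default
}

def umts_class(dscp: int) -> str:
    """Map a DSCP value to a UMTS traffic class, defaulting to Background."""
    return DSCP_TO_UMTS.get(dscp, "Background")

print(umts_class(46))   # Conversational
```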

  13. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01

Hardware Support for Malware Defense and End-to-End Trust. Final Technical Report, February 2017. International Business Machines Corporation (IBM), T.J. Watson Research Center, 1101 Kitchawan Rd, Yorktown.

  14. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian

    2013-01-01

    We report on the observation of coupling a single nitrogen vacancy (NV) center in a nanodiamond crystal to a propagating plasmonic mode of silver nanowires. The nanocrystal is placed either near the apex of a single silver nanowire or in the gap between two end-to-end aligned silver nanowires. We...

  15. Strategic design issues of IMS versus end-to-end architectures

    NARCIS (Netherlands)

    Braet, O.; Ballon, P.

    2007-01-01

    Purpose - The paper aims to discuss the business issues surrounding the choice between the end-to-end internet architecture, in particular peer-to-peer networks, versus managed telecommunications architectures, in particular IMS, for the migration towards a next-generation mobile system.

  16. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Generation Networks (NGNs). In this paper, an end-to-end availability model is proposed and evaluated using a combination of Reliability Block Diagrams (RBD) and a proposed five-state Markov model. The overall availability for intra- and inter domain communication in IMS is analyzed, and the state...

  17. Integrated Information and Network Management for End-to-End Quality of Service

    Science.gov (United States)

    2011-11-01

Integrated Information and Network Management for End-to-End Quality of Service. Marco Carvalho and Adrian Granados, Florida Institute for... Carvalho, M., Granados, A., Naqwi, W., Brothers, A., Hanna, J., and Turck, K., "A cross-layer communications substrate for tactical information management".

  18. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    This paper describes a wireless real-time communication system design using two Time Division Multiple Access (TDMA) protocols. Messages are subject to prioritization and queuing. For this interoperation scenario, we show a method for end-to-end configuration of protocols and queue sizes...

  19. Benefits of Intraluminal Agarose Stents during End-to-End Intestinal Anastomosis in New Zealand White Rabbits.

    Science.gov (United States)

    Kuo, Wen-Yao; Huang, Hsiao-Chun; Huang, Shih-Wei; Yu, Kuan-Hua; Cheng, Feng-Pang; Wang, Jiann-Hsiung; Wu, Jui-Te

    2017-12-01

In the present study, we evaluated the utility of an intraluminal agarose stent (IAS) for end-to-end intestinal anastomoses in rabbits. Female New Zealand white rabbits (n = 14) underwent conventional sutured anastomosis (CSA) with or without an IAS. The IAS was used to maintain the luminal diameter for more rapid and accurate suturing and then was squeezed transluminally to crush it into fragments, which passed through the intestines and were eliminated. The rabbits were euthanized on postoperative day 21. At necropsy, the anastomoses were assessed for adhesion formation, stenosis, and bursting pressure and were examined histologically for collagen content and blood vessel formation. Anastomosis surgery took less time in the IAS group (15.0 ± 2.6 min) than in the CSA-only group (30.1 ± 7.9 min). Only 1 postoperative death occurred (in the CSA group), and postmortem examination revealed evidence of anastomotic leakage. Adhesion formation and stenosis did not differ between groups, but bursting pressure, collagen content, and blood vessel formation were all significantly increased in the IAS group. The IAS may decrease the operative time by maintaining a clear surgical field at the anastomotic site. In addition, the use of an IAS promotes rapid healing and maintains the luminal diameter during end-to-end intestinal anastomosis.

  20. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2014-04-01

In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1 with vacations framework. Sensors transmit measurements to a predefined data sink subject to a maximum end-to-end delay constraint. In order to prolong the battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for the maximum hop count as well as an approximate expression for the probability of blocking at the sink node when a certain end-to-end delay threshold is violated. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
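
    For readers unfamiliar with vacation queues, a standard decomposition result for an M/G/1 queue with multiple vacations illustrates the type of expression involved; the paper's exact model and notation may differ, so the following is only a sketch.

```latex
% Standard decomposition for the mean waiting time in an M/G/1 queue with
% multiple vacations; S = service time, V = vacation length,
% \lambda = arrival rate, \rho = \lambda E[S] < 1.
\begin{align}
  W &= \frac{\lambda\,\mathrm{E}[S^{2}]}{2(1-\rho)}
     + \frac{\mathrm{E}[V^{2}]}{2\,\mathrm{E}[V]}, \\
  \mathrm{E}[D_{\mathrm{e2e}}] &\approx h\,\bigl(W + \mathrm{E}[S]\bigr)
  \qquad \text{over } h \text{ statistically identical hops.}
\end{align}
```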

  1. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for dialogue POMDP model components. Starting from scratch, they present the state, the transition model, the observation model and finally the reward model, learned from unannotated and noisy dialogues. Altogether these form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in simple language, full of illustrative examples, figures, and tables. It provides insights on building dialogue systems to be applied in real domains, illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format, and introduces an end-to-end approach that makes use of unannotated and noisy dialogue for learning each component of dialogue POM...

  2. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

Army Materiel Command and the University of Alabama in Huntsville partnered to develop an integrated end-to-end performance metrics system... their supply chains, Army Materiel Command (AMC), headquartered in Huntsville, Alabama, partnered with the University of Alabama in Huntsville (UAH) in... chain metrics system. This research project, named the Enterprise Supply Chain Analysis & Logistics Engine (eSCALE) project, was managed by AMC.

  3. Financing the End-to-end Supply Chain: A Reference Guide to Supply Chain Finance

    OpenAIRE

    Templar, Simon; Hofmann, Erik; Findlay, Charles

    2016-01-01

    Financing the End to End Supply Chain provides readers with a real insight into the increasingly important area of supply chain finance. It demonstrates the importance of the strategic relationship between the physical supply of goods and services and the associated financial flows. The book provides a clear introduction, demonstrating the importance of the strategic relationship between supply chain and financial communities within an organization. It contains vital information on how supply...

  4. End-to-end security in telemedical networks--a practical guideline.

    Science.gov (United States)

Wozak, Florian; Schabetsberger, Thomas; Ammenwerth, Elske

    2007-01-01

    The interconnection of medical networks in different healthcare institutions will be constantly increasing over the next few years, which will require concepts for securing medical data during transfer, since transmitting patient related data via potentially insecure public networks is considered a violation of data privacy. The aim of our work was to develop a model-based approach towards end-to-end security which is defined as continuous security from point of origin to point of destination in a communication process. We show that end-to-end security must be seen as a holistic security concept, which comprises the following three major parts: authentication and access control, transport security, as well as system security. For integration into existing security infrastructures abuse case models were used, which extend UML use cases, by elements necessary to describe abusive interactions. Abuse case models can be constructed for each part mentioned above, allowing for potential security risks in communication from point of origin to point of destination to be identified and counteractive measures to be directly derived from the abuse case models. The model-based approach is a guideline to continuous risk assessment and improvement of end-to-end security in medical networks. Validity and relevance to practice will be systematically evaluated using close-to-reality test networks as well as in production environments.

  5. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-01-01

The need for continuous research into new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization in the rat hindlimb model, can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in case of end-to-end neurorrhaphy) and almost no motor recovery (in case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by the flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function. PMID:25538749

  6. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source-oriented, or end-to-end, congestion control algorithms. The book empowers readers with a clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and fairness and presents real-world app

  7. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

    WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service, focuses on the interdisciplinary subject of advanced Security and Quality of Service (QoS) in WiMAX wireless telecommunication systems including its models, standards, implementations, and applications. Split into 4 parts, Part A of the book is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  8. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark,Chris; Arenberg, Jon

    2016-01-01

Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  9. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

In this paper, a design for multiple-input-multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. Firstly, the overall average packet error rate is analyzed by considering the channel state information feedback delay and the multiuser scheduling. Then, we provide corresponding numerical results to evaluate the performance in several separate scenarios, which provides a convenient tool for the design of cognitive radio networks with multiple secondary MIMO users. © 2011 IEEE.

  10. END-TO-END DEPTH FROM MOTION WITH STABILIZED MONOCULAR VIDEOS

    Directory of Open Access Journals (Sweden)

    C. Pinard

    2017-08-01

We propose a depth map inference system from monocular videos based on a novel dataset for navigation that mimics aerial footage from a gimbal-stabilized monocular camera in rigid scenes. Unlike most navigation datasets, the lack of rotation implies an easier structure-from-motion problem, which can be leveraged for different kinds of tasks such as depth inference and obstacle avoidance. We also propose an architecture for end-to-end depth inference with a fully convolutional network. Results show that, although tied to camera intrinsic parameters, the problem is locally solvable and leads to good-quality depth prediction.
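
    As an illustration of what a fully convolutional depth-regression network can look like, a toy PyTorch sketch follows; it is not the authors' architecture, only a minimal encoder-decoder producing a dense depth map.

```python
import torch
import torch.nn as nn

class TinyDepthFCN(nn.Module):
    """Minimal fully convolutional encoder-decoder for dense depth
    regression (illustrative only; not the paper's architecture)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1))

    def forward(self, x):              # x: (B, 3, H, W), H and W divisible by 4
        return self.decoder(self.encoder(x))   # (B, 1, H, W) depth map

# Example: a 64x64 RGB frame yields a 64x64 single-channel depth map.
depth = TinyDepthFCN()(torch.randn(1, 3, 64, 64))
print(depth.shape)   # torch.Size([1, 1, 64, 64])
```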

  11. The RapidEye mission design

    Science.gov (United States)

    Tyc, George; Tulip, John; Schulten, Daniel; Krischke, Manfred; Oxfort, Michael

    2005-01-01

    The RapidEye mission is a commercial remote sensing mission by the German Company RapidEye AG. The RapidEye mission will deliver information products for various customers in the agricultural insurance market, large producers, international institutions and cartography. The mission consists of a constellation of five identical small satellites and a sophisticated ground infrastructure based on proven systems. The five satellites will be placed in a single sun-synchronous orbit of approximately 620 km, with the satellites equally spaced over the orbit. The RapidEye system has the unique ability to image any area on earth once per day and can also provide large area coverage within 5 days. The satellites will each carry a 5 band multi-spectral optical imager with a ground sampling distance of 6.5 m at nadir and a swath width of 80 km. These capabilities along with the processing throughput of the ground segment allows the system to deliver the information products needed by the customers reliably and in a time frame that meets their specific needs.

  12. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

In a sparse vehicular ad hoc network, a vehicle normally employs a carry-and-forward approach, where it holds the message it wants to transmit until it meets other vehicles or roadside units. A number of analyses in the literature have investigated the time delay when packets are being carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters. Furthermore, the majority of the analyses concentrate only on the expected value of the end-to-end delay when the carry-and-forward approach is used. Using regression analysis, we establish the distribution model for the time delay between two disconnected vehicle clusters as an exponential distribution. Consequently, a distribution is newly derived to represent the number of clusters on a highway using a vehicular traffic model. From there, we are able to formulate an end-to-end delay model which extends the time delay model for two disconnected vehicle clusters to multiple disconnected clusters on a unidirectional highway. The analytical results obtained from the analytical model are then validated through simulation results.
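
    Under the assumptions stated above (i.i.d. exponential inter-cluster gap delays with rate μ and N disconnected clusters, hence N−1 gaps), the end-to-end delay follows directly; the sketch below may differ in detail from the paper's derivation.

```latex
% Sketch under the stated assumptions: N clusters along the highway and
% i.i.d. exponential gap-crossing delays with rate \mu (so N-1 gaps).
\begin{align}
  D_{\mathrm{e2e}} \mid N = n &\sim \mathrm{Erlang}(n-1,\ \mu), \\
  \mathrm{E}[D_{\mathrm{e2e}}] &= \frac{\mathrm{E}[N]-1}{\mu}, \qquad
  \mathrm{Var}[D_{\mathrm{e2e}}] = \frac{\mathrm{E}[N]-1}{\mu^{2}}
   + \frac{\mathrm{Var}[N]}{\mu^{2}}.
\end{align}
```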

  13. The end-to-end simulator for the E-ELT HIRES high resolution spectrograph

    Science.gov (United States)

    Genoni, M.; Landoni, M.; Riva, M.; Pariani, G.; Mason, E.; Di Marcantonio, P.; Disseau, K.; Di Varano, I.; Gonzalez, O.; Huke, P.; Korhonen, H.; Li Causi, Gianluca

    2017-06-01

We present the design, architecture and results of the end-to-end simulator model of the high-resolution spectrograph HIRES for the European Extremely Large Telescope (E-ELT). This system can be used by both engineers and scientists as a tool to characterize the spectrograph. The model simulates the behavior of photons from the scientific object (modeled with the main science drivers in mind) to the detector, also considering calibration light sources, and allows evaluation of the different parameters of the spectrograph design. In this paper, we detail the architecture of the simulator and the computational model, which are strongly characterized by modularity and flexibility; these will be crucial in next-generation astronomical observation projects such as the E-ELT because of their high complexity and long design and development times. Finally, we present synthetic images obtained with the current version of the end-to-end simulator based on the E-ELT HIRES requirements (especially high radial-velocity accuracy). Once ingested in the Data Reduction Software (DRS), they will allow verification that the instrument design can achieve the radial-velocity accuracy needed by the HIRES science cases.

  14. End-to-End Beam Dynamics Simulations for the ANL-RIA Driver Linac

    CERN Document Server

    Ostroumov, P N

    2004-01-01

The proposed Rare Isotope Accelerator (RIA) Facility consists of a superconducting (SC) 1.4 GV driver linac capable of producing 400 kW beams of any ion from hydrogen to uranium. The driver is configured as an array of ~350 SC cavities, each with independently controllable rf phase. For the end-to-end beam dynamics design and simulation we use a dedicated code, TRACK. The code integrates ion motion through the three-dimensional fields of all elements of the driver linac, beginning from the exit of the electron cyclotron resonance (ECR) ion source to the production targets. TRACK has been parallelized and is able to track large numbers of particles in randomly seeded accelerators with misalignments and a comprehensive set of errors. The simulation starts with multi-component dc ion beams extracted from the ECR. Beam losses are obtained by tracking up to a million particles in hundreds of randomly seeded accelerators. To control beam losses, a set of collimators is applied in designated areas. The end-to-end simulat...

  15. An End-To-End Test of A Simulated Nuclear Electric Propulsion System

    Science.gov (United States)

    VanDyke, Melissa; Hrbud, Ivana; Goddfellow, Keith; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

The Safe Affordable Fission Engine (SAFE) test series addresses Phase I Space Fission Systems issues, in particular non-nuclear testing and system integration issues leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system and a thruster, where the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  16. End-to-End Airplane Detection Using Transfer Learning in Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhong Chen

    2018-01-01

Airplane detection in remote sensing images remains a challenging problem due to the complexity of backgrounds. In recent years, with the development of deep learning, object detection has also achieved great breakthroughs. For object detection tasks in natural images, such as the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) VOC (Visual Object Classes) Challenge, the major trend of current development is to use a large amount of labeled classification data to pre-train the deep neural network as a base network, and then use a small amount of annotated detection data to fine-tune the network for detection. In this paper, we use object detection technology based on deep learning for airplane detection in remote sensing images. In addition to exploiting some characteristics of remote sensing images, we propose some new data augmentation techniques. We also use transfer learning and adopt a single deep convolutional neural network and limited training samples to implement end-to-end trainable airplane detection. Classification and positioning are no longer divided into multistage tasks; end-to-end detection attempts to combine them for optimization, which ensures an optimal solution for the final stage. In our experiment, we use remote sensing images of airports collected from Google Earth. The experimental results show that the proposed algorithm is highly accurate and meaningful for remote sensing object detection.
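
    A minimal sketch of the transfer-learning idea described above, using an ImageNet-pretrained backbone fine-tuned as a binary airplane/background patch classifier; the data loader and hyperparameters are hypothetical, and the paper's actual detector is more elaborate than this classification example.

```python
import torch.nn as nn
import torch.optim as optim
from torchvision.models import resnet50, ResNet50_Weights  # torchvision >= 0.13

# Load an ImageNet-pretrained backbone, freeze it, and train only a new head
# on a small airplane/background patch dataset (hypothetical loader).
model = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False                       # freeze pretrained weights
model.fc = nn.Linear(model.fc.in_features, 2)     # airplane vs. background

optimizer = optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def fine_tune(loader, epochs=5):
    """Fine-tune the classification head on (patches, labels) batches."""
    model.train()
    for _ in range(epochs):
        for patches, labels in loader:            # (B, 3, 224, 224), (B,)
            optimizer.zero_grad()
            loss = criterion(model(patches), labels)
            loss.backward()
            optimizer.step()
```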

  17. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    Science.gov (United States)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization coupled with technologies from others in the community are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded multi-institutional large Information Technology Research effort. The goal of LEAD is to create an

  18. Sieving of H2 and D2 Through End-to-End Nanotubes

    Science.gov (United States)

Dasgupta, Devagnik; Searles, Debra J.; Rondoni, Lamberto; Bernardi, Stefano

    2014-10-01

We study the quantum molecular sieving of H2 and D2 through two nanotubes placed end-to-end. An analytic treatment, assuming that the particles have classical motion along the axis of the nanotube and are confined in a potential well in the radial direction, is considered. Using this idealized model, and under certain conditions, it is found that this device can act as a complete sieve, allowing chemically pure deuterium to be isolated from an isotope mixture. We also consider a more realistic model of two carbon nanotubes and carry out molecular dynamics simulations using a Feynman-Hibbs potential to model the quantum effects on the dynamics of H2 and D2. Sieving is also observed in this case, but is caused by a different process.

  19. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
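
    As an illustration of the kind of table schema such a data definition language might create for data entities and entity relationships, a small Python/SQLite sketch follows; the table and column names are hypothetical and not taken from the patent.

```python
import sqlite3

# Hypothetical, simplified schema for a BIM-style data model:
# entities plus typed relationships between them.
schema = """
CREATE TABLE IF NOT EXISTS entity (
    id     INTEGER PRIMARY KEY,
    type   TEXT NOT NULL,          -- e.g. 'Wall', 'Space', 'Sensor'
    name   TEXT,
    props  TEXT                    -- JSON blob of attributes
);
CREATE TABLE IF NOT EXISTS relationship (
    source INTEGER NOT NULL REFERENCES entity(id),
    target INTEGER NOT NULL REFERENCES entity(id),
    kind   TEXT NOT NULL           -- e.g. 'contains', 'adjacent_to'
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute("INSERT INTO entity (type, name, props) VALUES (?, ?, ?)",
             ("Wall", "W-101", '{"thickness_mm": 200}'))
conn.commit()
```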

  20. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  1. End-to-end simulations of the E-ELT/METIS coronagraphs

    Science.gov (United States)

    Carlomagno, Brunella; Absil, Olivier; Kenworthy, Matthew; Ruane, Garreth; Keller, Christoph U.; Otten, Gilles; Feldt, Markus; Hippler, Stefan; Huby, Elsa; Mawet, Dimitri; Delacroix, Christian; Surdej, Jean; Habraken, Serge; Forsberg, Pontus; Karlsson, Mikael; Vargas Catalan, Ernesto; Brandl, Bernhard R.

    2016-07-01

    The direct detection of low-mass planets in the habitable zone of nearby stars is an important science case for future E-ELT instruments such as the mid-infrared imager and spectrograph METIS, which features vortex phase masks and apodizing phase plates (APP) in its baseline design. In this work, we present end-to-end performance simulations, using Fourier propagation, of several METIS coronagraphic modes, including focal-plane vortex phase masks and pupil-plane apodizing phase plates, for the centrally obscured, segmented E-ELT pupil. The atmosphere and the AO contributions are taken into account. Hybrid coronagraphs combining the advantages of vortex phase masks and APPs are considered to improve the METIS coronagraphic performance.

  2. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.; Bailey, Ian [Lancaster U.; Herrod, Alexander [Liverpool U.; Morgan, James [Fermilab; Morse, William [RIKEN BNL; Stratakis, Diktys [RIKEN BNL; Tishchenko, Vladimir [RIKEN BNL; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.

  3. Establishing end-to-end security in a nationwide network for telecooperation.

    Science.gov (United States)

    Staemmler, Martin; Walz, Michael; Weisser, Gerald; Engelmann, Uwe; Weininger, Robert; Ernstberger, Antonio; Sturm, Johannes

    2012-01-01

Telecooperation is used to support care for trauma patients by facilitating a mutual exchange of treatment and image data in use cases such as emergency consultation, second opinion, transfer, rehabilitation and outpatient after-treatment. To comply with data protection legislation, two-factor authentication based on ownership and knowledge has been implemented to ensure personalized access rights. End-to-end security is achieved by symmetric encryption in combination with external trusted services which provide the symmetric key solely at runtime. Telecooperation partners may be chosen at the departmental level, but only individuals of that department are granted access, based on checks of the organizational assignments maintained by LDAP services. Data protection officers of a federal state have accepted the data protection measures. The telecooperation platform is in routine operation and designed to serve up to 800 trauma centers in Germany, organized in more than 50 trauma networks.
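
    A minimal sketch of symmetric encryption with a key obtained only at runtime, in the spirit of the approach described above; the key service is mocked here, and the production platform's actual interfaces are not shown.

```python
from cryptography.fernet import Fernet

def fetch_session_key() -> bytes:
    """Placeholder for the external trusted key service; in the real
    platform the symmetric key is provided solely at runtime."""
    return Fernet.generate_key()

key = fetch_session_key()
cipher = Fernet(key)

report = b"pseudonymized trauma treatment record"
token = cipher.encrypt(report)           # ciphertext transmitted over the network
assert cipher.decrypt(token) == report   # only holders of the runtime key can read it
```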

  4. Kinetics of contact formation and end-to-end distance distributions of swollen disordered peptides.

    Science.gov (United States)

    Soranno, Andrea; Longhi, Renato; Bellini, Tommaso; Buscaglia, Marco

    2009-02-18

Unstructured polypeptide chains are subject to various degrees of swelling or compaction depending on the combination of solvent condition and amino acid sequence. Highly denatured proteins generally behave like random coils with excluded volume repulsion, whereas in aqueous buffer more compact conformations have been observed for the low-populated unfolded state of globular proteins as well as for naturally disordered sequences. To quantitatively account for the different mechanisms inducing the swelling of polypeptides, we have examined three 14-residue peptides in aqueous buffer and in denaturant solutions, including the well-characterized AGQ repeat as a reference and two variants, in which we have successively introduced charged side chains and removed the glycines. Quenching of the triplet state of tryptophan by close contact with cysteine has been used in conjunction with Förster resonance energy transfer to study the equilibrium and kinetic properties of the peptide chains. The experiments give access to the end-to-end root-mean-square distance, the probability of end-to-end contact formation, and the intrachain diffusion coefficient. The data can be coherently interpreted on the basis of a simple chain model with backbone angles obtained from a library of coil segments of proteins and hard-sphere repulsion at each Cα position. In buffered water, we find that introducing charges in a glycine-rich sequence induces a mild chain swelling and a significant speed-up of the intrachain dynamics, whereas the removal of the glycines results in almost a two-fold increase of the chain volume and a drastic slowing down. In denaturants we observe a pronounced swelling of all the chains, with significant differences between the effect of urea and guanidinium chloride.

  5. Integrating end-to-end threads of control into object-oriented analysis and design

    Science.gov (United States)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  6. Profiling wind and greenhouse gases by infrared-laser occultation: results from end-to-end simulations in windy air

    Directory of Open Access Journals (Sweden)

    A. Plach

    2015-07-01

The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed with focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has been recently demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting. Here we use a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both stand-alone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm⁻¹ and exploits transmission differences from a wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s⁻¹ over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to the decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s⁻¹ but is found to benefit in the case of higher speeds from the integrated wind retrieval that enables correction of wind-induced Doppler shift of GHG signals. Overall both the l.o.s. wind and GHG retrieval results are strongly encouraging towards further development and
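
    The core of the wind retrieval is the relation between the line-of-sight wind, the Doppler shift of the absorption line, and the differential transmission measured at the line-wing inflection points; a simplified first-order sketch of that relation is given below (it is not the full retrieval algorithm).

```latex
% First-order sketch of the wind retrieval principle (not the full algorithm):
% T(\nu) is the transmission profile of the target line at \nu_0 (~4767 cm^-1),
% and \Delta T is the wind-induced transmission difference at an inflection point.
\begin{align}
  \Delta\nu_{D} &= \nu_{0}\,\frac{v_{\mathrm{los}}}{c}, \\
  \Delta T &\approx \left.\frac{\mathrm{d}T}{\mathrm{d}\nu}\right|_{\mathrm{infl}}
                  \Delta\nu_{D}
  \quad\Longrightarrow\quad
  v_{\mathrm{los}} \approx
  \frac{c\,\Delta T}{\nu_{0}\left.\dfrac{\mathrm{d}T}{\mathrm{d}\nu}\right|_{\mathrm{infl}}}.
\end{align}
```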

  7. MO-B-BRB-04: 3D Dosimetry in End-To-End Dosimetry QA

    Energy Technology Data Exchange (ETDEWEB)

    Ibbott, G. [UT MD Anderson Cancer Center (United States)

    2016-06-15

... irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang, Special techniques in the clinic and research: understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and observe the benefits of high-resolution measurements for precision therapy in SRS and in MicroSBRT for small-animal irradiators. Geoffrey S. Ibbott, 3D dosimetry in end-to-end dosimetry QA: understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Funding and disclosures: Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  8. End-to-End Printed-Circuit Board Assembly Design Using Altium Designer and Solid Works Systems

    National Research Council Canada - National Science Library

    A. M. Goncharenko; A. E. Kurnosenko; V. G. Kostikov; A. V. Lavrov; V. A. Soloviev

    2015-01-01

    .... The article offers an alternate approach to the end-to-end simultaneous development in Altium Designer / Solid Works CAD/CAE systems, which enables a radically shortened time to design new devices...

  9. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels are usually not the desired output, but just an intermediary step. End-to-end (E2E) models, which take raw text as input and produce the desired output directly, need not depend on token-level labels. We propose an E2E model based on pointer networks, which can be trained directly on pairs of raw input and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...
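
    A minimal sketch of the pointing step a pointer-network decoder uses to select input tokens; the shapes and module names are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PointerAttention(nn.Module):
    """Score each encoder position against the current decoder state and
    return a distribution over input tokens (the 'pointer')."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        scores = self.v(torch.tanh(
            self.w_enc(enc_states) + self.w_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                         # (batch, src_len)
        return torch.softmax(scores, dim=-1)   # pointer distribution over input tokens

# Example shapes: batch of 2, source length 7
enc = torch.randn(2, 7, 64)
dec = torch.randn(2, 32)
probs = PointerAttention(64, 32, 50)(enc, dec)   # (2, 7), each row sums to 1
```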

  10. Functional Partitioning to Optimize End-to-End Performance on Many-core Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Li, Min [Virginia Polytechnic Institute and State University (Virginia Tech); Vazhkudai, Sudharshan S [ORNL; Butt, Ali R [Virginia Polytechnic Institute and State University (Virginia Tech); Meng, Fei [ORNL; Ma, Xiaosong [ORNL; Kim, Youngjae [ORNL; Engelmann, Christian [ORNL; Shipman, Galen M [ORNL

    2010-01-01

Scaling computations on emerging massive-core supercomputers is a daunting task, which, coupled with the significantly lagging system I/O capabilities, degrades applications' end-to-end performance. The I/O bottleneck often negates potential performance benefits of assigning additional compute cores to an application. In this paper, we address this issue via a novel functional partitioning (FP) runtime environment that allocates cores to specific application tasks - checkpointing, de-duplication, and scientific data format transformation - so that the deluge of cores can be brought to bear on the entire gamut of application activities. The focus is on utilizing the extra cores to support HPC application I/O activities and also leverage solid-state disks in this context. For example, our evaluation shows that dedicating one core on an eight-core machine to checkpointing and its assist tasks using FP can improve the overall execution time of a FLASH benchmark on 80 and 160 cores by 43.95% and 41.34%, respectively.

  11. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

Automatic affect recognition is a challenging task due to the various modalities with which emotions can be expressed. Applications can be found in many domains, including multimedia retrieval and human-computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this purpose, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, a machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion and, by also taking advantage of the correlations between the streams, manages to significantly outperform the traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.
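
    A compact sketch of the two-stream fusion idea (audio CNN and 50-layer ResNet features concatenated per frame and fed to an LSTM); the audio network, layer sizes, and output head are placeholders rather than the authors' exact architecture.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights  # torchvision >= 0.13

class AudioCNN(nn.Module):
    """Toy 1-D CNN over raw waveform frames (placeholder for the paper's audio net)."""
    def __init__(self, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, out_dim))

    def forward(self, x):                      # x: (batch*frames, 1, samples)
        return self.net(x)

class MultimodalEmotion(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.audio = AudioCNN(128)
        vision = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)
        vision.fc = nn.Identity()              # expose 2048-d visual features
        self.vision = vision
        self.lstm = nn.LSTM(128 + 2048, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)       # e.g. arousal and valence

    def forward(self, wav, frames):
        # wav: (B, T, 1, samples); frames: (B, T, 3, 224, 224)
        B, T = wav.shape[:2]
        a = self.audio(wav.flatten(0, 1)).view(B, T, -1)
        v = self.vision(frames.flatten(0, 1)).view(B, T, -1)
        out, _ = self.lstm(torch.cat([a, v], dim=-1))
        return self.head(out)                  # per-frame emotion estimates
```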

  12. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Science.gov (United States)

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies of varying size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.
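
    Surface area and enclosed volume, the two quantities reported above, can be computed directly from a closed triangle mesh; a small NumPy sketch follows (it assumes a watertight mesh with consistent outward-facing triangle winding and is not tied to the freeware mentioned in the record).

```python
import numpy as np

def surface_area_and_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh.
    vertices: (N, 3) float array; faces: (M, 3) int array of vertex indices."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    cross = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()           # sum of triangle areas
    # Signed tetrahedron volumes against the origin (divergence theorem).
    volume = np.abs(np.einsum("ij,ij->i", v0, np.cross(v1, v2)).sum()) / 6.0
    return area, volume

# Example: a unit cube triangulated into 12 faces gives area 6.0 and volume 1.0.
```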

  13. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Directory of Open Access Journals (Sweden)

    Luis Gutierrez-Heredia

Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, while only a very small percentage have been surveyed. Advances in technology and processing along with affordable underwater cameras and Internet availability give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also the single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies of varying size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.

  14. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features: easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows seamless detection of those patterns over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams into event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
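
    SCEPter's unifying idea, one pattern evaluated over both archived and future events, can be pictured with a deliberately small sketch. The code below illustrates only that idea (the function and event fields are hypothetical, not SCEPter's API), and it omits the boundary deduplication that the framework handles via query rewriting.

    ```python
    import itertools

    def matches(event, pattern):
        """Return True if an event satisfies a simple attribute-equality pattern."""
        return all(event.get(k) == v for k, v in pattern.items())

    def unified_query(archived_events, live_stream, pattern):
        """Yield matches over archived events first, then over the live stream.

        A real engine must also deduplicate events on the temporal boundary
        between repository and stream; that logic is omitted here.
        """
        for event in itertools.chain(archived_events, live_stream):
            if matches(event, pattern):
                yield event

    # Hypothetical Smart Grid example: flag meter readings from a given feeder.
    archive = [{"type": "reading", "feeder": "F1", "kw": 120}]
    stream = iter([{"type": "reading", "feeder": "F1", "kw": 95},
                   {"type": "reading", "feeder": "F2", "kw": 80}])
    for e in unified_query(archive, stream, {"type": "reading", "feeder": "F1"}):
        print(e)
    ```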

  15. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
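
    The quoted agreement of about 1.5% plus 1.5 mm translates directly into a range margin. The short calculation below is a worked illustration of that rule; the specific beam range is a made-up example, not a value from the study.

    ```python
    def range_margin_mm(nominal_range_mm, pct=0.015, fixed_mm=1.5):
        """Range uncertainty margin of the form (a% of range) + b mm."""
        return pct * nominal_range_mm + fixed_mm

    # Example: a 150 mm (15 cm) beam range carries a margin of
    # 0.015 * 150 + 1.5 = 3.75 mm under this rule.
    print(range_margin_mm(150.0))  # 3.75
    ```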

  16. End-to-End Optimization of High-Throughput DNA Sequencing.

    Science.gov (United States)

    O'Reilly, Eliza; Baccelli, Francois; De Veciana, Gustavo; Vikalo, Haris

    2016-10-01

    At the core of Illumina's high-throughput DNA sequencing platforms lies a biophysical surface process, bridge amplification, that results in a random geometry of clusters of homogeneous short DNA fragments, typically hundreds of base pairs long. The statistical properties of this random process and the lengths of the fragments are critical, as they affect the information that can be subsequently extracted, that is, the density of successfully inferred DNA fragment reads. The ensembles of overlapping DNA fragment reads are then used to computationally reconstruct the much longer target genome sequence. The success of the reconstruction in turn depends on having a sufficiently large ensemble of DNA fragments that are sufficiently long. In this article, using stochastic geometry, we model and optimize the end-to-end flow cell synthesis and target genome sequencing process, linking, and partially controlling, the statistics of the physical processes to the success of the final computational step. Based on a rough calibration of our model, we provide, for the first time, a mathematical framework capturing the salient features of the sequencing platform that serves as a basis for optimizing cost and performance, and/or for sensitivity analysis with respect to various parameters.
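
    The paper's stochastic-geometry model is far more detailed than anything shown here, but the classic Lander-Waterman estimate conveys why both the number and the length of fragment reads matter for reconstruction: with coverage c = N*L/G, roughly a fraction exp(-c) of the genome remains uncovered. The sketch below computes that estimate with invented example numbers.

    ```python
    import math

    def expected_uncovered_fraction(num_reads, read_length, genome_length):
        """Classic Lander-Waterman (Poisson) estimate, not the paper's model:
        the fraction of genome bases covered by zero reads is about exp(-c),
        where c is the mean coverage."""
        coverage = num_reads * read_length / genome_length
        return math.exp(-coverage)

    # Example with made-up numbers: 10 million reads of 300 bp over a 3 Gb genome
    # give ~1x coverage, leaving roughly 37% of bases uncovered; 100 million reads
    # (10x coverage) leave only ~0.005%.
    print(expected_uncovered_fraction(1e7, 300, 3e9))
    print(expected_uncovered_fraction(1e8, 300, 3e9))
    ```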

  17. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Full Text Available Software-defined networks (SDN) are a novel networking paradigm that has become an enabling technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature, which creates an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which reasoning about confidentiality is possible, and in which we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors’ wording.

  18. The optical performance of the PILOT instrument from ground end-to-end tests

    Science.gov (United States)

    Misawa, R.; Bernard, J.-Ph.; Longval, Y.; Ristorcelli, I.; Ade, P.; Alina, D.; André, Y.; Aumont, J.; Bautista, L.; de Bernardis, P.; Boulade, O.; Bousqet, F.; Bouzit, M.; Buttice, V.; Caillat, A.; Chaigneau, M.; Charra, M.; Crane, B.; Douchin, F.; Doumayrou, E.; Dubois, J. P.; Engel, C.; Griffin, M.; Foenard, G.; Grabarnik, S.; Hargrave, P.; Hughes, A.; Laureijs, R.; Leriche, B.; Maestre, S.; Maffei, B.; Marty, C.; Marty, W.; Masi, S.; Montel, J.; Montier, L.; Mot, B.; Narbonne, J.; Pajot, F.; Pérot, E.; Pimentao, J.; Pisano, G.; Ponthieu, N.; Rodriguez, L.; Roudil, G.; Salatino, M.; Savini, G.; Simonella, O.; Saccoccio, M.; Tauber, J.; Tucker, C.

    2017-06-01

    The Polarized Instrument for Long-wavelength Observation of the Tenuous interstellar medium (PILOT) is a balloon-borne astronomy experiment designed to study the linear polarization of thermal dust emission in two photometric bands centred at wavelengths 240 μm (1.2 THz) and 550 μm (545 GHz), with an angular resolution of a few arcminutes. Several end-to-end tests of the instrument were performed on the ground between 2012 and 2014, in order to prepare for the first scientific flight of the experiment that took place in September 2015 from Timmins, Ontario, Canada. This paper presents the results of those tests, focussing on an evaluation of the instrument's optical performance. We quantify image quality across the extent of the focal plane, and describe the tests that we conducted to determine the focal plane geometry, the optimal focus position, and sources of internal straylight. We present estimates of the detector response, obtained using an internal calibration source, and estimates of the background intensity and background polarization.

  19. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    Science.gov (United States)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL Flight Projects. In particular, SPAT Telecommunication's role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the ground-generated CRC flags on received spacecraft telemetry minor frames and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 x 10(exp -8). This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
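
    As a rough illustration of how per-segment bit error rates can be derived from frame-level error flags, the sketch below assumes each flagged frame contains at least one bit error, which gives an order-of-magnitude estimate; it does not reproduce the actual TOPEX/Poseidon algorithms.

    ```python
    def estimate_ber(flagged_frames, total_frames, bits_per_frame):
        """Crude BER estimate from frame-level error flags.

        Assumes (optimistically) one bit error per flagged frame, so the result
        is an order-of-magnitude indicator rather than a true BER measurement.
        """
        if total_frames == 0:
            return float("nan")
        return flagged_frames / (total_frames * bits_per_frame)

    # Hypothetical numbers: 3 flagged frames out of 1e6 frames of 8192 bits each.
    print(estimate_ber(3, 1_000_000, 8192))  # ~3.7e-10
    ```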

  20. Impacts of the Deepwater Horizon oil spill evaluated using an end-to-end ecosystem model.

    Science.gov (United States)

    Ainsworth, Cameron H; Paris, Claire B; Perlin, Natalie; Dornberger, Lindsey N; Patterson, William F; Chancellor, Emily; Murawski, Steve; Hollander, David; Daly, Kendra; Romero, Isabel C; Coleman, Felicia; Perryman, Holly

    2018-01-01

    We use a spatially explicit biogeochemical end-to-end ecosystem model, Atlantis, to simulate impacts from the Deepwater Horizon oil spill and subsequent recovery of fish guilds. Dose-response relationships with expected oil concentrations were utilized to estimate the impact on fish growth and mortality rates. We also examine the effects of fisheries closures and impacts on recruitment. We validate predictions of the model by comparing population trends and age structure before and after the oil spill with fisheries independent data. The model suggests that recruitment effects and fishery closures had little influence on biomass dynamics. However, at the assumed level of oil concentrations and toxicity, impacts on fish mortality and growth rates were large and commensurate with observations. Sensitivity analysis suggests the biomass of large reef fish decreased by 25% to 50% in areas most affected by the spill, and biomass of large demersal fish decreased even more, by 40% to 70%. Impacts on reef and demersal forage caused starvation mortality in predators and increased reliance on pelagic forage. Impacts on the food web translated effects of the spill far away from the oiled area. Effects on age structure suggest possible delayed impacts on fishery yields. Recovery of high-turnover populations generally is predicted to occur within 10 years, but some slower-growing populations may take 30+ years to fully recover.

  1. An end-to-end anastomosis model of guinea pig bile duct: A 6-mo observation

    Science.gov (United States)

    Zhang, Xiao-Qing; Tian, Yuan-Hu; Xu, Zhi; Wang, Li-Xin; Hou, Chun-Sheng; Ling, Xiao-Feng; Zhou, Xiao-Si

    2011-01-01

    AIM: To establish the end-to-end anastomosis (EEA) model of guinea pig bile duct and evaluate the healing process of the bile duct. METHODS: Thirty-two male guinea pigs were randomly divided into a control group and 2-, 3-, and 6-mo groups after establishment of the EEA model. Histological, immunohistochemical and serologic tests as well as measurement of bile contents were performed. The bile duct diameter and the diameter ratio (DR) were measured to assess the formation of relative stricture. RESULTS: Acute and chronic inflammatory reactions occurred throughout the healing process of the bile duct. Serologic tests and bile content measurements showed no formation of persistent stricture in the 6-mo group. The DR revealed a transient formation of relative stricture in the 2-mo group in comparison with the control group (2.94 ± 0.17 vs 1.89 ± 0.27, P = 0.004). However, this relative stricture was released in the 6-mo group (2.14 ± 0.18, P = 0.440). CONCLUSION: A simple and reliable EEA model of the guinea pig bile duct can be established with good reproducibility and a satisfactory survival rate. PMID:21390151

  2. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    Science.gov (United States)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  3. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    Science.gov (United States)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and describe Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  4. An anthropomorphic multimodality (CT/MRI) phantom prototype for end-to-end tests in radiation therapy

    CERN Document Server

    Gallas, Raya R; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2014-01-01

    With the increasing complexity of external beam therapy, so-called "end-to-end" tests are intended to cover all steps from therapy planning to follow-up, in order to fulfill the high demands on quality assurance. As magnetic resonance imaging (MRI) gains growing importance in the treatment process, and established phantoms (such as the Alderson head) cannot be used for those tests, novel multimodality phantoms have to be developed. Here, we present a feasibility study for such a customizable multimodality head phantom. We used a set of patient CT images as the basis for the anthropomorphic head shape. The recipient, consisting of an epoxy resin, was produced using rapid prototyping (3D printing). The phantom recipient includes a nasal air cavity, two soft tissue volumes and cranial bone. Additionally, a spherical tumor volume was positioned in the center. The volumes were filled with dipotassium phosphate-based cranial bone surrogate, agarose gel, and distilled water. The tumor volume was filled with normoxic dosimetr...

  5. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA / NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  6. jade: An End-To-End Data Transfer and Catalog Tool

    Science.gov (United States)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  7. A Mechanistic End-to-End Concussion Model That Translates Head Kinematics to Neurologic Injury

    Directory of Open Access Journals (Sweden)

    Laurel J. Ng

    2017-06-01

    Full Text Available Past concussion studies have focused on understanding the injury processes occurring on discrete length scales (e.g., tissue-level stresses and strains, cell-level stresses and strains, or injury-induced cellular pathology). A comprehensive approach that connects all length scales and relates measurable macroscopic parameters to neurological outcomes is the first step toward rationally unraveling the complexity of this multi-scale system, for better guidance of future research. This paper describes the development of the first quantitative end-to-end (E2E) multi-scale model that links gross head motion to neurological injury by integrating fundamental elements of tissue and cellular mechanical response with axonal dysfunction. The model quantifies axonal stretch (i.e., tension) injury in the corpus callosum, with axonal functionality parameterized in terms of axonal signaling. An internal injury correlate is obtained by calculating a neurological injury measure (the average reduction in the axonal signal amplitude over the corpus callosum). By using a neurologically based quantity rather than externally measured head kinematics, the E2E model is able to unify concussion data across a range of exposure conditions and species with greater sensitivity and specificity than correlates based on external measures. In addition, this model quantitatively links injury of the corpus callosum to observed specific neurobehavioral outcomes that reflect clinical measures of mild traumatic brain injury. This comprehensive modeling framework provides a basis for the systematic improvement and expansion of this mechanistic-based understanding, including widening the range of neurological injury estimation, improving concussion risk correlates, guiding the design of protective equipment, and setting safety standards.

  8. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  9. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    Science.gov (United States)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major Reanalysis projects, currently ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system include the full spectrum of Climate Model Data Services: Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data includes standard Reanalysis model output, and will be expanded to include gridded observations, and gridded Innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an Ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that provides the ability for both reanalysis scientists and those scientists in need of reanalysis output to identify the data of interest, compare, compute, visualize, and research without the need for transferring large volumes of data, performing time consuming format conversions, and writing code for frequently run computations and visualizations.

  10. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need to develop an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL Database Server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, the WOFpy (WaterOneFlow Web Services in Python) package developed by the Texas Water Development Board for ‘Water Data for Texas’, which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL, will be used. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (Scipy, Numpy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
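
    As a rough sketch of what a Django-based ODM data model can look like, the snippet below defines a minimal site/variable/value schema; the field names follow the spirit of the CUAHSI ODM but are illustrative, not the actual DjangODM schema.

    ```python
    # models.py -- minimal, illustrative Django models in the spirit of the
    # CUAHSI Observations Data Model; not the actual DjangODM schema.
    from django.db import models


    class Site(models.Model):
        code = models.CharField(max_length=50, unique=True)
        name = models.CharField(max_length=255)
        latitude = models.FloatField()
        longitude = models.FloatField()


    class Variable(models.Model):
        code = models.CharField(max_length=50, unique=True)
        name = models.CharField(max_length=255)       # e.g. "Rainfall"
        units = models.CharField(max_length=100)      # e.g. "millimeter"


    class DataValue(models.Model):
        site = models.ForeignKey(Site, on_delete=models.CASCADE)
        variable = models.ForeignKey(Variable, on_delete=models.CASCADE)
        value = models.FloatField()
        timestamp = models.DateTimeField(db_index=True)
        quality_code = models.CharField(max_length=20, default="raw")
    ```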

  11. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the
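
    To illustrate what "CF-compliant netCDF" means in practice, the short sketch below writes a single surface-pH time series with CF-style attributes using the netCDF4 Python library; the file name and data values are invented for illustration and do not represent NODC holdings.

    ```python
    import numpy as np
    from netCDF4 import Dataset

    # Illustrative only: a tiny CF-style file with one surface-pH time series.
    with Dataset("oa_example.nc", "w", format="NETCDF4") as nc:
        nc.Conventions = "CF-1.6"
        nc.title = "Example ocean acidification time series (synthetic data)"

        nc.createDimension("time", None)
        time = nc.createVariable("time", "f8", ("time",))
        time.units = "days since 2012-01-01 00:00:00"
        time.standard_name = "time"

        ph = nc.createVariable("ph", "f4", ("time",))
        ph.standard_name = "sea_water_ph_reported_on_total_scale"
        ph.units = "1"

        time[:] = np.arange(5)
        ph[:] = [8.10, 8.09, 8.11, 8.08, 8.07]
    ```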

  12. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    Science.gov (United States)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially-available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor to database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off the shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long haul communication, such as cellular and satellite links. Our system also features reliable data storage

  13. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with Amazon and eBay e-commerce sites and content sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or a desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  14. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly to the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness and preparedness towards tsunami threats. Additionally, examples will be given of the potential of operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  15. An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of ...

    African Journals Online (AJOL)

    In order to better understand ecosystem functioning under simultaneous pressures (e.g. both climate change and fishing pressures), integrated modelling approaches are advocated. We developed an end-to-end model of the southern Benguela ecosystem by coupling the high trophic level model OSMOSE with a ...

  16. Sutureless functional end-to-end anastomosis using a linear stapler with polyglycolic acid felt for intestinal anastomoses

    Directory of Open Access Journals (Sweden)

    Masanori Naito, MD, PhD

    2017-05-01

    Conclusion: Sutureless functional end-to-end anastomosis using the Endo GIA™ Reinforced appears to be safe, efficacious, and straightforward. Reinforcement of the crotch site with a bioabsorbable polyglycolic acid sheet appears to mitigate conventional problems with crotch-site vulnerability.

  17. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  18. End-to-End demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power conversion and ion engine operation

    Science.gov (United States)

    Hrbud, Ivana; van Dyke, Melissa; Houts, Mike; Goodfellow, Keith

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular non-nuclear testing and system integration, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system and a thruster, where the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  19. A GMPLS/OBS network architecture enabling QoS-aware end-to-end burst transport

    OpenAIRE

    Pedroso, Pedro; Perelló Muntan, Jordi; Spadaro, Salvatore; Careglio, Davide; Solé Pareta, Josep; Klinkowski, Miroslaw

    2010-01-01

    This paper introduces a Generalized Multi-Protocol Label Switching (GMPLS)-enabled Optical Burst Switched (OBS) network architecture featuring end-to-end QoS-aware burst transport services. This is achieved by setting up burst Label Switched Paths (LSPs) properly dimensioned to match specific burst drop probability requirements. These burst LSPs are used for specific guaranteed QoS levels, whereas the remaining network capacity can be left for best-effort burst support. Aiming to ensure...
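
    Dimensioning a burst LSP to meet a target drop probability is, in the simplest case, a capacity-allocation problem. The sketch below uses the classical Erlang B formula to find the number of wavelength channels needed for a given offered burst load and target drop probability; it is a generic illustration, not the dimensioning procedure of the cited architecture.

    ```python
    def erlang_b(offered_load, channels):
        """Erlang B blocking probability, computed with the stable recursion."""
        b = 1.0
        for m in range(1, channels + 1):
            b = (offered_load * b) / (m + offered_load * b)
        return b

    def channels_for_target(offered_load, target_drop):
        """Smallest number of channels whose Erlang B blocking meets the target."""
        n = 1
        while erlang_b(offered_load, n) > target_drop:
            n += 1
        return n

    # Example with made-up numbers: 20 Erlang of offered burst traffic and a
    # 1e-3 drop-probability requirement; prints the required channel count
    # (in the mid-30s for these numbers).
    print(channels_for_target(20.0, 1e-3))
    ```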

  20. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

    This thesis presents an end-to-end interference minimizing uniquely designed high performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers to deliver a seamless high performance I/O stack. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  1. Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media

    Science.gov (United States)

    Park, Ju-Won; Kim, JongWon

    2004-10-01

    As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, the applicability of a hybrid monitoring scheme that combines active and passive monitoring is investigated. The active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while the passive monitoring checks both application-layer metrics (i.e., user traffic condition by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve the end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
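
    The value of combining the two measurement paths is easiest to see in a toy diagnostic rule: if active probes show a clean network but passive RTCP statistics report losses, the bottleneck is likely at the end system rather than in the network. The sketch below is an illustrative simplification with invented thresholds, not the paper's monitoring system.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ActiveProbe:          # network-layer view from probe packets
        loss_rate: float
        jitter_ms: float

    @dataclass
    class PassiveStats:         # application/system view from RTCP and host metrics
        rtcp_loss_rate: float
        cpu_utilization: float

    def diagnose(active: ActiveProbe, passive: PassiveStats) -> str:
        """Toy rule set for localizing the cause of media-quality degradation."""
        if active.loss_rate > 0.01 or active.jitter_ms > 30:
            return "network path degradation"
        if passive.rtcp_loss_rate > 0.01 and passive.cpu_utilization > 0.9:
            return "end-system overload (receiver cannot keep up)"
        if passive.rtcp_loss_rate > 0.01:
            return "application-level loss despite healthy network"
        return "no degradation detected"

    print(diagnose(ActiveProbe(0.001, 5.0), PassiveStats(0.03, 0.95)))
    ```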

  2. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

    Full Text Available Abstract We present a sleep/wake scheduling protocol for minimizing end-to-end delay in event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from a connectivity perspective, and (c) whether the node is in the proximity of an occurring event. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes the throughput by minimizing the congestion at nodes having heavy traffic load. Simulations are carried out to evaluate the performance of the proposed protocol, by comparing its performance with the S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol has significantly reduced the end-to-end delay and has improved other QoS parameters, like average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.

  3. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

    We propose and verify methods based on the slip-spring (SSp) model [ Macromolecules 2005, 38, 14 ] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates, by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [ Macromolecules 1991, 24, 3587 ] and refine the effective chain friction in the thin and fat tubes based on Read et al. [ J. Rheol. 2012, 56, 823 ]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.
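
    Whatever model generates the chain trajectories, the quantity being predicted is the autocorrelation of the end-to-end vector. The sketch below shows the standard way to extract it from a stored trajectory of end-to-end vectors; this is a generic post-processing step with synthetic stand-in data, not the slip-spring code of the paper.

    ```python
    import numpy as np

    def end_to_end_autocorrelation(R, max_lag):
        """Normalized autocorrelation <R(0).R(t)> / <R.R> from a trajectory.

        R is an array of shape (n_frames, 3) holding the end-to-end vector of
        the probe chain at each stored time step.
        """
        n = len(R)
        norm = np.mean(np.sum(R * R, axis=1))
        acf = np.empty(max_lag)
        for lag in range(max_lag):
            acf[lag] = np.mean(np.sum(R[: n - lag] * R[lag:], axis=1)) / norm
        return acf

    # Example with a synthetic random-walk trajectory standing in for model output.
    rng = np.random.default_rng(0)
    R_traj = np.cumsum(rng.normal(size=(10_000, 3)), axis=0)
    print(end_to_end_autocorrelation(R_traj, 5))
    ```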

  4. End-to-end testing. [to verify electrical equipment failure due to carbon fibers released in aircraft-fuel fires

    Science.gov (United States)

    Pride, R. A.

    1979-01-01

    The principal objective of the demonstration tests discussed is to verify whether or not carbon fibers released by burning composite parts in aircraft-fuel fires can produce failures in electrical equipment. A secondary objective is to experimentally validate the analytical models for some of the key elements in the risk analysis. The approach to this demonstration testing is twofold: limited end-to-end tests are to be conducted in a shock tube, and planning for some large outdoor burn tests is underway.

  5. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated, end-to-end, multi-agent-based solution to provide the best-fit web service to the service requester based on QoS.
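
    A common way to rank functionally equivalent services by QoS is a weighted sum of normalized attribute values, where "benefit" attributes (e.g., availability) are maximized and "cost" attributes (e.g., response time, price) are minimized. The sketch below shows that generic scoring step with hypothetical candidates; it illustrates the idea only, not the multi-agent architecture proposed in the paper.

    ```python
    def score_service(qos, weights, is_benefit, bounds):
        """Weighted-sum QoS score in [0, 1]; higher is better.

        qos        : dict of raw attribute values for one service
        weights    : dict of attribute weights summing to 1
        is_benefit : dict marking attributes where larger values are better
        bounds     : dict of (min, max) observed across all candidate services
        """
        total = 0.0
        for attr, w in weights.items():
            lo, hi = bounds[attr]
            norm = 0.5 if hi == lo else (qos[attr] - lo) / (hi - lo)
            total += w * (norm if is_benefit[attr] else 1.0 - norm)
        return total

    # Hypothetical candidates for the same geospatial operation.
    candidates = {
        "wms_a": {"response_ms": 120, "availability": 0.999, "cost": 0.02},
        "wms_b": {"response_ms": 60,  "availability": 0.990, "cost": 0.05},
    }
    bounds = {"response_ms": (60, 120), "availability": (0.990, 0.999), "cost": (0.02, 0.05)}
    weights = {"response_ms": 0.5, "availability": 0.3, "cost": 0.2}
    is_benefit = {"response_ms": False, "availability": True, "cost": False}
    best = max(candidates, key=lambda s: score_service(candidates[s], weights, is_benefit, bounds))
    print(best)
    ```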

  6. End-to-End Printed-Circuit Board Assembly Design Using Altium Designer and Solid Works Systems

    Directory of Open Access Journals (Sweden)

    A. M. Goncharenko

    2015-01-01

    Full Text Available The main goal of this paper is to investigate methods to accelerate the end-to-end simultaneous development of electronic PC assemblies in MCAD/ECAD systems. As the rates and quantities of produced electronic equipment rise, there is a need to bring new products to market faster. The article offers an alternative approach to end-to-end simultaneous development in the Altium Designer / Solid Works CAD/CAE systems, which radically shortens the time needed to design new devices and component databases. The first part of the paper analyses the methods and models used to solve the tasks of end-to-end simultaneous development of PC assemblies using the Circuit Works module for Solid Works. It examines the problems of traditional data exchange methods between Altium Designer and Solid Works arising from the limitations of the IDF 2.0 format, from problems with 3D models of components, and from the need to support two different databases. The second part gives guidelines and an example of end-to-end simultaneous PC assembly development using the Altium Modeler module for Solid Works aimed at Altium Designer, and presents a brief review of the algorithms. The proposed method neither requires an additional database nor uses an intermediate format such as IDF. The module translates the PCB model directly to Solid Works to generate the assembly model. The Altium Modeler is also capable of updating the assembly it created in Solid Works, which is very useful when components or the PCB itself are modified. This approach is better tailored to end-to-end development in terms of acceleration, ease of simultaneous work in different MCAD/ECAD systems, and elimination of errors arising from the need to support two CAD databases of the same functionality. In the conclusion, the paper gives suggestions for using the modules for simultaneous development of electronic PC assemblies in Altium Designer and Solid Works.

  7. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Full Text Available Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination, with amplify-and-forward at each relay stage. The subsets are chosen such that they maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain, versus the best-known distributed space-time coding strategies. A distributed compress and forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.
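
    A drastically simplified version of selection based on end-to-end performance is picking, for a two-hop amplify-and-forward link, the single relay whose cascaded SNR is largest. The sketch below implements that toy case (one antenna per stage, random example channels), not the paper's general subset selection over multi-antenna stages.

    ```python
    import numpy as np

    def af_end_to_end_snr(snr1, snr2):
        """Exact end-to-end SNR of a two-hop amplify-and-forward link."""
        return snr1 * snr2 / (snr1 + snr2 + 1.0)

    def select_best_relay(h_sr, h_rd, tx_snr):
        """Pick the relay index maximizing end-to-end AF SNR (toy single selection)."""
        best_idx, best_snr = None, -np.inf
        for i, (hsr, hrd) in enumerate(zip(h_sr, h_rd)):
            snr = af_end_to_end_snr(tx_snr * abs(hsr) ** 2, tx_snr * abs(hrd) ** 2)
            if snr > best_snr:
                best_idx, best_snr = i, snr
        # Rate ignores the half-duplex 1/2 pre-log factor for simplicity.
        return best_idx, np.log2(1.0 + best_snr)

    rng = np.random.default_rng(1)
    n_relays, tx_snr = 4, 10.0   # made-up example values
    h_sr = (rng.normal(size=n_relays) + 1j * rng.normal(size=n_relays)) / np.sqrt(2)
    h_rd = (rng.normal(size=n_relays) + 1j * rng.normal(size=n_relays)) / np.sqrt(2)
    print(select_best_relay(h_sr, h_rd, tx_snr))
    ```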

  8. Context-driven, prescription-based personal activity classification: methodology, architecture, and end-to-end implementation.

    Science.gov (United States)

    Xu, James Y; Chang, Hua-I; Chien, Chieh; Kaiser, William J; Pottie, Gregory J

    2014-05-01

    Enabling large-scale monitoring and classification of a range of motion activities is of primary importance due to the need by healthcare and fitness professionals to monitor exercises for quality and compliance. Past work has not fully addressed the unique challenges that arise from scaling. This paper presents a novel end-to-end system solution to some of these challenges. The system is built on the prescription-based context-driven activity classification methodology. First, we show that by refining the definition of context, and introducing the concept of scenarios, a prescription model can provide personalized activity monitoring. Second, through a flexible architecture constructed from interface models, we demonstrate the concept of a context-driven classifier. Context classification is achieved through a classification committee approach, and activity classification follows by means of context specific activity models. Then, the architecture is implemented in an end-to-end system featuring an Android application running on a mobile device, and a number of classifiers as core classification components. Finally, we use a series of experimental field evaluations to confirm the expected benefits of the proposed system in terms of classification accuracy, rate, and sensor operating life.
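
    The context-driven idea, classify the context first, then apply a context-specific activity model, can be sketched with standard tooling. The snippet below uses scikit-learn with synthetic data and a single context classifier in place of the full classification committee; it mirrors the structure described above but is not the paper's system.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Illustrative only: a context classifier routes samples to context-specific
    # activity models (hypothetical labels and synthetic features).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 6))                 # accelerometer-style features
    context = rng.integers(0, 2, size=600)        # 0 = "gym", 1 = "home"
    activity = rng.integers(0, 3, size=600)       # activity labels within each context

    context_clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, context)
    activity_clfs = {
        c: RandomForestClassifier(n_estimators=50, random_state=0).fit(
            X[context == c], activity[context == c]
        )
        for c in np.unique(context)
    }

    def classify(sample):
        """Predict the context first, then apply that context's activity model."""
        c = int(context_clf.predict(sample.reshape(1, -1))[0])
        a = int(activity_clfs[c].predict(sample.reshape(1, -1))[0])
        return c, a

    print(classify(X[0]))
    ```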

  9. Risk Factors for Dehiscence of Stapled Functional End-to-End Intestinal Anastomoses in Dogs: 53 Cases (2001-2012).

    Science.gov (United States)

    Snowdon, Kyle A; Smeak, Daniel D; Chiang, Sharon

    2016-01-01

    To identify risk factors for dehiscence in stapled functional end-to-end anastomoses (SFEEA) in dogs. Retrospective case series. Dogs (n = 53) requiring an enterectomy. Medical records from a single institution for all dogs undergoing an enterectomy (2001-2012) were reviewed. Surgeries were included when gastrointestinal (GIA) and thoracoabdominal (TA) stapling equipment was used to create a functional end-to-end anastomosis between segments of small intestine or small and large intestine in dogs. Information regarding preoperative, surgical, and postoperative factors was recorded. Anastomotic dehiscence was noted in 6 of 53 cases (11%), with a mortality rate of 83%. The only preoperative factor significantly associated with dehiscence was the presence of inflammatory bowel disease (IBD). Surgical factors significantly associated with dehiscence included the presence, duration, and number of intraoperative hypotensive periods, and location of anastomosis, with greater odds of dehiscence in anastomoses involving the large intestine. IBD, location of anastomosis, and intraoperative hypotension are risk factors for intestinal anastomotic dehiscence after SFEEA in dogs. Previously suggested risk factors (low serum albumin concentration, preoperative septic peritonitis, and intestinal foreign body) were not confirmed in this study. © Copyright 2015 by The American College of Veterinary Surgeons.

  10. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  11. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    The relationship between the molecular freedom of an amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through formation of an end-to-end disulfide bond between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core, since co-incubation with wild-type α-synuclein dramatically facilitated fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored once the structural restriction was unleashed with β-mercaptoethanol. The conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds was thus demonstrated to be critical for the fibrillar extension process, as it permits structural adjustment to a structure complementary to the assembly.

  12. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods in predictable structures on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene... It is shown that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using an excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule is connecting the two rods...

  13. Long-wavelength optical properties of a plasmonic crystal composed of end-to-end nanorod dimers

    Directory of Open Access Journals (Sweden)

    X. Q. Yu

    2013-06-01

    We theoretically investigate the long-wavelength optical properties of a plasmonic crystal composed of end-to-end gold nanorod dimers. The strong coupling between incident light and the electron oscillations inside the nanorods gives rise to a plasmon polariton, analogous to the phonon polariton in an ionic crystal. Huang-Kun-like equations are employed to explore the underlying physical mechanism for both symmetrical and asymmetrical geometries. In the long-wavelength limit, the macroscopic dielectric response of the proposed structure is deduced analytically. The polariton dispersion curve shows a typical anticrossing profile in the strong-coupling regime, and adjacent branches are separated by a Rabi splitting. The resultant polaritonic stop band is validated by numerical simulations.
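    For illustration of the anticrossing behaviour described above (a generic two-coupled-oscillator relation, not taken from the cited paper; \omega_a(k), \omega_b, and g are assumed symbols for the two uncoupled mode frequencies and their coupling strength):

    \[ \omega_{\pm}(k) \;=\; \frac{\omega_a(k) + \omega_b}{2} \;\pm\; \sqrt{\left(\frac{\omega_a(k) - \omega_b}{2}\right)^{2} + g^{2}}, \qquad \Omega_{\mathrm{R}} \;=\; \left.\big(\omega_{+} - \omega_{-}\big)\right|_{\omega_a=\omega_b} \;=\; 2g, \]

    so at resonance the upper and lower polariton branches are separated by the Rabi splitting 2g, which is the origin of the polaritonic stop band mentioned in the abstract.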

  14. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantifying its advantages for different operating scenarios and conditions. © 2011 IEEE.
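    As a hedged illustration of the quantities analyzed above (not the authors' derivation; Rayleigh fading, unit noise power, and the parameter values are assumptions), the end-to-end outage probability of a dual-hop decode-and-forward secondary link under a peak interference-power constraint can be estimated by Monte Carlo simulation:

        # Sketch only: dual-hop DF secondary link in a spectrum-sharing system.
        # The end-to-end SNR is the minimum of the two hop SNRs; transmit powers
        # are capped so the interference at the primary receiver never exceeds Q.
        import numpy as np

        rng = np.random.default_rng(0)

        def outage_probability(Q_dB=10.0, noise=1.0, rate_bps_hz=1.0, n=200_000):
            Q = 10 ** (Q_dB / 10)
            snr_th = 2 ** rate_bps_hz - 1        # SNR threshold for the target rate
            # Exponential(1) power gains: hop 1, hop 2, and the two links from the
            # secondary transmitters to the primary receiver (Rayleigh assumption).
            g1, g2, gp1, gp2 = rng.exponential(1.0, size=(4, n))
            p_src = Q / gp1                      # power keeping interference <= Q
            p_rel = Q / gp2
            snr1 = p_src * g1 / noise            # source -> relay
            snr2 = p_rel * g2 / noise            # relay -> destination
            snr_e2e = np.minimum(snr1, snr2)     # DF: limited by the weaker hop
            return np.mean(snr_e2e < snr_th)

        print(outage_probability())              # prints the estimated outage probability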

  15. Recipient vessel selection in the difficult neck: Outcomes of external carotid artery transposition and end-to-end microvascular anastomosis.

    Science.gov (United States)

    Garg, Ravi K; Poore, Samuel O; Wieland, Aaron M; Sanchez, Ruston; Baskaya, Mustafa K; Hartig, Gregory K

    2017-02-01

    Selection of recipient vessels for head and neck microvascular surgery may be limited in the previously dissected or irradiated neck. When distal branches of the external carotid artery (ECA) are unavailable, additional options for arterial inflow are needed. Here we propose high ligation of the ECA and transposition toward the lower neck as an alternative. After obtaining institutional approval, patients who underwent head and neck tumor resection and simultaneous free flap reconstruction were identified over a 5-year period. Patients whose recipient artery was listed in the operative report were included. Chart review was performed to identify patient demographics, operative details, and patient and flap complications. In cases where the ECA was used, the artery was traced distally with care taken to protect the hypoglossal nerve. The ECA was then divided and transposed toward the lower neck, where an end-to-end microvascular anastomosis was performed. The recipient artery used for head and neck microsurgery was available for 176 flaps; the facial (n = 127, 72.2%) and external carotid (n = 19, 10.8%) arteries were most commonly used. There were no flap thromboses in the ECA group, compared with three flap thromboses with other recipient arteries (P = 1.00). No cases of first bite syndrome or hypoglossal nerve injury were identified. The ECA may be transposed toward the lower neck and used for end-to-end microvascular anastomosis without complication of hypoglossal nerve injury or first bite syndrome. This method may be considered an alternative in patients with limited recipient vessel options for head and neck microsurgery. © 2015 Wiley Periodicals, Inc. Microsurgery 37:96-100, 2017.

  16. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and the dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be looked at to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end-user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare the use of several different algorithms that we have implemented to detect persistent network problems using anomalous variation analysis in real end-to-end Internet performance measurements. We also provide methods and/or guidance for how to set the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our
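    As a hedged sketch of the general idea (not one of the paper's algorithms; parameters and the synthetic series are assumptions), an adaptive detector can flag persistent changes without fixed static thresholds by comparing each point against an exponentially weighted running mean and variance:

        # Sketch: EWMA-based anomaly flagging for a throughput time series.
        import numpy as np

        def ewma_anomalies(series, alpha=0.1, k=3.0, warmup=20):
            """Flag points deviating more than k adaptive std-devs from the EWMA."""
            series = np.asarray(series, dtype=float)
            mean, var = series[0], 1e-6
            flags = []
            for i, x in enumerate(series):
                std = np.sqrt(var)
                flags.append(i >= warmup and abs(x - mean) > k * std)
                diff = x - mean                  # update running stats after the test
                mean += alpha * diff
                var = (1 - alpha) * (var + alpha * diff * diff)
            return np.flatnonzero(flags)

        rng = np.random.default_rng(1)
        throughput = np.r_[rng.normal(800, 30, 200), rng.normal(400, 30, 50)]
        print(ewma_anomalies(throughput)[:5])    # indices near the simulated drop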

  17. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor]

    Science.gov (United States)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote-space-based sensor, an end-to-end approach to the design of information systems has been adopted at the JPL. This paper reviews End-to-End Information System (EEIS) activity at the JPL, with attention given to the scope of the EEIS transfer function, and functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  18. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on a pilot's vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify the system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear of intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot

  19. A novel end-to-end classifier using domain transferred deep convolutional neural networks for biomedical images.

    Science.gov (United States)

    Pang, Shuchao; Yu, Zhezhou; Orgun, Mehmet A

    2017-03-01

    Highly accurate classification of biomedical images is an essential task in the clinical diagnosis of the numerous diseases identified from those images. Traditional image classification methods, which combine hand-crafted image feature descriptors with various classifiers, are not able to effectively improve the accuracy rate and meet the high requirements of biomedical image classification. The same holds true for artificial neural network models that are either trained directly with limited biomedical images as training data or used directly as a black box to extract deep features based on another, distant dataset. In this study, we propose a highly reliable and accurate end-to-end classifier for all kinds of biomedical images via deep learning and transfer learning. We first apply a domain-transferred deep convolutional neural network to build a deep model, and then develop an overall deep learning architecture based on the raw pixels of the original biomedical images using supervised training. In our model, we do not need to manually design the feature space, seek an effective feature-vector classifier, or segment specific detection objects and image patches, which are the main technological difficulties in the adoption of traditional image classification methods. Moreover, we do not need to be concerned with whether there are large training sets of annotated biomedical images, affordable parallel computing resources featuring GPUs, or long waits for training a perfect deep model, which are the main problems in training deep neural networks for biomedical image classification as observed in recent works. With the utilization of a simple data augmentation method and fast convergence speed, our algorithm can achieve the best accuracy rate and outstanding classification ability for biomedical images. We have evaluated our classifier on several well-known public biomedical datasets and compared it with several state-of-the-art approaches. We propose a robust
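    A minimal sketch of the domain-transfer idea (not the authors' model; the backbone choice, class count, and hyperparameters are assumptions, and the torchvision >= 0.13 weights API is assumed to be available):

        # Sketch: fine-tune an ImageNet-pretrained CNN on biomedical images.
        import torch
        import torch.nn as nn
        from torchvision import models

        num_classes = 5                                    # assumed category count
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        for p in model.parameters():                       # freeze transferred features
            p.requires_grad = False
        model.fc = nn.Linear(model.fc.in_features, num_classes)   # new classifier head

        optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
        criterion = nn.CrossEntropyLoss()

        def train_step(images, labels):
            """One supervised step on a batch of raw-pixel biomedical images."""
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            return loss.item()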

  20. SU-F-J-177: A Novel Image Analysis Technique (center Pixel Method) to Quantify End-To-End Tests

    Energy Technology Data Exchange (ETDEWEB)

    Wen, N; Chetty, I [Henry Ford Health System, Detroit, MI (United States); Snyder, K [Henry Ford Hospital System, Detroit, MI (United States); Scheib, S [Varian Medical System, Barton (Switzerland); Qin, Y; Li, H [Henry Ford Health System, Detroit, Michigan (United States)

    2016-06-15

    Purpose: To implement a novel image analysis technique, the "center pixel method", to quantify the end-to-end test accuracy of a frameless, image-guided stereotactic radiosurgery system. Methods: The localization accuracy was determined by delivering radiation to an end-to-end prototype phantom. The phantom was scanned with 0.8 mm slice thickness. The treatment isocenter was placed at the center of the phantom. In the treatment room, CBCT images of the phantom (kVp = 77, mAs = 1022, slice thickness 1 mm) were acquired and registered to the reference CT images. 6D couch corrections were applied based on the registration results. Electronic Portal Imaging Device (EPID)-based Winston-Lutz (WL) tests were performed to quantify the targeting accuracy of the system at 15 combinations of gantry, collimator, and couch positions. The images were analyzed using two different methods. a) The classic method: the deviation was calculated by measuring the radial distance between the center of the central BB and the full width at half maximum of the radiation field. b) The center pixel method: since the imager projection offset from the treatment isocenter was known from the IsoCal calibration, the deviation was determined between the center of the BB and the central pixel of the imager panel. Results: Using the automatic registration method to localize the phantom and the classic method of measuring the deviation of the BB center, the mean and standard deviation of the radial distance were 0.44 ± 0.25, 0.47 ± 0.26, and 0.43 ± 0.13 mm for the jaw-, MLC-, and cone-defined field sizes, respectively. When the center pixel method was used, the mean and standard deviation were 0.32 ± 0.18, 0.32 ± 0.17, and 0.32 ± 0.19 mm, respectively. Conclusion: Our results demonstrate that the center pixel method accurately analyzes the WL images to evaluate the targeting accuracy of the radiosurgery system. The work was supported by a Research Scholar Grant, RSG-15-137-01-CCE from the American
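    A hedged sketch of the geometry behind the "center pixel" idea (not the authors' analysis code; the BB-detection step, pixel pitch, and calibration offset below are placeholders):

        # Sketch: targeting deviation as the distance between the detected BB
        # centroid and the imager's central pixel, shifted by the calibrated
        # projection offset of the isocenter, converted to millimetres.
        import numpy as np

        def center_pixel_deviation(epid_image, pixel_pitch_mm=0.392,
                                   iso_offset_px=(0.0, 0.0)):
            img = np.asarray(epid_image, dtype=float)
            weights = img.max() - img            # toy centroid: assumes the BB is dark
            rows, cols = np.indices(img.shape)
            bb_row = (weights * rows).sum() / weights.sum()
            bb_col = (weights * cols).sum() / weights.sum()
            ctr_row = (img.shape[0] - 1) / 2 + iso_offset_px[0]
            ctr_col = (img.shape[1] - 1) / 2 + iso_offset_px[1]
            radial_px = np.hypot(bb_row - ctr_row, bb_col - ctr_col)
            return radial_px * pixel_pitch_mm    # radial deviation in mm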

  1. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  2. West Coast fish, mammal, and bird life history and abundance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  3. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  4. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

    Under spectrum-sharing constraints, we consider a secondary link that exploits cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We obtain the optimal boundary points, in closed form, for choosing the AMC transmission modes by taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, without any cost in transmitter/receiver design or end-to-end delay, the scheme with aggressive AMC outperforms that with conventional AMC. The main reason is that, with aggressive AMC, the different transmission modes utilized in the initial packet transmission and the following retransmissions match the time-varying channel conditions better than the basic pattern does. © 2012 IEEE.
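    As a small illustration of how closed-form boundary points are used at run time (the boundary values and spectral efficiencies below are invented placeholders, not the paper's results):

        # Sketch: pick the AMC transmission mode whose SNR interval contains the
        # currently estimated SNR; mode 0 means "no transmission".
        import bisect

        BOUNDARIES_DB = [3.0, 7.0, 12.0, 18.0]        # assumed mode switching points
        SPECTRAL_EFF = [0.0, 0.5, 1.0, 2.0, 3.0]      # assumed bits/s/Hz per mode

        def select_mode(snr_db):
            """Return (mode index, spectral efficiency) for the measured SNR."""
            mode = bisect.bisect_right(BOUNDARIES_DB, snr_db)
            return mode, SPECTRAL_EFF[mode]

        print(select_mode(10.0))                      # -> (2, 1.0) with these values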

  5. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

    Convolutional Neural Networks (CNNs) have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed performance comparable to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, considering that the sample-level model requires much longer training time, we progressively downsample the input signals and examine how this affects the performance. Second, we extend the model using a multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize the filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
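    A toy sketch in the spirit of a sample-level CNN (layer count, width, and filter settings are illustrative, not the published SampleCNN architecture): small size-3 filters with stride-3 downsampling applied directly to the raw waveform, followed by global pooling and a tagging head.

        import torch
        import torch.nn as nn

        class TinySampleCNN(nn.Module):
            def __init__(self, n_tags=50, width=64, n_blocks=6):
                super().__init__()
                blocks, in_ch = [], 1
                for _ in range(n_blocks):
                    blocks += [nn.Conv1d(in_ch, width, kernel_size=3, stride=3),
                               nn.BatchNorm1d(width), nn.ReLU()]
                    in_ch = width
                self.features = nn.Sequential(*blocks)
                self.head = nn.Linear(width, n_tags)

            def forward(self, wave):                 # wave: (batch, 1, n_samples)
                h = self.features(wave)
                h = h.mean(dim=-1)                   # global average pool over time
                return self.head(h)

        model = TinySampleCNN()
        print(model(torch.randn(2, 1, 3 ** 9)).shape)    # -> torch.Size([2, 50])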

  6. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    Science.gov (United States)

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. Copyright © 2015. Published by Elsevier GmbH.

  7. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  8. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    Science.gov (United States)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

    In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to a direction in supervised mode. The images in the data sets are collected in a wide variety of weather and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments: line tracking and obstacle avoidance. The line tracking experiment is conducted in order to track a desired path composed of straight and curved lines. The goal of the obstacle avoidance experiment is to avoid obstacles indoors. Finally, we obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and accurately avoid obstacles in the room. The results confirm the effectiveness of the algorithm and of our improvements to the network structure and training parameters.
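    The augmentation step mentioned above can be illustrated with a short sketch (noise levels and image size are assumptions, not the values used in the paper):

        # Sketch: Gaussian and salt-and-pepper noise augmentation for training images.
        import numpy as np

        rng = np.random.default_rng(42)

        def add_gaussian_noise(img, sigma=10.0):
            noisy = img.astype(float) + rng.normal(0.0, sigma, img.shape)
            return np.clip(noisy, 0, 255).astype(np.uint8)

        def add_salt_and_pepper(img, amount=0.02):
            noisy = img.copy()
            mask = rng.random(img.shape[:2])
            noisy[mask < amount / 2] = 0             # pepper
            noisy[mask > 1 - amount / 2] = 255       # salt
            return noisy

        image = rng.integers(0, 256, (120, 160, 3), dtype=np.uint8)
        augmented = [add_gaussian_noise(image), add_salt_and_pepper(image)]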

  9. Towards a cross-platform software framework to support end-to-end hydrometeorological sensor network deployment

    Science.gov (United States)

    Celicourt, P.; Sam, R.; Piasecki, M.

    2016-12-01

    Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and expensive task. These factors demonstrate why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise, and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting, and sharing sensor data. We are developing a cross-platform software framework using the Python programming language that will allow us to develop a low-cost end-to-end (from sensor to publication) system for hydrometeorological condition monitoring. The software framework provides for the description of sensors, sensor platforms, calibrations, and network protocols, as well as sensor programming, data storage, data publication and visualization, and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.
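    A minimal sketch of the kind of unit-aware sensor description and data storage the framework aims at (class names, fields, and the conversion table are assumptions, not the framework's actual API):

        from dataclasses import dataclass, field
        from datetime import datetime, timezone

        UNIT_FACTORS = {("mm", "mm"): 1.0, ("mm", "in"): 1 / 25.4}   # assumed table

        @dataclass
        class Sensor:
            name: str
            variable: str                 # e.g. "precipitation"
            unit: str                     # unit the sensor reports in
            observations: list = field(default_factory=list)

            def record(self, value, timestamp=None):
                ts = timestamp or datetime.now(timezone.utc)
                self.observations.append((ts, value))

            def values_in(self, unit):
                """Return observations converted to the requested unit."""
                factor = UNIT_FACTORS[(self.unit, unit)]
                return [(ts, v * factor) for ts, v in self.observations]

        rain_gauge = Sensor("tipping_bucket_01", "precipitation", "mm")
        rain_gauge.record(1.2)
        print(rain_gauge.values_in("in"))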

  10. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

    Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated whether it also improved the rate and speed of physician intervention in response to CLR and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of the CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and cases where manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the rate (100% vs 84.2%) and speed of both acknowledgement and intervention. Within the automation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect the physician intervention rate, implying that an end-to-end pathway was more important than automation alone.

  11. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    Science.gov (United States)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.
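    As a hedged illustration of one of the representation switches discussed above (the pinhole intrinsics are assumed values; this is not the system's code), a depth image can be back-projected to a 3-D point cloud:

        import numpy as np

        def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
            """depth: (H, W) array in metres; returns (N, 3) points, zero depth dropped."""
            h, w = depth.shape
            v, u = np.indices((h, w))
            z = depth
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
            return pts[pts[:, 2] > 0]

        cloud = depth_to_points(np.random.uniform(0.5, 3.0, (480, 640)))
        print(cloud.shape)                     # (N, 3), here N = 480 * 640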

  12. End-to-End Data Rate Performance of Decode-and-Forward Relaying with Different Resource Allocation Schemes

    Directory of Open Access Journals (Sweden)

    Inam Ullah

    2017-01-01

    This paper studies the end-to-end (e2e) data rate of dual-hop Decode-and-Forward (DF) infrastructure relaying under different resource allocation schemes. In this context, we first provide a comparative analysis of the optimal resource allocation scheme with respect to several other approaches in order to provide insights into the system behavior and show the benefits of each alternative. Then, assuming the optimal resource allocation, a closed-form expression for the distribution of the mean and outage data rates is derived. It turns out that the corresponding mean e2e data rate formula attains an expression in terms of an integral that does not admit a closed-form solution. Therefore, a tight lower-bound formula for the mean e2e data rate is presented. The results can be used to select the most convenient resource allocation scheme and to perform link dimensioning in the network planning phase, showing the explicit relationships that exist between component link bandwidths, SNR values, and mean data rate.
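    For orientation, a generic dual-hop DF relation (background, not the paper's closed-form result; B_i and \gamma_i denote the bandwidth and SNR assumed for hop i):

    \[ R_{\mathrm{e2e}} \;=\; \min\big\{\, B_1 \log_2(1 + \gamma_1),\; B_2 \log_2(1 + \gamma_2) \,\big\}, \]

    so the end-to-end rate is set by the weaker hop, which is why allocation schemes that tend to balance the two hop rates perform well.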

  13. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    The aim of this work is to analyze the available possibilities for improving secure messaging with end-to-end connections under the conditions of external violator actions and a distrusted service provider. We made a comparative analysis of the cryptographic security mechanisms of two widely used messengers: Telegram and WhatsApp. It was found that Telegram is based on the MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific features of the messengers' implementations associated with random number generation on the most popular mobile platform, Android. It was shown that Signal has better security properties. It is used in several other popular messengers such as TextSecure, RedPhone, Google Allo, Facebook Messenger, and Signal, along with WhatsApp. A number of possible attacks on both messengers were analyzed in detail. In particular, we demonstrate that metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  14. Effects of collagen membranes enriched with in vitro-differentiated N1E-115 cells on rat sciatic nerve regeneration after end-to-end repair

    Directory of Open Access Journals (Sweden)

    Fornaro Michele

    2010-02-01

    Peripheral nerves possess the capacity for self-regeneration after traumatic injury, but the extent of regeneration is often poor and may benefit from exogenous factors that enhance growth. The use of cellular systems is a rational approach for delivering neurotrophic factors at the nerve lesion site, and in the present study we investigated the effects of enwrapping the site of end-to-end rat sciatic nerve repair with an equine type III collagen membrane enriched or not with N1E-115 pre-differentiated neural cells. After neurotmesis, the sciatic nerve was repaired by end-to-end suture (End-to-End group), end-to-end suture enwrapped with an equine collagen type III membrane (End-to-EndMemb group), or end-to-end suture enwrapped with an equine collagen type III membrane previously covered with neural cells pre-differentiated in vitro from N1E-115 cells (End-to-EndMembCell group). During the postoperative period, motor and sensory functional recovery was evaluated using extensor postural thrust (EPT), withdrawal reflex latency (WRL), and ankle kinematics. After 20 weeks the animals were sacrificed and the repaired sciatic nerves were processed for histological and stereological analysis. The results showed that enwrapment of the repair site with a collagen membrane, with or without neural cell enrichment, did not lead to any significant improvement in most of the functional and stereological predictors of nerve regeneration that we assessed, with the exception of EPT, which recovered significantly better when the neural cell-enriched membrane was employed. It can thus be concluded that this particular type of nerve tissue engineering approach has very limited effects on nerve regeneration after sciatic end-to-end nerve reconstruction in the rat.

  15. Adaptation and validation of a commercial head phantom for cranial radiosurgery dosimetry end-to-end audit.

    Science.gov (United States)

    Dimitriadis, Alexis; Palmer, Antony L; Thomas, Russell A S; Nisbet, Andrew; Clark, Catharine H

    2017-06-01

    To adapt and validate an anthropomorphic head phantom for use in a cranial radiosurgery audit. Two bespoke inserts were produced for the phantom: one for providing the target and organ at risk for delineation and the other for performing dose measurements. The inserts were tested to assess their positional accuracy. A basic treatment plan dose verification with an ionization chamber was performed to establish a baseline accuracy for the phantom and beam model. The phantom and inserts were then used to perform dose verification measurements of a radiosurgery plan. The dose was measured with alanine pellets, EBT extended dose film and a plastic scintillation detector (PSD). Both inserts showed reproducible positioning (±0.5 mm) and good positional agreement between them (±0.6 mm). The basic treatment plan measurements showed agreement with the treatment planning system (TPS) within 0.5%. Repeated film measurements showed consistent gamma passing rates with good agreement to the TPS. For 2%-2 mm global gamma, the mean passing rate was 96.7% and the variation in passing rates did not exceed 2.1%. The alanine pellets and PSD showed good agreement with the TPS (-0.1% and 0.3% dose difference in the target) and good agreement with each other (within 1%). The adaptations to the phantom showed acceptable accuracies. The presence of alanine and PSD does not affect film measurements significantly, enabling simultaneous measurements by all three detectors. Advances in knowledge: A novel method for a thorough end-to-end test of radiosurgery, with the capability to incorporate all steps of the clinical pathway in a time-efficient and reproducible manner, suitable for a national audit.
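    For readers unfamiliar with the 2%/2 mm criterion quoted above, a brute-force global gamma index for 1-D dose profiles can be sketched as follows (illustration only, not the audit's analysis software; the toy profiles are invented):

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.02, dta_mm=2.0):
            """Gamma value at each reference point, global dose normalisation."""
            d_norm = dd * d_ref.max()                   # 2% of the global maximum
            dist = (x_ref[:, None] - x_eval[None, :]) / dta_mm
            dose = (d_ref[:, None] - d_eval[None, :]) / d_norm
            return np.sqrt(dist ** 2 + dose ** 2).min(axis=1)

        x = np.linspace(-30, 30, 241)
        reference = np.exp(-(x / 15.0) ** 2)            # toy dose profile
        evaluated = 1.01 * np.exp(-((x - 0.5) / 15.0) ** 2)
        gamma = gamma_1d(x, reference, x, evaluated)
        print(f"passing rate: {100 * np.mean(gamma <= 1):.1f}%")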

  16. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.

  17. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Science.gov (United States)

    Proschek, V.; Kirchengast, G.; Schweitzer, S.

    2011-10-01

    Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect, a new satellite mission concept adding an infrared-laser part to the already well-studied microwave occultation technique exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables the retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling is not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points to the climate benchmarking capability of the LMIO
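    As generic background for the infrared-laser step (a standard differential-absorption relation under the Beer-Lambert law, not the retrieval algorithm itself; T_on and T_off are the transmissions measured on and off an absorption line of the target gas, n_GHG its number density, and \sigma the absorption cross sections along the occultation path s):

    \[ \ln\frac{T_{\mathrm{off}}}{T_{\mathrm{on}}} \;=\; \int n_{\mathrm{GHG}}(s)\,\big[\sigma_{\mathrm{on}}(s) - \sigma_{\mathrm{off}}(s)\big]\,\mathrm{d}s, \]

    and such slant-path absorptions can then be inverted to vertical profiles using the pressure, temperature, and altitude information retrieved from the microwave signals, which is the synergy referred to in the abstract.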

  18. End-to-end renal vein anastomosis to preserve renal venous drainage following inferior vena cava radical resection due to leiomyosarcoma.

    Science.gov (United States)

    Araujo, Raphael L C; Gaujoux, Sébastien; D'Albuquerque, Luiz Augusto Carneiro; Sauvanet, Alain; Belghiti, Jacques; Andraus, Wellington

    2014-05-01

    When retrohepatic inferior vena cava (IVC) resection is required, for example for IVC leiomyosarcoma, reconstruction is recommended, particularly when the renal vein confluence is resected, in order to preserve venous outflow, including that of the right kidney. Two patients with retrohepatic IVC leiomyosarcoma involving the renal vein confluence underwent hepatectomy with en bloc IVC resection below the renal vein confluence. IVC reconstruction was not performed, but end-to-end renal vein anastomoses were, including a prosthetic graft in 1 case. The postoperative course was uneventful with respect to kidney function and anastomosis patency (assessed using Doppler ultrasonography and computerized tomography), with only transient lower limb edema. End-to-end renal vein anastomosis after a retrohepatic IVC resection including the renal vein confluence should be considered as an alternative option for preserving right kidney drainage through the left renal vein when IVC reconstruction is not possible or should be avoided. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g., EU BSAP, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of the eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking...

  20. An Integrated End-to-End Modeling Framework for Testing Ecosystem-Wide Effects of Human-Induced Pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Maar, Marie; Nielsen, Rasmus

    We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g., EU BSAP, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of the eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking...

  1. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  2. One stage functional end-to-end stapled intestinal anastomosis and resection performed by nonexpert surgeons for the treatment of small intestinal obstruction in 30 dogs.

    Science.gov (United States)

    Jardel, Nicolas; Hidalgo, Antoine; Leperlier, Dimitri; Manassero, Mathieu; Gomes, Aymeric; Bedu, Anne Sophie; Moissonnier, Pierre; Fayolle, Pascal; Begon, Dominique; Riquois, Elisabeth; Viateau, Véronique

    2011-02-01

    To describe stapled 1-stage functional end-to-end intestinal anastomosis for the treatment of small intestinal obstruction in dogs and to evaluate outcome when the technique is performed by nonexpert surgeons after limited training in the technique. Case series. Dogs (n=30) with intestinal lesions requiring an enterectomy. Stapled 1-stage functional end-to-end anastomosis and resection using GIA-60 and TA-55 stapling devices were performed under the supervision of senior residents and faculty surgeons by junior surgeons previously trained in the technique on pigs. Procedure duration and technical problems were recorded. Short-term results were collected during hospitalization and at suture removal. Long-term outcome was established by clinical and ultrasonographic examinations at least 2 months after surgery and from written questionnaires completed by owners. Mean ± SD procedure duration was 15 ± 12 minutes. Postoperative recovery was uneventful in 25 dogs. One dog had anastomotic leakage, 1 had a localized abscess at the transverse staple line, and 3 dogs developed an incisional abdominal wall abscess. No long-term complications occurred (follow-up, 2-32 months). Stapled 1-stage functional end-to-end anastomosis and resection is a fast and safe procedure in the hands of nonexpert but trained surgeons. © Copyright 2011 by The American College of Veterinary Surgeons.

  3. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy: An experimental study in the rat forelimb model.

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-10-15

    The need for continued research into new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization, in the rat hindlimb model can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in the case of end-to-end neurorrhaphy) and almost no motor recovery (in the case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by the flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function.

  4. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [ORNL]; Fugate, David L [ORNL]; Cetiner, Sacit M [ORNL]; Qualls, A L [ORNL]

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  5. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    Science.gov (United States)

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since the patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic to handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based features vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries on a dataset of 41 volumetric CT scans from the 2014 Image CLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without/with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather that on disease diagnosis, may help radiologists to better diagnose liver lesions.
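    A hedged sketch of the retrieval step described above (the bin count, HU window, and L1 distance are assumptions, not the published feature design):

        # Sketch: histogram features from the segmented liver and similarity ranking.
        import numpy as np

        def liver_histogram(ct_hu, liver_mask, bins=32, hu_range=(-100, 300)):
            hist, _ = np.histogram(ct_hu[liver_mask], bins=bins, range=hu_range)
            return hist / max(hist.sum(), 1)            # normalised feature vector

        def rank_database(query_feat, db_feats, weights=None):
            """Return database indices sorted by weighted L1 distance to the query."""
            w = np.ones_like(query_feat) if weights is None else weights
            dists = [np.sum(w * np.abs(query_feat - f)) for f in db_feats]
            return np.argsort(dists)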

  6. Experience of using MOSFET detectors for dose verification measurements in an end-to-end 192Ir brachytherapy quality assurance system.

    Science.gov (United States)

    Persson, Maria; Nilsson, Josef; Carlsson Tedgren, Åsa

    2017-10-27

    Establishment of an end-to-end system for the brachytherapy (BT) dosimetric chain could be valuable in clinical quality assurance. Here, the development of such a system using MOSFET (metal oxide semiconductor field effect transistor) detectors and the experience gained during 2 years of use are reported, with a focus on the performance of the MOSFET detectors. A bolus phantom was constructed with two implants, mimicking prostate and head & neck treatments, using steel needles and plastic catheters to guide the 192Ir source and house the MOSFET detectors. The phantom was taken through the BT treatment chain from image acquisition to dose evaluation. During the 2-year evaluation period, delivered doses were verified a total of 56 times using MOSFET detectors which had been calibrated in an external 60Co beam. An initial experimental investigation of beam quality differences between 192Ir and 60Co is reported. The standard deviation in repeated MOSFET measurements was below 3% in the six measurement points with dose levels above 2 Gy. MOSFET measurements overestimated treatment planning system doses by 2-7%. Distance-dependent experimental beam quality correction factors, derived in a phantom of similar size to that used for the end-to-end tests, improved the agreement when applied to a time-resolved measurement. MOSFET detectors provide values stable over time and function well as detectors for end-to-end quality assurance purposes in 192Ir BT. Beam quality correction factors should address not only distance from the source but also phantom dimensions. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  7. Interoperable End-to-End Remote Patient Monitoring Platform based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2017-08-07

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for Personal Health Devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited processor, memory and power resources and that use short-range wireless technology. It explains aspects of IEEE 11073, including the Domain Information Model, state model and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7 and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living (AAL) in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  8. Effective link quality estimation as a means to improved end-to-end packet delivery in high traffic mobile ad hoc networks

    Directory of Open Access Journals (Sweden)

    Syed Rehan Afzal

    2017-08-01

    Full Text Available Accurate link quality estimation is a fundamental building block in quality-aware multi-hop routing. In an inherently lossy, unreliable and dynamic medium such as wireless, the task of accurate estimation becomes very challenging. Over the years ETX has been widely used as a reliable link quality estimation metric. However, more recently it has been established that under heavy traffic loads ETX performance gets significantly worse. We examine the ETX metric's behavior in detail with respect to the MAC layer and UDP data, and identify the causes of its unreliability. Motivated by the observations made in our analysis, we present the design and implementation of our link quality measurement metric xDDR – a variation of ETX. This article extends xDDR to support network mobility. Our experiments show that xDDR substantially outperforms minimum hop count, ETX and HETX in terms of end-to-end packet delivery ratio in static as well as mobile scenarios.
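
    For reference, a minimal sketch of the classic ETX link metric that the article builds on, assuming delivery ratios estimated from periodic probe counts (the window size and probe counts are illustrative; xDDR itself is not reproduced here):

```python
def link_etx(fwd_received, rev_received, window):
    """Classic ETX: 1 / (d_f * d_r), with d_f and d_r the forward and reverse
    probe delivery ratios measured over a probing window."""
    d_f, d_r = fwd_received / window, rev_received / window
    return float("inf") if d_f == 0 or d_r == 0 else 1.0 / (d_f * d_r)

def route_etx(per_link_probe_counts, window=10):
    """The route metric is the sum of the link ETX values along the path."""
    return sum(link_etx(f, r, window) for f, r in per_link_probe_counts)

# Hypothetical 3-hop path: (forward, reverse) probes received out of 10
print(route_etx([(9, 10), (7, 8), (10, 10)]))
```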

  9. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    Science.gov (United States)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same resolution spatial grid, and were all solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed memory parallel computer, both of which created challenging but resolvable bookkeeping issues. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. An historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Niño events and higher for the strong 1999 La Niña event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid
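
    As a pointer to the NPZ building block mentioned above, a toy forward-Euler sketch of a generic nutrient-phytoplankton-zooplankton model; the functional forms and parameter values are textbook placeholders, not those of the ROMS-coupled submodel described in the paper:

```python
def npz_step(N, P, Z, dt=0.1, vm=1.0, ks=0.5, g=0.4, kp=0.3,
             mp=0.05, mz=0.05, beta=0.7):
    """One explicit Euler step of a simple nitrogen-conserving NPZ model."""
    uptake = vm * N / (ks + N) * P           # nutrient-limited phytoplankton growth
    grazing = g * P / (kp + P) * Z           # zooplankton grazing on phytoplankton
    dN = -uptake + mp * P + mz * Z + (1.0 - beta) * grazing
    dP = uptake - grazing - mp * P
    dZ = beta * grazing - mz * Z
    return N + dt * dN, P + dt * dP, Z + dt * dZ

# Spin the toy model forward from arbitrary initial concentrations
state = (1.0, 0.2, 0.1)
for _ in range(1000):
    state = npz_step(*state)
print(state)   # total nitrogen N + P + Z is conserved by construction
```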

  10. Application of Modified Direct Denitration to Support the ORNL Coupled-End-to-End Demonstration in Production of Mixed Oxides Suitable for Pellet Fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Elisabeth A [ORNL]; Vedder, Raymond James [ORNL]; Felker, Leslie Kevin [ORNL]; Marschman, Steve [ORNL]

    2007-01-01

    The current and future development of the Modified Direct Denitration (MDD) process is in support of Oak Ridge National Laboratory's (ORNL) Coupled End-to-End (CETE) research, development, and demonstration (R&D) of proposed advanced fuel reprocessing and fuel fabrication processes. This work will involve the co-conversion of the U/Pu/Np product streams from the UREX+3 separation flow sheet utilizing the existing MDD glove-box setup and the in-cell co-conversion of the U/Pu/Np/Am/Cm product streams from the UREX+1a flow sheet. Characterization equipment is being procured and installed. Oxide powder studies are being done on calcination/reduction variables, as well as pressing and sintering of pellets to permit metallographic examinations.

  11. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    Energy Technology Data Exchange (ETDEWEB)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R [Vidt Centro Medico, Ciudad Autonoma de Buenos Aires (Argentina)

    2016-06-15

    Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment, in order to identify and correct possible sources of deviation, and to establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy process (imaging, planning and delivery) was designed. The test includes analysis of point dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber was used with a Keithley 35617EBS electrometer for point dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and a Varian aS1000 EPID. Gamma analysis was performed to evaluate 2D dose distribution agreement using MapCheck software and the Varian Portal Dosimetry application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% (SD 0.93%), 0.507% (SD 0.82%), 0.246% (SD 1.39%) and 0.012% (SD 0.01%) for LoX-3DCRT, HiX-3DCRT, IMRT and RapidArc plans, respectively. Planar doses passed gamma 3%/3mm in all cases and 2%/2mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose and evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical
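
    A simplified sketch of the kind of gamma-index evaluation used for the planar dose comparison; the global normalization, brute-force search and 10% low-dose cutoff are assumptions for illustration, and clinical tools such as MapCheck implement refinements not shown here:

```python
import numpy as np

def gamma_passing_rate(measured, calculated, spacing_mm=1.0,
                       dose_tol=0.03, dta_mm=3.0, low_dose_cutoff=0.10):
    """Global 2D gamma analysis: percentage of evaluated measured points
    whose minimum gamma value over the calculated grid is <= 1."""
    dmax = calculated.max()
    ny, nx = measured.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    passed = evaluated = 0
    for i in range(ny):
        for j in range(nx):
            dm = measured[i, j]
            if dm < low_dose_cutoff * dmax:        # skip the low-dose region
                continue
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            dose2 = ((calculated - dm) / (dose_tol * dmax)) ** 2
            gamma2 = dist2 / dta_mm ** 2 + dose2
            evaluated += 1
            passed += int(gamma2.min() <= 1.0)
    return 100.0 * passed / evaluated

# Identical distributions should pass at 100%
dose = np.random.rand(30, 30) * 2.0
print(gamma_passing_rate(dose, dose.copy()))
```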

  12. Ecosystem limits to food web fluxes and fisheries yields in the North Sea simulated with an end-to-end food web model

    Science.gov (United States)

    Heath, Michael R.

    2012-09-01

    Equilibrium yields from an exploited fish stock represent the surplus production remaining after accounting for losses due to predation. However, most estimates of maximum sustainable yield, upon which fisheries management targets are partly based, assume that productivity and predation rates are constant in time or at least stationary. This means that there is no recognition of the potential for interaction between different fishing sectors. Here, an end-to-end ecosystem model is developed to explore the possible scale and mechanisms of interactions between pelagic and demersal fishing in the North Sea. The model simulates fluxes of nitrogen between detritus, inorganic nutrient and guilds of taxa spanning phytoplankton to mammals. The structure strikes a balance between graininess in space, taxonomy and demography, and the need to constrain the parameter count sufficiently to enable automatic parameter optimization. Simulated annealing is used to locate the maximum likelihood parameter set, given the model structure and a suite of observations of annual rates of production and fluxes between guilds. Simulations of the impact of fishery harvesting rates showed that equilibrium yields of pelagic and demersal fish were strongly interrelated due to a variety of top-down and bottom-up food web interactions. The results clearly show that management goals based on simultaneously achieving maximum sustainable biomass yields from all commercial fish stocks are simply unattainable. Trade-offs between, for example, pelagic and demersal fishery sectors and other properties of the ecosystem have to be considered in devising an overall harvesting strategy.
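
    As an illustration of the calibration step described above, a generic simulated-annealing loop for locating a maximum-likelihood parameter set; the objective, step size and cooling schedule are placeholders rather than the paper's likelihood model:

```python
import math, random

def anneal(neg_log_likelihood, x0, step=0.1, t_start=1.0, t_end=1e-3,
           cooling=0.95, iters_per_t=50):
    """Minimize a negative log-likelihood by Gaussian perturbations, accepting
    uphill moves with a temperature-controlled Metropolis probability."""
    x, fx = list(x0), neg_log_likelihood(x0)
    best_x, best_f = list(x), fx
    t = t_start
    while t > t_end:
        for _ in range(iters_per_t):
            cand = [xi + random.gauss(0.0, step) for xi in x]
            fc = neg_log_likelihood(cand)
            if fc < fx or random.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
                if fx < best_f:
                    best_x, best_f = list(x), fx
        t *= cooling
    return best_x, best_f

# Placeholder objective (quadratic misfit) standing in for the model likelihood
print(anneal(lambda p: sum((pi - 2.0) ** 2 for pi in p), [0.0, 0.0]))
```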

  13. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. To evaluate the performance of the proposed scheme, we study it under transmission constraints imposed on the transmit power budget and on the interference caused to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for the partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
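
    The selection rule itself can be sketched in a few lines; the relay attributes and QoS threshold below are invented for illustration, and the paper's closed-form PLR analysis is not reproduced:

```python
def select_relay(relays, plr_qos_threshold):
    """Partial QoS-aware opportunistic selection: among relays whose first-hop
    packet loss rate meets the QoS target, activate the one with the largest
    second-hop SNR; return None if no relay qualifies."""
    qualified = [r for r in relays if r["first_hop_plr"] <= plr_qos_threshold]
    return max(qualified, key=lambda r: r["second_hop_snr_db"], default=None)

# Hypothetical candidate relays
relays = [
    {"id": "R1", "first_hop_plr": 0.02, "second_hop_snr_db": 14.0},
    {"id": "R2", "first_hop_plr": 0.10, "second_hop_snr_db": 19.0},
    {"id": "R3", "first_hop_plr": 0.01, "second_hop_snr_db": 16.5},
]
print(select_relay(relays, plr_qos_threshold=0.05)["id"])   # R2 fails the QoS check
```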

  14. Evaluation of a composite Gel-Alanine phantom on an end-to-end test to treat multiple brain metastases by a single isocenter VMAT technique.

    Science.gov (United States)

    Pavoni, Juliana Fernandes; Neves-Junior, Wellington Furtado Pimenta; da Silveira, Matheus Antonio; Haddad, Cecília Maria Kalil; Baffa, Oswaldo

    2017-09-01

    This work aims to evaluate the application of a cylindrical phantom made of dosimetric gel containing alanine pellets distributed inside the gel volume during an end-to-end test of a single isocenter VMAT for simultaneous treatment of multiple brain metastases. The evaluation is based on comparing the results obtained with the composite phantom against the treatment planning system (TPS) dose distribution, which was validated using conventional clinical quality control with point and planar dose measurements. A cylindrical MAGIC-f gel phantom containing alanine dosimeters (composite phantom) was used to design the VMAT plan in the TPS. The alanine dosimeters were pellets with a radius of 2.5 mm and a height of 3 mm, and played the role of brain metastases inside the gel cylinder, which simulated the cerebral tissue. Five of the alanine dosimeters were selected to simulate five lesions; five planning target volumes (PTVs) were created including the dosimeters and irradiated with different doses. Conventional quality assurance (QA) was performed on the TPS plan and on the composite phantom; a phantom containing only gel (Gel 1 phantom) was also irradiated. One day after irradiation, magnetic resonance images were acquired for both phantoms on a 3T scanner. An electron spin resonance spectrometer was used to evaluate alanine doses. Calibration curves were constructed for the alanine and the gel dosimeters. The gel-only measurement was repeated in full (Gel 2 phantom) in order to confirm the previous gel measurement. The VMAT treatment plan was approved by the conventional QA. The doses measured by alanine dosimeters on the composite gel phantom agreed with the TPS on average within 3.3%. The alanine dose for each lesion was used to calibrate the gel dosimeter measurements of the concerned PTV. Both gel dose volume histograms (DVH) achieved for each PTV were in agreement with the expected TPS DVH, except for a small discrepancy observed for the Gel 2

  15. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    Science.gov (United States)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Subsequently, in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period on May 23, 2004. Resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study for the May 21-24, 2004 floods and debris-flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs. The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  16. Automated segmentation of 3D anatomical structures on CT images by using a deep convolutional network based on end-to-end learning approach

    Science.gov (United States)

    Zhou, Xiangrong; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2017-02-01

    We have proposed an end-to-end learning approach that trains a deep convolutional neural network (CNN) for automatic CT image segmentation, performing a voxel-wise multi-class classification that directly maps each voxel of a 3D CT image to an anatomical label. The novelties of the proposed method were (1) transforming the segmentation of anatomical structures on 3D CT images into a majority vote over the results of 2D semantic image segmentation on a number of 2D slices from different image orientations, and (2) using "convolution" and "deconvolution" networks to achieve the conventional "coarse recognition" and "fine extraction" functions, integrated into a compact all-in-one deep CNN for CT image segmentation. The advantage compared with previous works is the capability to accomplish real-time image segmentation on 2D slices of arbitrary CT scan range (e.g. body, chest, abdomen) and produce correspondingly sized output. In this paper, we propose an improvement of this approach by adding an organ localization module to limit the CT image range for training and testing the deep CNNs. A database consisting of 240 3D CT scans and a human-annotated ground truth was used for training (228 cases) and testing (the remaining 12 cases). We applied the improved method to segment the pancreas and left kidney regions, respectively. The preliminary results showed that the accuracies of the segmentation results were improved significantly (the Jaccard index increased by 34% for the pancreas and 8% for the kidney compared with our previous results). The effectiveness and usefulness of the proposed improvement for CT image segmentation were confirmed.
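
    A compact sketch of the majority-voting fusion step described above, assuming the per-orientation 2D predictions have already been resampled onto a common 3D grid (array shapes and tie-breaking follow NumPy conventions and are not the authors' exact code):

```python
import numpy as np

def fuse_by_majority_vote(label_volumes):
    """Fuse voxel-wise label volumes predicted from different slice orientations
    (already resampled to one 3D grid) by a per-voxel majority vote."""
    stacked = np.stack(label_volumes)                     # (n_views, z, y, x)
    n_labels = int(stacked.max()) + 1
    votes = np.stack([(stacked == lbl).sum(axis=0) for lbl in range(n_labels)])
    return votes.argmax(axis=0)                           # ties go to the lower label

def jaccard_index(pred, truth, label):
    """Overlap measure of the kind used to report the segmentation accuracy."""
    p, t = pred == label, truth == label
    union = np.logical_or(p, t).sum()
    return np.logical_and(p, t).sum() / union if union else 1.0

# Toy label volumes from three orientations (0 = background, 1 = organ)
views = [np.random.randint(0, 2, (4, 8, 8)) for _ in range(3)]
fused = fuse_by_majority_vote(views)
print(fused.shape, jaccard_index(fused, views[0], label=1))
```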

  17. End-to-end process of hollow spacecraft structures with high frequency and low mass obtained with in-house structural optimization tool and additive manufacturing

    Directory of Open Access Journals (Sweden)

    Alexandru-Mihai CISMILIANU

    2017-09-01

    Full Text Available In the space sector the most decisive elements are mass reduction, cost saving and minimum lead time; here, structural optimization and additive layer manufacturing (ALM) fit best. The design must be driven by stiffness, because an important requirement for spacecraft (S/C) structures is to reduce the dynamic coupling between the S/C and the launch vehicle. The objective is to create an end-to-end process, from the input given by the customer to the manufacturing of an aluminum part that is as light as possible but at the same time considerably stiffer, while taking full advantage of the design flexibility given by ALM. To design and optimize the parts, a specialized in-house tool was used, guaranteeing a load-sufficient material distribution. By using topological optimization, the number of iterations between the design and stress departments was diminished, thus greatly reducing the lead time. In order to improve and lighten the obtained structure, a design with internal cavities and hollow beams was considered. This implied developing a procedure for powder evacuation through iterations with the manufacturer while optimizing the design for ALM. The resulting part can then be manufactured via ALM with no need for further design adjustments. To achieve a high-quality part with maximum efficiency, it is essential to have a loop between the design team and the manufacturer. Topological optimization and ALM work hand in hand if used properly. The team achieved a more efficient structure using topology optimization and ALM than using conventional design and manufacturing methods.

  18. Rapid Cost Assessment of Space Mission Concepts through Application of Complexity Indices

    Science.gov (United States)

    Peterson, Craig; Cutts, James; Balint, Tibor; Hall, James B.

    2008-01-01

    In 2005, the Solar System Exploration Strategic Roadmap Committee (chartered by NASA to develop the roadmap for Solar System Exploration Missions for the coming decades) found itself faced with the difficult problem of sorting through several mission concepts and determining their relative costs. While detailed mission studies are the normal approach to costing, neither the budget nor the schedule allotted to the committee could support such studies. Members of the Jet Propulsion Laboratory (JPL) supporting the committee were given the challenge of developing a semi-quantitative approach that could provide the relative costs of these missions without requiring an in-depth study of the missions. In response to this challenge, a rapid cost assessment methodology based on a set of mission cost/complexity indexes was developed. This methodology also underwent two separate validations, one comparing its results when applied to historical missions, and another comparing its estimates against those of veteran space mission managers. Remarkably good agreement was achieved, suggesting that this approach provides an effective early indication of space mission costs.
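
    A toy sketch of how a complexity-index approach can yield relative cost rankings; the factor names, weights and scores are entirely hypothetical and are not the JPL indices described above:

```python
# Hypothetical complexity factors and weights; the validated JPL indices and
# weightings are not given in the abstract, so these are placeholders only.
WEIGHTS = {"destination": 3.0, "payload": 2.0, "entry_descent_landing": 4.0,
           "power": 2.5, "new_technology": 3.5}

def complexity_index(scores):
    """Weighted sum of per-factor complexity scores (each on a 0-5 scale)."""
    return sum(WEIGHTS[factor] * score for factor, score in scores.items())

def rank_by_relative_cost(missions):
    """Order mission concepts by complexity index as a proxy for relative cost."""
    return sorted(missions, key=lambda m: complexity_index(m["scores"]), reverse=True)

missions = [
    {"name": "Flyby A", "scores": {"destination": 2, "payload": 1,
                                   "entry_descent_landing": 0, "power": 2,
                                   "new_technology": 1}},
    {"name": "Lander B", "scores": {"destination": 3, "payload": 3,
                                    "entry_descent_landing": 4, "power": 3,
                                    "new_technology": 3}},
]
for m in rank_by_relative_cost(missions):
    print(m["name"], complexity_index(m["scores"]))
```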

  19. POTION: an end-to-end pipeline for positive Darwinian selection detection in genome-scale data through phylogenetic comparison of protein-coding genes.

    Science.gov (United States)

    Hongo, Jorge A; de Castro, Giovanni M; Cintra, Leandro C; Zerlotini, Adhemar; Lobo, Francisco P

    2015-08-01

    Detection of genes evolving under positive Darwinian selection in genome-scale data is nowadays a prevailing strategy in comparative genomics studies to identify genes potentially involved in adaptation processes. Despite the large number of studies aiming to detect and contextualize such gene sets, there is virtually no software available to perform this task in a general, automatic, large-scale and reliable manner. This certainly occurs due to the computational challenges involved in this task, such as the appropriate modeling of the data under analysis, the computation time to perform several of the required steps when dealing with genome-scale data, and the highly error-prone nature of the sequence and alignment data structures needed for genome-wide positive selection detection. We present POTION, an open-source, modular, end-to-end software package for genome-scale detection of positive Darwinian selection in groups of homologous coding sequences. Our software represents a key step towards genome-scale, automated detection of positive selection, from predicted coding sequences and their homology relationships to high-quality groups of positively selected genes. POTION reduces false positives through several sophisticated sequence and group filters based on numeric, phylogenetic, quality and conservation criteria to remove spurious data and through multiple hypothesis corrections, and considerably reduces computation time thanks to a parallelized design. Our software achieved a high classification performance when used to evaluate a curated dataset of Trypanosoma brucei paralogs previously surveyed for positive selection. When used to analyze predicted groups of homologous genes from 19 strains of Mycobacterium tuberculosis as a case study, we demonstrated that the filters implemented in POTION remove sources of error that commonly inflate error rates in positive selection detection. A thorough literature review found no other software similar to POTION in terms of customization

  20. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L [Clinica Luganese, Radiotherapy Center, Lugano (Switzerland)

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, and a maximum difference of 0.3 mm and 0.4 deg was observed for the axes. E2E tests showed absolute position differences of 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have submillimeter overall accuracy for intracranial radiosurgery.

  1. SU-E-T-360: End-To-End Dosimetric Testing of a Versa HD Linear Accelerator with the Agility Head Modeled in Pinnacle3

    Energy Technology Data Exchange (ETDEWEB)

    Saenz, D; Narayanasamy, G; Cruz, W; Papanikolaou, N; Stathakis, S [University of Texas Health Science Center at San Antonio, San Antonio, TX (United States)

    2015-06-15

    Purpose: The Versa HD incorporates a variety of upgrades, primarily including the Agility head. The distinct dosimetric properties of the head compared with its predecessors, combined with flattening-filter-free (FFF) beams, require a new investigation of modeling in planning systems and verification of modeling accuracy. Methods: A model was created in Pinnacle3 v9.8 with commissioned beam data. Leaf transmission was modeled as <0.5% with a maximum leaf speed of 3 cm/s. Photon spectra were tuned for FFF beams, for which profiles were modeled with arbitrary profiles rather than with cones. For verification, a variety of plans with varied parameters were devised, and point dose measurements were compared to calculated values. A phantom of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3. Beams of different field sizes, SSDs, wedges, and gantry angles were created. All available photon energies (6 MV, 10 MV, 18 MV, 6 FFF, 10 FFF) as well as four clinical electron energies (6, 9, 12, and 15 MeV) were investigated. The plans were verified at a calculation point (8 cm deep for photons, variable for electrons) by measurement with a PTW Semiflex ionization chamber. In addition, IMRT testing was performed with three standard plans (a step-and-shoot IMRT plan and small and large field VMAT plans). The plans were delivered on the Delta4 IMRT QA phantom (ScandiDos, Uppsala, Sweden). Results: Homogeneous point dose measurements agreed within 2% for all photon and electron beams. Open field photon measurements along the central axis at 100 cm SSD passed within 1%. Gamma passing rates were >99.5% for all plans with a 3%/3mm tolerance criterion. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4±2.3%. Conclusion: The end-to-end testing ensured confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head.

  2. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S; Nguyen, N; Liu, F; Huang, Y [Rhode Island Hospital / Warren Alpert Medical, Providence, RI (United States); Sio, T [Mayo Clinic, Rochester, MN (United States); Jung, J [East Carolina University, Greenville, North Carolina (United States); Pyakuryal, A [University of Illinois at Chicago, Chicago, IL (United States); Jang, S [Princeton Radiation Oncology Ctr., Jamesburg, NJ (United States)

    2014-06-01

    Purpose: To combine and integrate quality assurance (QA) of target localization and the radiation isocenter End-to-End (E2E) test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation isocenter congruence, which is one step beyond the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for the CyberKnife E2E test was irradiated to a sphere target that was visible in CT-sim images. The CT-sim was performed with a 1 mm slice thickness using a helical scanning technique. The sphere was 3 cm in diameter and was contoured as a target volume using iPlan V.4.5.2. A conformal arc plan was generated using seven MLC-based fields, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were perpendicularly inserted into the cube that held the sphere inside. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. In order to use ExacTrac, an infrared head array was used to correlate the orthogonal X-ray images. Results: The phantom was positioned using the orthogonal X-rays of ExacTrac. For each field, the phantom was checked again with X-rays and re-positioned if necessary. After each setup using ExacTrac, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was found to be 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac was implemented separately to test image localization and radiation isocenter. This new method can be used for periodic QA procedures.

  3. Rapid Cost Assessment of Space Mission Concepts Through Application of Complexity-Based Cost Indices

    Science.gov (United States)

    Peterson, Craig E.; Cutts, James; Balint, Tibor; Hall, James B.

    2008-01-01

    This slide presentation reviews the development of a rapid cost assessment model for the evaluation of exploration missions through the application of complexity-based cost indices. In the fall of 2004, NASA began developing 13 documents, known as "strategic roadmaps," intended to outline a strategy for space exploration over the next 30 years. The Third Strategic Roadmap, the Strategic Roadmap for Solar System Exploration, focused on strategy for robotic exploration of the Solar System. Development of the Strategic Roadmap for Solar System Exploration led to the investigation of a large variety of missions. However, the necessity of planning around scientific inquiry and budgetary constraints made it necessary for the roadmap development team to evaluate potential missions not only for scientific return but also for cost. Performing detailed cost studies for each of the large number of missions was impractical given the time constraints involved and the lack of detailed mission studies, so we developed a method of rapid cost assessment to allow preliminary analysis. It has been noted that there is a strong correlation between complexity and the cost and schedule of planetary missions. While these correlations were made after missions had been built and flown (successfully or otherwise), it seemed likely that a similar approach could provide at least some relative cost ranking. Cost estimation relationships (CERs) have been developed based on subsystem design choices; these CERs required more detailed information than was available, forcing the team to adopt a higher-level approach. Costing by analogy has been developed for small satellites; however, planetary exploration missions have such varied spacecraft requirements that there is a lack of adequately comparable missions to use for analogy.

  4. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...

  5. Modeling the interaction of IEEE 802.3x hop-by-hop flow control and TCP end-to-end flow control

    NARCIS (Netherlands)

    R. Malhotra; R. van Haalen; M.R.H. Mandjes (Michel); R. Núñez Queija (Rudesindo (Sindo))

    2005-01-01

    Ethernet is rapidly expanding beyond its niche of local area networks. However, its success in larger metropolitan area networks will be determined by its ability to combine simplicity, low costs and quality of service. A key element in successfully transporting bursty traffic and at the

  6. Study and Implementation of the End-to-End Data Pipeline for the VIRTIS Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    Science.gov (United States)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the Research Program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Centre in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document presents a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations, to the generation, calibration and archiving of the science data, including the production of valuable high level products. We present and discuss here the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results lead to new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given here to describing the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost 4 years of mission. In the final part of the Thesis, we will also present some high level data

  7. Multi-objective optimization to support rapid air operations mission planning

    Science.gov (United States)

    Gonsalves, Paul G.; Burge, Janet E.

    2005-05-01

    Within the context of military air operations, time-sensitive targets (TSTs) are targets to which modifiers such as "emerging, perishable, high-payoff, short dwell, or highly mobile" apply. Time-critical targets (TCTs) add to TSTs a heightened criticality with respect to the achievement of mission objectives and a limited window of opportunity for attack. The importance of TSTs/TCTs within military air operations has been met with a significant investment in advanced technologies and platforms to meet these challenges. Developments in ISR systems, manned and unmanned air platforms, precision guided munitions, and network-centric warfare have made significant strides toward ensuring timely prosecution of TSTs/TCTs. However, additional investments are needed to further decrease the targeting decision cycle. Given the operational needs for decision support systems to enable time-sensitive/time-critical targeting, we present a tool for the rapid generation and analysis of mission plan solutions to address TSTs/TCTs. Our system employs a genetic algorithm-based multi-objective optimization scheme that is well suited to the rapid generation of approximate solutions in a dynamic environment. Genetic algorithms (GAs) allow for the effective exploration of the search space for potentially novel solutions, while addressing the multiple conflicting objectives that characterize the prosecution of TSTs/TCTs (e.g. probability of target destruction, time to accomplish the task, level of disruption to other mission priorities, level of risk to friendly assets, etc.).
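
    A minimal sketch of the non-dominated (Pareto) filtering that underlies such a multi-objective genetic-algorithm scheme, with objectives oriented so that larger is better; the candidate plans and objective values are invented for illustration and the paper's GA is not reproduced:

```python
def dominates(a, b):
    """Plan a dominates plan b if it is at least as good in every objective
    and strictly better in at least one (larger is better for all objectives)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(scored_plans):
    """Return the names of the non-dominated mission-plan candidates."""
    return [name for name, s in scored_plans.items()
            if not any(dominates(s2, s) for n2, s2 in scored_plans.items() if n2 != name)]

# Hypothetical plans scored as (P_destruction, -time_to_task, -risk_to_assets)
plans = {"plan1": (0.90, -12.0, -0.30),
         "plan2": (0.85, -8.0, -0.10),
         "plan3": (0.70, -15.0, -0.40)}
print(pareto_front(plans))   # plan3 is dominated by both plan1 and plan2
```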

  8. Trade Space Specification Tool (TSST) for Rapid Mission Architecture (Version 1.2)

    Science.gov (United States)

    Wang, Yeou-Fang; Schrock, Mitchell; Borden, Chester S.; Moeller, Robert C.

    2013-01-01

    The Trade Space Specification Tool (TSST) is designed to quickly capture ideas in early spacecraft and mission architecture design and categorize them into trade space dimensions and options for later analysis. It is implemented as an Eclipse RCP application, which can be run as a standalone program. Users rapidly create concept items with single clicks on a graphical canvas, and can organize and create linkages between the ideas using drag-and-drop actions within the same graphical view. Various views such as a trade view, rules view, and architecture view are provided to help users visualize the trade space. This software can identify, explore, and assess aspects of the mission trade space, as well as capture and organize linkages/dependencies between trade space components. The tool supports a user-in-the-loop preliminary logical examination and filtering of trade space options to help identify which paths in the trade space are feasible (and preferred) and what analyses need to be done later with executable models. This tool provides multiple user views of the trade space to guide the analyst/team, to facilitate interpretation and communication of the trade space components and linkages, identify gaps in combining and selecting trade space options, and guide user decision-making about which combinations of architectural options should be pursued for further evaluation. This software provides an environment to capture mission trade space elements rapidly and assist users with their architecture analysis. The tool is primarily focused on mission and spacecraft architecture design rather than general-purpose design applications. In addition, it provides more flexibility to create concepts and organize the ideas. The software is developed as an Eclipse plug-in and can potentially be integrated with other Eclipse-based tools.

  9. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

    Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significant more electrical power. This paper stresses the importance...

  10. End-to-end image quality assessment

    Science.gov (United States)

    Raventos, Joaquin

    2012-05-01

    An innovative computerized benchmarking approach (US patent pending, Sep 2011), based on extensive application of photometry, geometrical optics, and digital media and using a randomized target, allows a standard observer to assess the image quality of video imaging systems at different daytime and low-light luminance levels. It takes into account the target's contrast and color characteristics, as well as the observer's visual acuity and dynamic response. This includes human vision as part of the "extended video imaging system" (EVIS), and allows image quality assessment by several standard observers simultaneously.

  11. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  12. Rapid Preliminary Design of Interplanetary Trajectories Using the Evolutionary Mission Trajectory Generator

    Science.gov (United States)

    Englander, Jacob

    2016-01-01

    This set of tutorial slides is an introduction to the Evolutionary Mission Trajectory Generator (EMTG), NASA Goddard Space Flight Center's autonomous tool for preliminary design of interplanetary missions. This slide set covers the basics of creating and post-processing simple interplanetary missions in EMTG using both high-thrust chemical and low-thrust electric propulsion along with a variety of operational constraints.

  13. Rapid large- and site scale RPAS mission planning for remote sensing of rock falls and landslides in alpine areas

    Science.gov (United States)

    Gräupl, Thomas; Pschernig, Elias; Rokitansky, Carl-Herbert; Oleire-Oltmanns, Sebastian; Zobl, Fritz

    2014-05-01

    Since landslides and rock falls are complex phenomena involving a multitude of factors, current and historical surface data play, alongside geologic conditions and other factors, an important role in analyzing the hazard situation and planning efficient site-specific remediation actions. Especially in displacement acceleration phases, which are frequently linked to bad weather conditions, data acquisition remains difficult. Therefore RPAS, with their small ground sampling distance and correspondingly high resolution, open up possibilities for surveying ground situations not only for visual inspection but also for geodetic data acquisition. Both visual and geodetic data provide valuable information for geologists and related decision makers. Slides or rock falls in alpine areas pose special challenges due to mostly acute and unforeseen displacements on the one hand and the geographic conditions of narrow valleys along with steep slopes on the other hand. Rapid RPAS mission planning and mission adaptation for individual requirements according to different project stages (initial investigation, repeat measurements, identification of hazard zones for urgent remediation actions, etc.) is therefore of particular importance. Here we present a computer-simulation-supported approach to RPAS mission planning that takes the identified thematic and remote sensing targets, the relevant terrain and obstacle databases, legal restrictions, aircraft performance, sensor characteristics, and communication ranges into account in order to produce a safe and mission-optimized flight route. For the RPAS mission planning, we combine and adapt tools developed at the University of Salzburg, namely a flight track generator taking into account a 3D model of the earth surface with a focus on both large-area coverage (e.g. Austria) and the highest available resolution (e.g. sub-meter for specific areas), available obstacle databases for the mission area (e.g. cable car lines, power lines, buildings, slope stabilization constructions

  14. Experimental demonstration of a record high 11.25Gb/s real-time optical OFDM transceiver supporting 25km SMF end-to-end transmission in simple IMDD systems.

    Science.gov (United States)

    Giddings, R P; Jin, X Q; Hugues-Salas, E; Giacoumidis, E; Wei, J L; Tang, J M

    2010-03-15

    The fastest ever 11.25Gb/s real-time FPGA-based optical orthogonal frequency division multiplexing (OOFDM) transceivers utilizing 64-QAM encoding/decoding and significantly improved variable power loading are experimentally demonstrated, for the first time, incorporating advanced functionalities of on-line performance monitoring, live system parameter optimization and channel estimation. Real-time end-to-end transmission of an 11.25Gb/s 64-QAM-encoded OOFDM signal with a high electrical spectral efficiency of 5.625bit/s/Hz over 25km of standard and MetroCor single-mode fibres is successfully achieved with respective power penalties of 0.3dB and -0.2dB at a BER of 1.0 × 10^-3 in a directly modulated DFB laser-based intensity modulation and direct detection system without in-line optical amplification and chromatic dispersion compensation. The impacts of variable power loading as well as electrical and optical components on the transmission performance of the demonstrated transceivers are experimentally explored in detail. In addition, numerical simulations also show that variable power loading is an extremely effective means of escalating system performance to its maximum potential.

  15. Dynamic Emulation of NASA Missions for IV&V: A Case Study of JWST and SLS

    Science.gov (United States)

    Yokum, Steve

    2015-01-01

    Software-Only-Simulations are an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations ranging from low-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR), to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS).

  16. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as on single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems

  17. Experimental demonstrations of record high REAM intensity modulator-enabled 19.25Gb/s real-time end-to-end dual-band optical OFDM colorless transmissions over 25km SSMF IMDD systems.

    Science.gov (United States)

    Zhang, Q W; Hugues-Salas, E; Giddings, R P; Wang, M; Tang, J M

    2013-04-08

    Record-high 19.25Gb/s real-time end-to-end dual-band optical OFDM (OOFDM) colorless transmissions across the entire C-band are experimentally demonstrated, for the first time, in reflective electro-absorption modulator (REAM)-based 25km standard SMF systems using intensity modulation and direct detection. Adaptively modulated baseband (0-2GHz) and passband (6.125 ± 2GHz) OFDM RF sub-bands, supporting signal line rates of 9.75Gb/s and 9.5Gb/s respectively, are independently generated and detected with FPGA-based DSP clocked at only 100MHz as well as DACs/ADCs operating at sampling speeds as low as 4GS/s. The two OFDM sub-bands are electrically multiplexed for intensity modulation of a single optical carrier by an 8GHz REAM. The REAM colorlessness is experimentally characterized, based on which optimum REAM operating conditions are identified. To maximize and balance the signal transmission performance of each sub-band, on-line adaptive transceiver optimization functions and live performance monitoring are fully exploited to optimize key OOFDM transceiver and system parameters. For different wavelengths within the C-band, corresponding minimum received optical powers at the FEC limit vary in a range of <0.5dB and bit error rate performances for both baseband and passband signals are almost identical. Furthermore, detailed investigations are also undertaken of the maximum aggregated signal line rate sensitivity to electrical sub-band power variation. It is shown that the aforementioned system has approximately 3dB tolerance to RF sub-band power variation.

  18. SU-F-J-150: Development of An End-To-End Chain Test for the First-In-Man MR-Guided Treatments with the MRI Linear Accelerator by Using the Alderson Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Hoogcarspel, S; Kerkmeijer, L; Lagendijk, J; Van Vulpen, M; Raaymakers, B [University Medical Center Utrecht, Utrecht, Utrecht (Netherlands)

    2016-06-15

    The Alderson phantom is a human-shaped quality assurance tool that has been used for over 30 years in radiotherapy. The phantom can provide integrated tests of the entire chain of treatment planning and delivery. The purpose of this research was to investigate whether this phantom can be used to chain test a treatment on the MRI linear accelerator (MRL) which is currently being developed at the UMC Utrecht, in collaboration with Elekta and Philips. The latter was demonstrated by chain testing the future First-in-Man treatments with this system. An Alderson phantom was used to chain test an entire treatment with the MRL. First, a CT was acquired of the phantom with additional markers that are visible on both MR and CT. A treatment plan for treating bone metastases in the sacrum was made. The phantom was subsequently placed in the MRL. For MR imaging, a 3D volume was acquired. The initially developed treatment plan was then simulated on the new MRI dataset. For simulation, both the MR and CT data were used by registering them together. Before treatment delivery, an MV image was acquired and compared with a DRR that was calculated from the MR/CT registration data. Finally, the treatment was delivered. Figure 1 shows both the T1-weighted MR image of the phantom and the CT that was registered to the MR image. Figure 2 shows both the calculated and measured MV image that was acquired by the MV panel. Figure 3 shows the dose distribution that was simulated. The total elapsed time for the entire procedure excluding irradiation was 13:35 minutes. The Alderson phantom yields sufficient MR contrast and can be used for full MR-guided radiotherapy treatment chain testing. As a result, we are able to perform an end-to-end chain test of the future First-in-Man treatments.

  19. Rearrangement of potassium ions and Kv1.1/Kv1.2 potassium channels in regenerating axons following end-to-end neurorrhaphy: ionic images from TOF-SIMS.

    Science.gov (United States)

    Liu, Chiung-Hui; Chang, Hung-Ming; Wu, Tsung-Huan; Chen, Li-You; Yang, Yin-Shuo; Tseng, To-Jung; Liao, Wen-Chieh

    2017-10-01

    The voltage-gated potassium channels Kv1.1 and Kv1.2 that cluster at juxtaparanodal (JXP) regions are essential in the regulation of nerve excitability and play a critical role in axonal conduction. When demyelination occurs, Kv1.1/Kv1.2 activity increases, suppressing the membrane potential nearly to the equilibrium potential of K+, which results in an axonal conduction blockade. The recovery of K+-dependent communication signals and proper clustering of Kv1.1/Kv1.2 channels at JXP regions may directly reflect nerve regeneration following peripheral nerve injury. However, little is known about potassium channel expression and its relationship with the dynamic potassium ion distribution at the node of Ranvier during the regenerative process of peripheral nerve injury (PNI). In the present study, end-to-end neurorrhaphy (EEN) was performed using an in vivo model of PNI. The distribution of K+ at regenerating axons following EEN was detected by time-of-flight secondary-ion mass spectrometry. The specific localization and expression of Kv1.1/Kv1.2 channels were examined by confocal microscopy and western blotting. Our data showed that the re-establishment of K+ distribution and intensity was correlated with the functional recovery of compound muscle action potential morphology in EEN rats. Furthermore, the re-clustering of Kv1.1/1.2 channels 1 and 3 months after EEN at the nodal region of the regenerating nerve corresponded to changes in the K+ distribution. This study provided direct evidence of K+ distribution in regenerating axons for the first time. We proposed that the Kv1.1/Kv1.2 channels re-clustered at the JXP regions of regenerating axons are essential for modulating the proper patterns of K+ distribution in axons for maintaining membrane potential stability after EEN.

  20. Rapid Modelling of Urban Mission Areas Using Ground-Based Imagery

    NARCIS (Netherlands)

    Son, R. van; Kuijper, F.; Heuvel, F. van den

    2007-01-01

    Current modelling techniques for urban mission areas are mainly based on the use of data which is not specific or detailed enough to accurately model an existing area. Consequently, additional (manual) effort and time are required to perform data acquisition and to model the environment. The

  1. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNE's in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  2. Virtual Mission Operations of Remote Sensors With Rapid Access To and From Space

    Science.gov (United States)

    Ivancic, William D.; Stewart, Dave; Walke, Jon; Dikeman, Larry; Sage, Steven; Miller, Eric; Northam, James; Jackson, Chris; Taylor, John; Lynch, Scott

    2010-01-01

    This paper describes network-centric operations, where a virtual mission operations center autonomously receives sensor triggers, and schedules space and ground assets using Internet-based technologies and service-oriented architectures. For proof-of-concept purposes, sensor triggers are received from the United States Geological Survey (USGS) to determine targets for space-based sensors. The Surrey Satellite Technology Limited (SSTL) Disaster Monitoring Constellation satellite, the United Kingdom Disaster Monitoring Constellation (UK-DMC), is used as the space-based sensor. The UK-DMC's availability is determined via machine-to-machine communications using SSTL's mission planning system. Access to/from the UK-DMC for tasking and sensor data is via SSTL's and Universal Space Network's (USN) ground assets. The availability and scheduling of USN's assets can also be performed autonomously via machine-to-machine communications. All communication, both on the ground and between ground and space, uses open Internet standards.
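
    The machine-to-machine flow described above (sensor trigger, availability check, pass scheduling, task upload) can be sketched with plain HTTP calls. All endpoints, fields, and services below are hypothetical placeholders, not the actual SSTL or USN interfaces.

```python
# Hedged sketch of a network-centric tasking loop: a sensor trigger drives availability
# checks and scheduling over open Internet standards (HTTP/JSON).
# Every URL, field, and service name below is a hypothetical placeholder.
import requests

def task_from_trigger(trigger):
    # 1. Ask a (hypothetical) mission planning service when the satellite can image the target.
    avail = requests.get("https://planning.example.org/passes",
                         params={"lat": trigger["lat"], "lon": trigger["lon"]},
                         timeout=10).json()
    if not avail["passes"]:
        return None

    # 2. Reserve a ground-station contact for the selected pass (hypothetical scheduling API).
    best = avail["passes"][0]
    booking = requests.post("https://groundnet.example.org/contacts",
                            json={"pass_id": best["id"], "purpose": "downlink"},
                            timeout=10).json()

    # 3. Upload the imaging task to the operations center for uplink at the next contact.
    requests.post("https://vmoc.example.org/tasks",
                  json={"target": trigger, "pass": best, "contact": booking},
                  timeout=10).raise_for_status()
    return best["id"]

# Example trigger, e.g. derived from a USGS event feed (values are made up).
print(task_from_trigger({"lat": 37.8, "lon": -122.3, "event": "flood"}))
```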

  3. Mission operations management

    Science.gov (United States)

    Rocco, David A.

    1994-01-01

    Redefining the approach and philosophy that operations management uses to define, develop, and implement space missions will be a central element in achieving high efficiency mission operations for the future. The goal of a cost-effective space operations program cannot be realized if the attitudes and methodologies we currently employ to plan, develop, and manage space missions do not change. A management philosophy that is in sync with the environment in terms of budget, technology, and science objectives must be developed. Changing our basic perception of mission operations will require a shift in the way we view the mission. This requires a transition from current practices of viewing the mission as a unique end product, to a 'mission development concept' built on the visualization of the end-to-end mission. To achieve this change we must define realistic mission success criteria and develop pragmatic approaches to achieve our goals. Custom mission development for all but the largest and most unique programs is not practical in the current budget environment, and we simply do not have the resources to implement all of our planned science programs. We need to shift our management focus to allow us the opportunity to make use of methodologies and approaches which are based on common building blocks that can be utilized in the space, ground, and mission-unique segments of all missions.

  4. Rapid Development of Gossamer Propulsion for NASA Inner Solar System Science Missions

    Science.gov (United States)

    Young, Roy M.; Montgomery, Edward E.

    2006-01-01

    Over a two and one-half year period dating from 2003 through 2005, NASA's In-Space Propulsion Program matured solar sail technology from laboratory components to full systems, demonstrated in as relevant a space environment as could feasibly be simulated on the ground. This paper describes the challenges identified, as well as the approaches taken toward solving a broad set of issues spanning material science, manufacturing technology, and interplanetary trajectory optimization. Revolutionary advances in system structural predictive analysis and characterization testing occurred. Also addressed are the remaining technology challenges that might be resolved with further ground technology research, geared toward reducing technical risks associated with future space validation and science missions.

  5. The Rapid Response Radiation Survey (R3S) Mission Using the HiSat Conformal Satellite Architecture

    Science.gov (United States)

    Miller, Nathanael A.; Norman, Ryan B.; Soto, Hector L.; Stewart, Victor A.; Jones, Mark L.; Kowalski, Matthew C.; Ben Shabat, Adam; Gough, Kerry M.; Stavely, Rebecca L.; Shim, Alex C.

    2015-01-01

    The Rapid Response Radiation Survey (R3S) experiment, designed as a quick turnaround mission to make radiation measurements in Low Earth Orbit (LEO), will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HISat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. Currently, airline professionals are the second largest demographic of radiation workers and to date their radiation exposure is undocumented in the USA. The NAIRAS model seeks to fill this information gap. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaD-X), will validate exposure prediction capabilities of NAIRAS. The R3S mission collects total dose and radiation spectrum measurements using a Teledyne µDosimeter and a Liulin-6SA2 LED spectrometer. These two radiation sensors provide a cross-correlated radiometric measurement in combination with the Honeywell HMR2300 Smart Digital Magnetometer. The magnetometer assesses the Earth's magnetic field in the LEO environment and allows radiation dose to be mapped as a function of the Earth's magnetic shielding. R3S is also unique in that the radiation sensors will be exposed on the outer surface of the spacecraft, possibly making these the first measurements of the LEO radiation environment with bare sensors. Viability of R3S as an extremely fast turnaround mission is due, in part, to the nature of the robust, well-defined interfaces of the conformal satellite HiSat Architecture. The HiSat architecture, which was developed with the support of the Defense Advanced Research Projects Agency's (DARPA's) Phoenix Program, enabled the R3S system to advance from the first concept to delivery of preliminary design review (PDR) level documents in
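
    The pairing of dosimeter and magnetometer data described above, i.e. mapping dose rate as a function of local magnetic field strength, can be illustrated with a simple binning sketch. The arrays below are synthetic and the trend is invented; this is not the R3S ground processing.

```python
# Illustrative binning of dose-rate samples by local magnetic field magnitude, the kind
# of dose-versus-shielding map described above. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
b_field = rng.uniform(18e3, 55e3, n)                                 # |B| in nT, LEO-like range
dose_rate = 2.0e-6 * (60e3 / b_field) ** 2 * rng.lognormal(0.0, 0.3, n)  # made-up trend

bins = np.linspace(18e3, 55e3, 11)            # 10 bins in |B|
idx = np.digitize(b_field, bins) - 1
mean_dose = np.array([dose_rate[idx == k].mean() for k in range(len(bins) - 1)])

for lo, hi, d in zip(bins[:-1], bins[1:], mean_dose):
    print(f"|B| {lo/1e3:5.1f}-{hi/1e3:5.1f} µT: mean dose rate {d:.3e} (arbitrary units)")
```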

  6. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    Science.gov (United States)

    Doyle, Richard; Bergman, Larry; Some, Raphael; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and the mission; it can be aptly viewed as a "technology multiplier" in that advances in onboard computing provide dramatic improvements in flight functions and capabilities across the NASA mission classes, and will enable new flight capabilities and mission scenarios, increasing science and exploration return per mission-dollar.

  7. End-to-End Fault Tolerance Using Transport Layer Multihoming

    Science.gov (United States)

    2005-01-01

    Report excerpt (figure list and introduction): 2.1, simulation network topology with cross-traffic, congestion-based loss, and no failures ... aggressive failovers that reduce stalls during network congestion and failure events. Chapter 1, Introduction, 1.1 Problem Statement: ... take a long time to converge on a new route after a link failure is detected; Labovitz et al. [72] show that the Internet's interdomain routers may take ...

  8. End-to-End Service Oriented Architectures (SOA) Security Project

    Science.gov (United States)

    2012-02-01

    Report excerpt (publications and installation notes): "... Complexity," in Proceedings of the IFIP International Information Security (SEC 2011) conference, June 2011, Lucerne, Switzerland; P. Angin, B. Bhargava, ... /usr/lib/ ... The coolkey download is available online [CAC3]. Downloading the DoD Root CA certificates, and other intermediate certificate ... Below is the link used to download the DoD Root CA certificates. The procedure to install these certificates is given in the link. Install all the ...

  9. End-to-end visual speech recognition with LSTMS

    NARCIS (Netherlands)

    Petridis, Stavros; Li, Zuwei; Pantic, Maja

    2017-01-01

    Traditional visual speech recognition systems consist of two stages, feature extraction and classification. Recently, several deep learning approaches have been presented which automatically extract features from the mouth images and aim to replace the feature extraction stage. However, research on

  10. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically integrate in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experimental management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
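
    The planning-and-dispatch pattern described above can be sketched as a small parameter-sweep driver. This is an illustrative stand-in for the general workflow, not the LEM interface; the Slurm `sbatch` command and the job script name are assumptions.

```python
# Minimal sketch of experiment planning and dispatch for an HPC parameter sweep.
# This illustrates the general pattern, not the LEM framework itself; the scheduler
# command (Slurm's sbatch) and the job script are assumptions.
import itertools
import json
import subprocess

# Planning: describe experimental goals as factors; the driver expands the full sweep.
factors = {
    "nodes": [16, 64, 256],
    "io_pattern": ["n-to-1", "n-to-n"],
    "block_size_kb": [64, 1024],
}

subexperiments = [dict(zip(factors, values))
                  for values in itertools.product(*factors.values())]

for i, params in enumerate(subexperiments):
    # Execution: hand each subexperiment to the batch scheduler.
    cmd = ["sbatch",
           f"--nodes={params['nodes']}",
           f"--job-name=exp{i:03d}",
           "run_subexperiment.sh",             # hypothetical job script
           json.dumps(params)]                 # measurement hooks read these parameters
    print("would submit:", " ".join(cmd))
    # subprocess.run(cmd, check=True)          # uncomment on a system with Slurm installed
```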

  11. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  12. End-to-End Multi-View Lipreading

    NARCIS (Netherlands)

    Petridis, Stavros; Wang, Yujiang; Li, Zuwei; Pantic, Maja

    2017-01-01

    Non-frontal lip views contain useful information which can be used to enhance the performance of frontal view lipreading. However, the vast majority of recent lipreading works, including the deep learning approaches which significantly outperform traditional approaches, have focused on frontal mouth

  13. END-TO-END INDIA-UK TRANSNATIONAL WIRELESS TESTBED

    National Research Council Canada - National Science Library

    Budhiraja, Rohit; Ramamurthi, Bhaskar; Narayanan, Babu; A, Oredope

    2011-01-01

    .... The India-UK Advanced Technology Centre initiative is a collaborative research project between various institutes and companies across UK and India, which envisages, apart from several research...

  14. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

    This thesis addresses selected topics of Quality of Service (QoS) provisioning in heterogeneous data networks that construct the communication environment of today's Internet. In the vast range of protocols available in different domains of network infrastructures, a few chosen ones are discussed...... and their key QoS features are analysed. This thesis mainly focuses on home and access networks, and their interaction. Considering home networks, UPnP-QoS Architecture was chosen in order to analyse the possibilities of QoS provisioning at users' premises using service oriented architectures. First...

  15. End to End Beam Dynamics of the ESS Linac

    DEFF Research Database (Denmark)

    Thomsen, Heine Dølrath

    2012-01-01

    The European Spallation Source, ESS, uses a linear accelerator to deliver a high intensity proton beam to the target station. The nominal beam power on target will be 5 MW at an energy of 2.5 GeV. We briefly describe the individual accelerating structures and transport lines through which we have...... carried out multiparticle beam dynamics simulations. We will present a review of the beam dynamics from the source to the target....

  16. End-to-End Privacy for Open Big Data Markets

    OpenAIRE

    Perera, Charith; Ranjan, Rajiv; Wang, Lizhe

    2015-01-01

    The idea of an open data market envisions the creation of a data trading model to facilitate exchange of data between different parties in the Internet of Things (IoT) domain. The data collected by IoT products and solutions are expected to be traded in these markets. Data owners will collect data using IoT products and solutions. Data consumers who are interested will negotiate with the data owners to get access to such data. Data captured by IoT products will allow data consumers to further...

  17. 40 Gigabit ethernet: prototyping transparent end-to-end connectivity

    NARCIS (Netherlands)

    Dumitru, C.; Koning, R.; de Laat, C.

    2011-01-01

    The ever increasing demands of data intensive eScience applications have pushed the limits of computer networks. With the launch of the new 40 Gigabit Ethernet (40GE) standard, 802.3ba, applications can go beyond the common 10 Gigabit/s per data stream barrier for both local area, and as

  18. Crew Transportation System Design Reference Missions

    Science.gov (United States)

    Mango, Edward J.

    2015-01-01

    Contains summaries of potential design reference mission goals for systems to transport humans to and from low Earth orbit (LEO) for the Commercial Crew Program. The purpose of this document is to describe Design Reference Missions (DRMs) representative of the end-to-end Crew Transportation System (CTS) framework envisioned to successfully execute commercial crew transportation to orbital destinations. The initial CTS architecture will likely be optimized to support NASA crew and NASA-sponsored crew rotation missions to the ISS, but consideration may be given in this design phase to allow for modifications in order to accomplish other commercial missions in the future. With the exception of NASA's mission to the ISS, the remaining commercial DRMs are notional. Any decision to design or scar the CTS for these additional non-NASA missions is completely up to the Commercial Provider. As NASA's mission needs evolve over time, this document will be periodically updated to reflect those needs.

  19. Smashing the Stovepipe: Leveraging the GMSEC Open Architecture and Advanced IT Automation to Rapidly Prototype, Develop and Deploy Next-Generation Multi-Mission Ground Systems

    Science.gov (United States)

    Swenson, Paul

    2017-01-01

    Satellite/payload ground systems are typically highly customized to a specific mission's use cases and utilize hundreds (or thousands!) of specialized point-to-point interfaces for data flows and file transfers. Documentation and tracking of these complex interfaces require extensive time to develop and extremely high staffing costs; implementation and testing of these interfaces are even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT Security, Information Assurance and Operational Security have become key ground system architecture drivers. New Federal security-related directives are generated on a daily basis, imposing new requirements on current/existing ground systems, and these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources, i.e., operational/sysadmin staff, IT security baselines, architecture decisions, or even networks and hosting infrastructure. Refactoring these existing ground systems into multi-mission assets proves extremely challenging due to what is typically very tight coupling between legacy components; as a result, many "multi-mission" ops environments end up simply sharing compute resources and networks due to the difficulty of refactoring into true multi-mission systems. Utilizing continuous integration and rapid system deployment technologies in conjunction with an open architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and configuration of systems. GMSEC messaging is inherently designed to support multi-mission requirements, and
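
    The open-architecture messaging idea, components exchanging messages on a bus by subject rather than through point-to-point interfaces, can be sketched with a tiny in-memory publish/subscribe bus. This is a generic illustration of the pattern, not the GMSEC API; subjects and field names are invented.

```python
# Generic publish/subscribe sketch illustrating message-bus decoupling between ground
# system components. This is not the GMSEC API, just the pattern it embodies.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, subject, callback):
        # Components register interest in a subject instead of wiring point-to-point links.
        self._subscribers[subject].append(callback)

    def publish(self, subject, message):
        for callback in self._subscribers[subject]:
            callback(subject, message)

bus = MessageBus()

# A telemetry archiver and a limit checker both receive housekeeping telemetry
# without knowing anything about the component that produced it.
bus.subscribe("MISSION-A.TLM.HOUSEKEEPING",
              lambda subj, msg: print("archive:", subj, msg))
bus.subscribe("MISSION-A.TLM.HOUSEKEEPING",
              lambda subj, msg: print("ALARM" if msg["bus_voltage"] < 26.0 else "ok"))

bus.publish("MISSION-A.TLM.HOUSEKEEPING", {"bus_voltage": 25.4, "temp_c": 14.2})
```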

  20. Changes in maximum muscle strength and rapid muscle force characteristics after long-term special support and reconnaissance missions

    DEFF Research Database (Denmark)

    Christensen, Peter Astrup; Jacobsen, Jacob Ole; Thorlund, Jonas B

    2008-01-01

    PURPOSE: The purpose of the present study was to examine the impact of 8 days of immobilization during a Special Support and Reconnaissance mission (SSR) on muscle mass, contraction dynamics, maximum jump height/power, and body composition. METHODS: Unilateral maximal voluntary contraction, rate...... of force development, and maximal jump height were tested to assess muscle strength/power along with whole-body impedance analysis before and after SSR. RESULTS: Body weight, fat-free mass, and total body water decreased (4-5%) after SSR, along with impairments in maximal jump height (-8%) and knee...

  1. An integrated radar model solution for mission level performance and cost trades

    Science.gov (United States)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR and D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using a commercial-off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.
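
    The idea of digitally connecting component models behind simple interfaces so that a mission-level metric and cost can be traded together can be sketched as a short model chain. The relations below are placeholders for illustration only, not the NGMS AESA models or the ModelCenter/AFSIM integration.

```python
# Illustrative model chain: component models connected through simple interfaces so a
# mission-level figure of merit and cost can be traded together. All relations below
# are placeholder physics and cost curves, not the NGMS models.

def rf_model(n_elements, element_power_w):
    # Placeholder: effective radiated power grows with array size and per-element power.
    return {"eirp_w": n_elements ** 2 * element_power_w}

def power_thermal_model(n_elements, element_power_w):
    # Placeholder: prime power and dissipated heat scale with the array.
    prime_w = n_elements * element_power_w / 0.3
    return {"prime_power_w": prime_w, "heat_w": 0.7 * prime_w}

def cost_model(n_elements):
    # Placeholder learning-curve style cost relation.
    return {"cost_usd": 5e4 * n_elements ** 0.9}

def detection_range_km(eirp_w):
    # Placeholder mission-level metric: range scales with the fourth root of EIRP.
    return 40.0 * eirp_w ** 0.25

# A tiny trade study over array size, the kind of end-to-end sweep such a framework automates.
for n in (512, 1024, 2048):
    rf = rf_model(n, element_power_w=8.0)
    pwr = power_thermal_model(n, element_power_w=8.0)
    cost = cost_model(n)
    print(f"{n:5d} elements: range {detection_range_km(rf['eirp_w']):7.1f} km, "
          f"prime power {pwr['prime_power_w']/1e3:6.1f} kW, cost ${cost['cost_usd']/1e6:5.1f}M")
```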

  2. A Rapid Turn-around, Scalable Big Data Processing Capability for the JPL Airborne Snow Observatory (ASO) Mission

    Science.gov (United States)

    Mattmann, C. A.

    2014-12-01

    The JPL Airborne Snow Observatory (ASO) is an integrated LIDAR and Spectrometer measuring snow depth and rate of snow melt in the Sierra Nevadas, specifically, the Tuolumne River Basin, Sierra Nevada, California above the O'Shaughnessy Dam of the Hetch Hetchy reservoir, and the Uncompahgre Basin, Colorado, amongst other sites. The ASO data was delivered to water resource managers from the California Department of Water Resources in under 24 hours from the time that the Twin Otter aircraft landed in Mammoth Lakes, CA to the time disks were plugged in to the ASO Mobile Compute System (MCS) deployed at the Sierra Nevada Aquatic Research Laboratory (SNARL) near the airport. ASO performed weekly flights and each flight produced between 500 GB and 1 terabyte of raw data, which was then processed from level 0 data products all the way to full level 4 maps of Snow Water Equivalent, albedo mosaics, and snow depth from LIDAR. These data were produced by Interactive Data Language (IDL) algorithms which were then unobtrusively and automatically integrated into an Apache OODT and Apache Tika based Big Data processing system. Data movement was both electronic and physical, including novel uses of LaCie 1 and 2 TeraByte (TB) data bricks and deployment in rugged terrain. The MCS was controlled remotely from the Jet Propulsion Laboratory, California Institute of Technology (JPL) in Pasadena, California on behalf of the National Aeronautics and Space Administration (NASA). Communication was aided through the use of novel Internet Relay Chat (IRC) command and control mechanisms and through the use of the Notifico open source communication tools. This talk will describe the high-powered and lightweight Big Data processing system that we developed for ASO and its implications more broadly for airborne missions at NASA and throughout the government. The lessons learned from ASO show the potential to have a large impact in the development of Big Data processing systems in the years
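
    The unobtrusive, file-driven processing described above can be sketched as a simple watch-folder loop that hands newly delivered flight data to an existing science executable and catalogs the products. This is a generic illustration of the pattern, not the actual Apache OODT/Tika configuration used by ASO; the paths and the `run_snow_algorithm` tool are hypothetical.

```python
# Generic watch-folder sketch: detect newly delivered flight data, run an existing science
# executable on it, and record the product in a catalog. This illustrates the pattern, not
# the actual Apache OODT/Tika setup used by ASO; paths and the external tool are hypothetical.
import json
import subprocess
import time
from pathlib import Path

INCOMING = Path("/data/aso/incoming")     # hypothetical staging area (e.g. a data brick copy)
PRODUCTS = Path("/data/aso/products")
PRODUCTS.mkdir(parents=True, exist_ok=True)
CATALOG = PRODUCTS / "catalog.jsonl"
seen = set()

def process(raw_file: Path) -> Path:
    out = PRODUCTS / (raw_file.stem + "_swe.tif")
    # Wrap the existing (e.g. IDL-based) algorithm without modifying it.
    subprocess.run(["run_snow_algorithm", str(raw_file), str(out)], check=True)  # hypothetical tool
    return out

while True:
    for raw in sorted(INCOMING.glob("*.dat")):
        if raw in seen:
            continue
        product = process(raw)
        with CATALOG.open("a") as cat:   # minimal metadata catalog entry
            cat.write(json.dumps({"raw": raw.name, "product": product.name,
                                  "time": time.time()}) + "\n")
        seen.add(raw)
    time.sleep(30)   # poll for the next delivery
```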

  3. Comparison of two nonabsorbable suture materials in the end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

    Full Text Available Twelve mongrel dogs, aged between 1 and 6 years and weighing 6 to 20 kg, underwent tracheal resection and end-to-end anastomosis, in which braided noncapillary polyester and monofilament nylon sutures were tested. Six animals, three for each type of suture material, underwent excision equivalent to three tracheal rings. After 15 days a new intervention was performed in which the equivalent of six more rings was resected, for a total of nine. At the end of another 15 days they were sacrificed. The other six animals, three for each type of suture material, underwent excision equivalent to three tracheal rings and were maintained for 43 days. The tracheas were evaluated by clinical, radiographic, macroscopic and histopathological examinations. The monofilament nylon suture produced less tissue reaction than the braided noncapillary polyester and provided a secure anastomosis with a lower chance of granuloma formation.

  4. A Distributed Simulation Software System for Multi-Spacecraft Missions

    Science.gov (United States)

    Burns, Richard; Davis, George; Cary, Everett

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  5. Deep Space Mission Trend Analyses: A Briefing to the Next Generation EBRE Study Team

    Science.gov (United States)

    Abraham, Douglas S.

    2012-01-01

    Determination of stakeholder needs for next generation implementations necessitates a multi-pronged approach. Future mission set analyses provide a lower "bound" for some of these needs. Earth-based analogies provide an upper "bound" for some of these needs. Interpreting the results requires being mindful of both the near-term contextual factors and long-term factors that are in play. In the context of last year's analyses, the current budget environment, the potential Pu-238 shortage, and SMD's "single 34m only" policy may, collectively, create a future deep space mission set that, from a capacity and end-to-end link difficulty standpoint, is no more challenging than it is today. Nonetheless, data rates and volumes continue to increase, suggesting capability and spectrum challenges ahead. These results agree with the results from the Earth-based analogies. Emerging developments such as smallsats and distributed spacecraft could significantly change the capacity and end-to-end link difficulty picture.

  6. Internet Technology for Future Space Missions

    Science.gov (United States)

    Hennessy, Joseph F. (Technical Monitor); Rash, James; Casasanta, Ralph; Hogie, Keith

    2002-01-01

    Ongoing work at National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC) seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with downlink data rates from 300 Kbps to 4 Mbps. Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.

  7. A Lean, Fast Mars Round-trip Mission Architecture: Using Current Technologies for a Human Mission in the 2030s

    Science.gov (United States)

    Bailey, Lora; Folta, David; Barbee, Brent W.; Vaughn, Frank; Kirchman, Frank; Englander, Jacob; Campbell, Bruce; Thronson, Harley; Lin, Tzu Yu

    2013-01-01

    We present a lean fast-transfer architecture concept for a first human mission to Mars that utilizes current technologies and two pivotal parameters: an end-to-end Mars mission duration of approximately one year, and a deep space habitat of approximately 50 metric tons. These parameters were formulated by a 2012 deep space habitat study conducted at the NASA Johnson Space Center (JSC) that focused on a subset of recognized high- engineering-risk factors that may otherwise limit space travel to destinations such as Mars or near-Earth asteroid (NEA)s. With these constraints, we model and promote Mars mission opportunities in the 2030s enabled by a combination of on-orbit staging, mission element pre-positioning, and unique round-trip trajectories identified by state-of-the-art astrodynamics algorithms.

  8. Achieving Operability via the Mission System Paradigm

    Science.gov (United States)

    Hammer, Fred J.; Kahr, Joseph R.

    2006-01-01

    In the past, flight and ground systems have been developed largely-independently, with the flight system taking the lead, and dominating the development process. Operability issues have been addressed poorly in planning, requirements, design, I&T, and system-contracting activities. In many cases, as documented in lessons-learned, this has resulted in significant avoidable increases in cost and risk. With complex missions and systems, operability is being recognized as an important end-to-end design issue. Never-the-less, lessons-learned and operability concepts remain, in many cases, poorly understood and sporadically applied. A key to effective application of operability concepts is adopting a 'mission system' paradigm. In this paradigm, flight and ground systems are treated, from an engineering and management perspective, as inter-related elements of a larger mission system. The mission system consists of flight hardware, flight software, telecom services, ground data system, testbeds, flight teams, science teams, flight operations processes, procedures, and facilities. The system is designed in functional layers, which span flight and ground. It is designed in response to project-level requirements, mission design and an operations concept, and is developed incrementally, with early and frequent integration of flight and ground components.

  9. Space Network IP Services (SNIS): An Architecture for Supporting Low Earth Orbiting IP Satellite Missions

    Science.gov (United States)

    Israel, David J.

    2005-01-01

    The NASA Space Network (SN) supports a variety of missions using the Tracking and Data Relay Satellite System (TDRSS), which includes ground stations in White Sands, New Mexico and Guam. A Space Network IP Services (SNIS) architecture is being developed to support future users with requirements for end-to-end Internet Protocol (IP) communications. This architecture will support all IP protocols, including Mobile IP, over TDRSS Single Access, Multiple Access, and Demand Access Radio Frequency (RF) links. This paper will describe this architecture and how it can enable Low Earth Orbiting IP satellite missions.

  10. Gas mission; Mission gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This preliminary report analyses the desirable evolutions of gas transport tariffing and examines some questions relative to the opening of competition on the French gas market. The report is made of two documents: a synthesis of the previous report with some recommendations about the tariffing of gas transport, about the modalities of network access to third parties, and about the dissociation between transport and trade book-keeping activities. The second document is the progress report about the opening of the French gas market. The first part presents the European problem of competition in the gas supply and its consequences on the opening and operation of the French gas market. The second part presents some partial syntheses about each topic of the mission letter of the Ministry of Economics, Finances and Industry: future evolution of network access tariffs, critical analysis of contractual documents for gas transport and delivery, examination of auxiliary services linked with the access to the network (modulation, balancing, conversion), consideration about the processing of network congestions and denied accesses, analysis of the metering dissociation between the integrated activities of gas operators. Some documents are attached in appendixes: the mission letter from July 9, 2001, the detailed analysis of the new temporary tariffs of GdF and CFM, the offer of methane terminals access to third parties, the compatibility of a nodal tariffing with the presence of three transport operators (GdF, CFM and GSO), the contract-type for GdF supply, and the contract-type for GdF connection. (J.S.)

  11. Cassini Mission

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Robert (Jet Propulsion Laboratory)

    2005-08-10

    The Cassini/Huygens mission is a joint NASA/European Space Agency/Italian Space Agency project which has a spacecraft currently in orbit about Saturn, and has successfully sent an atmospheric probe through the atmosphere of Saturn's largest moon Titan and down to its previously hidden surface. This presentation will describe the overall mission, how it got a rather massive spacecraft to Saturn, and will cover some of the scientific results of the mission to date.

  12. AAL Security and Privacy: transferring XACML policies for end-to-end access and usage control

    NARCIS (Netherlands)

    Vlamings, H.G.M.; Koster, R.P.

    2010-01-01

    Ambient Assisted Living (AAL) systems and services aim to provide a solution for growing healthcare expenses and degradation of life quality of the elderly using information and communication technology. In particular, AAL solutions are being created that are heavily based on web services and sensor

  13. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Full Text Available Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny and resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connection, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge and new architectures are required due to the heterogeneity of these devices. Taking into account that people are using smartphones with Internet connection, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. This paper therefore proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter exceeds a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
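
    The REST-plus-push pattern described above can be illustrated with a minimal web service that stores readings and raises an alert when a threshold is exceeded. Flask is used here only for brevity and the push call is a stub; none of this is the paper's actual implementation.

```python
# Minimal sketch of the REST + push-notification pattern for WSN monitoring.
# Flask is used for brevity; the push call is a stub standing in for a real
# notification service (e.g. the Android push mechanism used in the paper).
from flask import Flask, jsonify, request

app = Flask(__name__)
readings = {}                        # in-memory store instead of a relational database
THRESHOLDS = {"temperature": 35.0}   # alert when a parameter exceeds its threshold

def push_notify(node_id, parameter, value):
    # Stub: a real deployment would call a push service to alert the mobile app.
    print(f"PUSH: node {node_id} {parameter}={value} exceeded threshold")

@app.route("/nodes/<node_id>/readings", methods=["POST"])
def add_reading(node_id):
    sample = request.get_json()                       # e.g. {"temperature": 36.2}
    readings.setdefault(node_id, []).append(sample)
    for parameter, value in sample.items():
        limit = THRESHOLDS.get(parameter)
        if limit is not None and value > limit:
            push_notify(node_id, parameter, value)
    return jsonify({"stored": True}), 201

@app.route("/nodes/<node_id>/readings", methods=["GET"])
def get_readings(node_id):
    # RESTful read path used by the mobile application to fetch sensed data.
    return jsonify(readings.get(node_id, []))

if __name__ == "__main__":
    app.run(port=8080)
```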

  14. End-to-End Concurrent Multipath Transfer Using Transport Layer Multihoming

    Science.gov (United States)

    2006-07-01

    Report excerpt: ... insight into the ambient conditions under which cwnd overgrowth can be observed with SCTP; we develop an analytical model of this behavior and analyze ... example in Section 6.2. The goal of this model is to provide insight into the ambient conditions under which cwnd overgrowth can be observed, thus ... to congestion control. While some initial work in the area demonstrates feasibility [53], further work is needed to determine how these techniques ...

  15. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data.

    Science.gov (United States)

    Storch, Laura S; Glaser, Sarah M; Ye, Hao; Rosenberg, Andrew A

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model's utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts.

  16. Integration of DST's for non-conflicting end-to-end flight scheduling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR effort we propose an innovative approach for the integration of Decision Support Tools (DSTs) for increased situational awareness, improved cooperative...

  17. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2018-01-01

    Industry 4.0 refers to the fourth industrial revolution, and introduces modern communication and computation technologies such as 5G, cloud computing and Internet of Things to industrial manufacturing systems. As a result, many devices, machines and applications will rely on connectivity, while...... having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing...... as a mechanism to handle the diverse set of requirements to the network. We present methods for slicing deterministic and packet-switched industrial communication protocols at an abstraction level which is decoupled from the specific implementation of the underlying technologies, and hence simplifies the slicing...

  18. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are more strict when compared to the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.
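
    As a rough illustration of the quantity the abstract refers to (the paper derives the exact latency conditions), the harmonic mean of the two relay-link SNRs that is weighed against the direct source-destination SNR can be written as:

```latex
% Illustrative definition only; the exact latency-reduction conditions are derived in the paper.
\[
  \mathrm{SNR}_{\mathrm{hm}}
    = \frac{2\,\mathrm{SNR}_{sr}\,\mathrm{SNR}_{rd}}
           {\mathrm{SNR}_{sr} + \mathrm{SNR}_{rd}},
\]
% where SNR_{sr} and SNR_{rd} are the source--relay and relay--destination SNRs; whether
% multi-hopping reduces latency depends on how SNR_hm compares with the direct
% source--destination SNR_{sd}.
```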

  19. Deep View-Sensitive Pedestrian Attribute Inference in an end-to-end Model

    OpenAIRE

    Sarfraz, M. Saquib; Schumann, Arne; Wang, Yan; Stiefelhagen, Rainer

    2017-01-01

    Pedestrian attribute inference is a demanding problem in visual surveillance that can facilitate person retrieval, search and indexing. To exploit semantic relations between attributes, recent research treats it as a multi-label image classification task. The visual cues hinting at attributes can be strongly localized and inference of person attributes such as hair, backpack, shorts, etc., are highly dependent on the acquired view of the pedestrian. In this paper we assert this dependence in ...

  20. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.
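
    The Dice coefficient used above to compare segmentations with and without post-processing is easy to state concretely; a minimal sketch for binary masks (synthetic arrays, not the article's data) follows.

```python
# Dice coefficient between two binary segmentation masks, as used to compare the automated
# and manually corrected models. The masks here are small synthetic examples.
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total

# Tiny synthetic example: two overlapping "tissue" masks on a 3D grid.
a = np.zeros((32, 32, 32), dtype=bool)
b = np.zeros_like(a)
a[8:20, 8:20, 8:20] = True
b[10:22, 10:22, 10:22] = True
print(f"Dice = {dice(a, b):.3f}")   # 1.0 would mean identical segmentations
```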

  1. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    We present a software tool platform which facilitates security and performance analysis of systems which starts and ends with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed...... with the analysers of the calculi, and the results of the analysis are reflected back into a modified version of the input UML model. The design platform supporting the methodology, Choreographer, interoperates with state-of-the-art UML modelling tools. We illustrate the approach with a well known protocol...

  2. Caius: Synthetic Observations Using a Robust End-to-End Radiative Transfer Pipeline

    Science.gov (United States)

    Simeon Barrow, Kirk Stuart; Wise, John H.; O'Shea, Brian; Norman, Michael L.; Xu, Hao

    2018-01-01

    We present synthetic observations for the first generations of galaxies in the Universe and make predictions for future deep field observations for redshifts greater than 6. Due to the strong impact of nebular emission lines and the relatively compact scale of HII regions, high resolution cosmological simulations and a robust suite of analysis tools are required to properly simulate spectra. We created a software pipeline consisting of FSPS, Yggdrasil, Hyperion, Cloudy and our own tools to generate synthetic IR observations from a fully three-dimensional arrangement of gas, dust, and stars. Our prescription allows us to include emission lines for a complete chemical network and tackle the effect of dust extinction and scattering in the various lines of sight. We provide spectra, 2-D binned photon imagery for both HST and JWST IR filters, luminosity relationships, and emission line strengths for a large sample of high redshift galaxies in the Renaissance Simulations (Xu et al. 2013). We also pay special attention to contributions from Population III stars and high-mass X-ray binaries and explore a direct-collapse black hole simulation (Aykutalp et al. 2014). Our resulting synthetic spectra show high variability between galactic halos with a strong dependence on stellar mass, viewing angle, metallicity, gas mass fraction, and formation history.

  3. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    Report excerpt (acronym list and introduction): ... comments; RST, reset; RTT, round trip time; SACK, selective acknowledgment; SDN, software-defined networking; SIMPLE, Software-defIned Middlebox PoLicy ... elements of software-defined networking (SDN). Concluding the coverage of the solution space, Section 3.3 takes a closer look at various ... SDN to middlebox architectures [22,83]. SDN is a new and emerging network architectural design strategy that decouples the intelligence and traffic ...

  4. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01

    Report excerpt: ... recognize that long-term survival of their communities and livelihoods rests on balancing sustainability of these industries with increased ... diversification of coastal economies. Consequently, several new initiatives have been launched by federal, provincial, regional and local authorities, First ...

  5. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    Full Text Available This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on measurements of packet delivery latencies of voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method, which is able to auto-adapt the quality margin on the basis of network load and specific service level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations to guarantee Quality of Service to telephony applications; our goal here was to evaluate the weight of edge router queue configuration in a complex and realistic telephony over IP scenario. We compare several well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to place queue schedulers in a more general control scheme context where different elements, such as DiffServ marking and admission control algorithms, contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we compare this solution with other call admission methods already described in the scientific literature in order to locate the proposed method in a more general control scheme context. On the basis of the results, we try to highlight the possible advantages of this QoS-Weighted solution in comparison with other similar CAC solutions (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) in terms of complexity, stability, management, tunability to service level requirements, and compatibility with actual network implementations.
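
    The measurement-based admission idea (admit a new call only while a latency-derived load estimate leaves enough quality margin) can be sketched as follows. The parameters and the load estimator are illustrative only, not the paper's QoS-Weighted CAC algorithm.

```python
# Sketch of a measurement-based call admission decision: edge measurements of packet
# delivery latency feed a load estimate, and a new voice call is admitted only if the
# estimated load plus the call's bandwidth stays under a quality margin.
# Numbers and the load estimator are illustrative, not the paper's exact algorithm.

LINK_CAPACITY_KBPS = 10_000
CALL_RATE_KBPS = 80               # e.g. G.711 voice with RTP/UDP/IP overhead
TARGET_LATENCY_MS = 50.0          # latency budget at the network edge

def estimated_load_kbps(measured_latency_ms, admitted_calls):
    # Illustrative estimator: start from the booked load of admitted calls and inflate it
    # as measured latency approaches the budget (congestion feedback from edge measurements).
    booked = admitted_calls * CALL_RATE_KBPS
    congestion_factor = 1.0 + max(0.0, measured_latency_ms / TARGET_LATENCY_MS - 0.5)
    return booked * congestion_factor

def admit_call(measured_latency_ms, admitted_calls, margin=0.9):
    load = estimated_load_kbps(measured_latency_ms, admitted_calls)
    return load + CALL_RATE_KBPS <= margin * LINK_CAPACITY_KBPS

# Example: latency well under budget -> admit; latency near budget -> reject.
print(admit_call(measured_latency_ms=12.0, admitted_calls=90))   # True
print(admit_call(measured_latency_ms=48.0, admitted_calls=90))   # False
```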

  6. Adaptive end-to-end optimization of mobile video streaming using QoS negotiation

    NARCIS (Netherlands)

    Taal, Jacco R.; Langendoen, Koen; van der Schaaf, Arjen; van Dijk, H.W.; Lagendijk, R. (Inald) L.

    Video streaming over wireless links is a non-trivial problem due to the large and frequent changes in the quality of the underlying radio channel combined with latency constraints. We believe that every layer in a mobile system must be prepared to adapt its behavior to its environment. Thus layers

  7. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    Science.gov (United States)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  8. End-to-end observatory software modeling using domain specific languages

    Science.gov (United States)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  9. Using Voice Over Internet Protocol to Create True End-to-End Security

    Science.gov (United States)

    2011-09-01

    Report excerpt: ... Wikileaks (Fildes, 2010). These sets were not made public by any foreign spy or even a teenager hacking into classified networks out of curiosity or ... a video and chat client that supports SIP, XMPP/Jabber, AIM/ICQ, Windows Live, Yahoo!, Bonjour, and others. Wireshark is a network protocol analyzer ... Jitsi has the ability to connect via audio, video, and other services such as Jabber and Yahoo Messenger, adding to the abnormally large section of ...

  10. End-to-End Architecture Modularisation and Slicing for Next Generation Networks

    OpenAIRE

    An, Xueli; Trivisonno, Riccardo; Einsiedler, Hans; von Hugo, Dirk; Haensge, Kay; Huang, Xiaofeng; Shen, Qing; Corujo, Daniel; Mahmood, Kashif; Trossen, Dirk; Liebsch, Marco; Leitao, Filipe; Phan, Cao-Thanh; Klamm, Frederic

    2016-01-01

    The journey towards the deployment of next generation networks has recently accelerated, driven by the joint effort of research and standards organisations. Despite this fact, the overall picture is still unclear as prioritization and understanding on several key concepts are not yet agreed by major vendors and network providers. Network Slicing is one of the central topics of the debate, and it is expected to become the key feature of next generation networks, providing the flexibility requi...

  11. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.
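
    To make the weather-index plus machine-learning step concrete, the sketch below fits a regressor that maps simple seasonal weather indices to crop production loss. It is a minimal illustration on synthetic data; the index definitions, model choice, and numbers are assumptions and do not reproduce the authors' data or methodology.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_seasons = 300

# Hypothetical weather indices per growing season (not the paper's actual indices).
gdd = rng.normal(1500, 150, n_seasons)        # growing-degree days
rain = rng.gamma(4.0, 60.0, n_seasons)        # cumulative rainfall, mm
heat_days = rng.poisson(6, n_seasons)         # days above a heat-stress threshold

# Synthetic yield loss (%) with noise: drier and hotter seasons lose more.
loss = np.clip(30 - 0.04 * rain + 1.2 * heat_days + rng.normal(0, 3, n_seasons), 0, None)

X = np.column_stack([gdd, rain, heat_days])
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, loss, cv=5, scoring="r2")
print("Cross-validated R^2 of the weather-index loss model:", round(scores.mean(), 2))
```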

  12. Intelligent End-To-End Resource Virtualization Using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, Georgios; Kontos, T.; Niemegeers, I.G.M.M.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.M.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered as a philosophy or paradigm in organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower level functionalities, enabling portability of

  13. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluated the resilience of a reference IMS-based network topology in operation through key reliability parameters via OPNET. The reliability behaviors of communication within similar and across registered home IMS domains were simulated and compared. Besides, the reliability effect...

  14. Research on the Establishment and Evaluation of End-to-End Service Quality Index System

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    From the perspective of power data networks, this paper puts forward an index system model to measure quality of service, covering user experience, business performance, network capacity support, etc., and describes how the indices at each layer of the model are established and used.

  15. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible.

  16. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    Science.gov (United States)

    2012-08-01


  17. CUSat: An End-to-End In-Orbit Inspection System University Nanosatellite Program

    Science.gov (United States)

    2007-01-01


  18. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    in this paper focus on 1) peer-to-peer in a WLAN setting, 2) p2p behind NAT and 3) what we call a server bounce mechanism. The algorithm is supported by a User-specific Virtual Network to obtain required network state information. Experimental tests are conducted, using both simulations and actual...

  19. A secure searcher for end-to-end encrypted email communication

    OpenAIRE

    Mani, Balamaruthu

    2015-01-01

    Email has become a common mode of communication for confidential personal as well as business needs. There are different approaches to authenticate the sender of an email message at the receiver's client and ensure that the message can be read only by the intended recipient. A typical approach is to use an email encryption standard to encrypt the message on the sender's client and decrypt it on the receiver's client for secure communication. A major drawback of this approach is that only the ...

  20. Integrating end-to-end encryption and authentication technology into broadband networks

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, L.G.

    1995-11-01

    BISDN services will involve the integration of high-speed data, voice, and video functionality delivered via technology similar to Asynchronous Transfer Mode (ATM) switching and SONET optical transmission systems. Customers of BISDN services may need a variety of data authenticity and privacy assurances via ATM services. Cryptographic methods can be used to assure authenticity and privacy, but are hard to scale for implementation at high speed. The incorporation of these methods into computer networks can severely impact functionality, reliability, and performance. While there are many design issues associated with the serving of public keys for authenticated signaling and for establishment of session cryptovariables, this paper is concerned with the impact of encryption itself on such communications once the signaling and setup have been completed. Network security protections should be carefully matched to the threats against which protection is desired. Even after eliminating unnecessary protections, the remaining customer-required network security protections can impose severe performance penalties. These penalties (further discussed below) usually involve increased communication processing for authentication or encryption, increased error rate, increased communication delay, and decreased reliability/availability. Protection measures involving encryption should be carefully engineered so as to impose the least performance, reliability, and functionality penalties, while achieving the required security protection. To study these trade-offs, a prototype encryptor/decryptor was developed. This effort demonstrated the viability of implementing certain encryption techniques in high-speed networks. The research prototype processes ATM cells in a SONET OC-3 payload. This paper describes the functionality, reliability, security, and performance design trade-offs investigated with the prototype.
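
    To make the performance penalty concrete, the sketch below measures bulk AES-CTR throughput when encrypting ATM-cell-sized (48-byte) payloads versus one large buffer, illustrating the per-cell overhead that high-speed cell encryption has to contend with. This is a generic software illustration using the Python cryptography package, not the SONET OC-3 hardware prototype described in the record.

```python
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def throughput_mbps(chunk_size, total_bytes=16 * 1024 * 1024):
    """Encrypt `total_bytes` in `chunk_size` pieces and return throughput in Mbit/s."""
    key, nonce = os.urandom(32), os.urandom(16)
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    payload = os.urandom(chunk_size)
    n_chunks = total_bytes // chunk_size

    start = time.perf_counter()
    for _ in range(n_chunks):
        encryptor.update(payload)        # per-cell (or per-buffer) encryption call
    elapsed = time.perf_counter() - start
    return (n_chunks * chunk_size * 8) / elapsed / 1e6

print("48-byte ATM cell payloads:", round(throughput_mbps(48)), "Mbit/s")
print("64 KiB buffers:           ", round(throughput_mbps(64 * 1024)), "Mbit/s")
```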

  1. End-to-end unsupervised deformable image registration with a convolutional neural network

    NARCIS (Netherlands)

    de Vos, Bob D.; Berendsen, Floris; Viergever, Max A.; Staring, Marius; Išgum, Ivana

    2017-01-01

    In this work we propose a deep learning network for deformable image registration (DIRNet). The DIRNet consists of a convolutional neural network (ConvNet) regressor, a spatial transformer, and a resampler. The ConvNet analyzes a pair of fixed and moving images and outputs parameters for the spatial
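
    A minimal PyTorch sketch of the same idea (ConvNet regressor, spatial transformer, resampler, trained unsupervised on image similarity) is given below. For brevity it regresses a global affine transform rather than the dense B-spline deformation used in DIRNet, so it is an illustration of the architecture pattern, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyRegistrationNet(nn.Module):
    """ConvNet regressor that predicts an affine transform aligning moving -> fixed."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.affine_head = nn.Linear(32, 6)
        # Start from the identity transform so training begins with "no deformation".
        self.affine_head.weight.data.zero_()
        self.affine_head.bias.data.copy_(torch.tensor([1., 0., 0., 0., 1., 0.]))

    def forward(self, fixed, moving):
        theta = self.affine_head(self.features(torch.cat([fixed, moving], dim=1)).flatten(1))
        grid = F.affine_grid(theta.view(-1, 2, 3), fixed.size(), align_corners=False)
        return F.grid_sample(moving, grid, align_corners=False)   # spatial transformer + resampler

# One unsupervised training step: no ground-truth deformation, only image dissimilarity.
net = TinyRegistrationNet()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
fixed, moving = torch.rand(4, 1, 64, 64), torch.rand(4, 1, 64, 64)
loss = F.mse_loss(net(fixed, moving), fixed)
loss.backward()
optimizer.step()
```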

  2. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    of children any process is allowed to spawn. Suppose a process with ID i and c children (c < mc) spawns a new child. Then the child’s ID will always...written in assembly; we verify all of the code, regardless of language. Categories and Subject Descriptors D.2.4 [Software Engineering]: Software... allowed to flow between various domains? If we express the policy in terms of the high-level syscall specifications, then what will this imply for the

  3. New vision solar system exploration missions study: Analysis of the use of biomodal space nuclear power systems to support outer solar system exploration missions. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-08

    This report presents the results of an analysis of the capability of nuclear bimodal systems to perform outer solar system exploration missions. Missions of interest include orbiter missions to Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto. An initial technology baseline consisting of a NEBA 10 kWe, 1000 N thrust, 850 s, 1500 kg bimodal system was selected, and its performance examined against a database for trajectories to outer solar system planetary destinations to select optimal direct and gravity assisted trajectories for study. A conceptual design for a common bimodal spacecraft capable of performing missions to all the planetary destinations was developed and made the basis of end-to-end mission designs for orbiter missions to Jupiter, Saturn, and Neptune. Concepts for microspacecraft capable of probing Jupiter's atmosphere and exploring Titan were also developed. All mission designs considered use the Atlas 2AS for launch. It is shown that the bimodal nuclear power and propulsion system offers many attractive options for planetary missions, including both conventional planetary missions in which all instruments are carried by a single primary orbiting spacecraft, and unconventional missions in which the primary spacecraft acts as a carrier, relay, and mother ship for a fleet of microspacecraft deployed at the planetary destination.

  4. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    Science.gov (United States)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  5. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    Science.gov (United States)

    Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  6. The ESA-JAXA EarthCARE clouds, aerosol and radiation explorer mission: overview and development status

    Science.gov (United States)

    Lajas, Dulce; Eisinger, M.; Wehr, T.; Koopman, Robert; Lefebvre, A.

    2015-10-01

    EarthCARE, the Earth Clouds, Aerosol and Radiation Explorer, is a joint European-Japanese mission (ESA/JAXA/NICT) which has been defined with the objective of improving the understanding of cloud-aerosol-radiation interactions so as to include them correctly and reliably in climate and numerical weather prediction models. The EarthCARE Mission has been approved for implementation as ESA's third Earth Explorer Core Mission. It is currently in its Detailed Design Phase (phase C/D) with a launch scheduled for 2018 [1]. This paper presents the EarthCARE programmatic status, the current instrument design and mission performance. The mission end-to-end simulator (E3SIM) and data processing up to level 2 (geophysical products) and related science activities will be discussed. The E3SIM supports end-to-end simulations from a scene definition to synergistic level 2 products. Level 2 retrieval algorithms can be tested in the full chain (provision of input data, algorithm performance tests by comparison of outputs with known inputs) by using a single framework with well-defined interfaces helping to harmonise algorithm developments.

  7. Sentinel-2 mission status

    Science.gov (United States)

    Hoersch, Bianca

    2017-04-01

    The SENTINEL-2 mission is the European Multispectral Imaging Mission for the Copernicus joint initiative of the European Commission (EC) and the European Space Agency (ESA). The SENTINEL-2 mission includes a 13-band multispectral optical imager with different resolutions (down to 10 m) and a swath width of 290 km. It provides very short revisit times and rapid product delivery. The mission is composed of a constellation of two satellite units, SENTINEL-2A and SENTINEL-2B, sharing the same orbital plane and featuring a short repeat cycle of 5 days at the equator optimized to mitigate the impact of clouds for science and applications. SENTINEL-2 enables exploitation for a variety of land and coastal applications such as agriculture, forestry, land cover and land cover change, urban mapping, emergency, as well as inland water, ice, glaciers and also coastal zone and closed seas applications. Following the launch of the Sentinel-2A in June 2015 and successful operations and data delivery since December 2015, the Sentinel-2B satellite is set for launch in March 2017. Full operational capacity is foreseen after the in-orbit commissioning phase of the Sentinel-2B unit in early summer 2017. The objective of the talk is to provide information about the mission status, and the way to achieve full operational capacity with 2 satellites.

  8. Landsat Data Continuity Mission (LDCM) space to ground mission data architecture

    Science.gov (United States)

    Nelson, Jack L.; Ames, J.A.; Williams, J.; Patschke, R.; Mott, C.; Joseph, J.; Garon, H.; Mah, G.

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is a scientific endeavor to extend the longest continuous multi-spectral imaging record of Earth's land surface. The observatory consists of a spacecraft bus integrated with two imaging instruments; the Operational Land Imager (OLI), built by Ball Aerospace & Technologies Corporation in Boulder, Colorado, and the Thermal Infrared Sensor (TIRS), an in-house instrument built at the Goddard Space Flight Center (GSFC). Both instruments are integrated aboard a fine-pointing, fully redundant, spacecraft bus built by Orbital Sciences Corporation, Gilbert, Arizona. The mission is scheduled for launch in January 2013. This paper will describe the innovative end-to-end approach for efficiently managing high volumes of simultaneous real-time and playback image and ancillary data from the instruments to the reception at the United States Geological Survey's (USGS) Landsat Ground Network (LGN) and International Cooperator (IC) ground stations. The core enabling capability lies within the spacecraft Command and Data Handling (C&DH) system and Radio Frequency (RF) communications system implementation. Each of these systems uniquely contributes to the efficient processing of high-speed image data (up to 265 Mbps) from each instrument, and provides virtually error-free data delivery to the ground. Onboard methods include a combination of lossless data compression, Consultative Committee for Space Data Systems (CCSDS) data formatting, a file-based/managed Solid State Recorder (SSR), and Low Density Parity Check (LDPC) forward error correction. The 440 Mbps wideband X-Band downlink uses Class 1 CCSDS File Delivery Protocol (CFDP), and an earth coverage antenna to deliver an average of 400 scenes per day to a combination of LGN and IC ground stations. This paper will also describe the integrated capabilities and processes at the LGN ground stations for data reception using adaptive filtering, and the mission operations approach for the LDCM

  9. Evaluating Mission Drift in Microfinance: Lessons for Programs with Social Mission

    Science.gov (United States)

    Hishigsuren, Gaamaa

    2007-01-01

    The article contributes to a better understanding of implications of scaling up on the social mission of microfinance programs. It proposes a methodology to measure the extent, if any, to which a microfinance program with a poverty alleviation mission drifts away from its mission during rapid scaling up and presents findings from a field research…

  10. The THEMIS Mission

    CERN Document Server

    Burch, J. L

    2009-01-01

    The THEMIS mission aims to determine the trigger and large-scale evolution of substorms by employing five identical micro-satellites which line up along the Earth's magnetotail to track the motion of particles, plasma, and waves from one point to another and for the first time, resolve space-time ambiguities in key regions of the magnetosphere on a global scale. The primary goal of THEMIS is to elucidate which magnetotail process is responsible for substorm onset at the region where substorm auroras map: (i) local disruption of the plasma sheet current (current disruption) or (ii) the interaction of the current sheet with the rapid influx of plasma emanating from reconnection. The probes also traverse the radiation belts and the dayside magnetosphere, allowing THEMIS to address additional baseline objectives. This volume describes the mission, the instrumentation, and the data derived from them.

  11. MISSION PROFILE AND DESIGN CHALLENGES FOR MARS LANDING EXPLORATION

    Directory of Open Access Journals (Sweden)

    J. Dong

    2017-07-01

    Full Text Available An orbiter and a descent module will be delivered to Mars in China's first Mars exploration mission. The descent module is composed of a landing platform and a rover. The module will be released into the atmosphere by the orbiter and make a controlled landing on the Martian surface. After landing, the rover will egress from the platform to start its science mission. The rover payloads mainly include a subsurface radar, terrain camera, multispectral camera, magnetometer and anemometer to carry out scientific investigation of the terrain, soil characteristics, material composition, magnetic field, atmosphere, etc. The landing process is divided into three phases (entry phase, parachute descent phase and powered descent phase), which are full of risks. There exist many uncertain parameters and design constraints that affect the selection of the landing sites and the phase switches (mortar-deploying the parachute, separating the heat shield and cutting off the parachute). A number of new technologies (disk-gap-band parachute, guidance and navigation, etc.) need to be developed. Mars and Earth have gravity and atmosphere conditions that are significantly different from one another. Meaningful environmental conditions cannot be recreated terrestrially on Earth, so a full-scale flight validation on Earth is difficult. Therefore end-to-end simulation and some critical subsystem tests must be considered instead. The challenges above and the corresponding design solutions are introduced in this paper, which can provide a reference for the Mars exploration mission.

  12. Mission Profile and Design Challenges for Mars Landing Exploration

    Science.gov (United States)

    Dong, J.; Sun, Z.; Rao, W.; Jia, Y.; Meng, L.; Wang, C.; Chen, B.

    2017-07-01

    An orbiter and a descent module will be delivered to Mars in China's first Mars exploration mission. The descent module is composed of a landing platform and a rover. The module will be released into the atmosphere by the orbiter and make a controlled landing on the Martian surface. After landing, the rover will egress from the platform to start its science mission. The rover payloads mainly include a subsurface radar, terrain camera, multispectral camera, magnetometer and anemometer to carry out scientific investigation of the terrain, soil characteristics, material composition, magnetic field, atmosphere, etc. The landing process is divided into three phases (entry phase, parachute descent phase and powered descent phase), which are full of risks. There exist many uncertain parameters and design constraints that affect the selection of the landing sites and the phase switches (mortar-deploying the parachute, separating the heat shield and cutting off the parachute). A number of new technologies (disk-gap-band parachute, guidance and navigation, etc.) need to be developed. Mars and Earth have gravity and atmosphere conditions that are significantly different from one another. Meaningful environmental conditions cannot be recreated terrestrially on Earth, so a full-scale flight validation on Earth is difficult. Therefore end-to-end simulation and some critical subsystem tests must be considered instead. The challenges above and the corresponding design solutions are introduced in this paper, which can provide a reference for the Mars exploration mission.

  13. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.
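
    A telecom trade study of the kind such a tool supports ultimately reduces to link-budget arithmetic. The sketch below computes an Eb/N0 margin from free-space path loss and receiver G/T; every parameter value is a made-up example for illustration, and this is not MMTAT's actual interface or model.

```python
import math

def link_margin_db(freq_hz, range_m, eirp_dbw, gt_dbk, data_rate_bps,
                   required_ebn0_db, losses_db=2.0):
    """Simple downlink budget: Eb/N0 margin in dB."""
    k_dbw = -228.6                                                 # Boltzmann constant, dBW/K/Hz
    path_loss_db = 20 * math.log10(4 * math.pi * range_m * freq_hz / 3e8)
    cn0_dbhz = eirp_dbw - path_loss_db + gt_dbk - k_dbw - losses_db
    ebn0_db = cn0_dbhz - 10 * math.log10(data_rate_bps)
    return ebn0_db - required_ebn0_db

# Illustrative numbers only: X-band downlink from ~2 AU to a 34 m ground station.
margin = link_margin_db(freq_hz=8.4e9, range_m=3e11, eirp_dbw=60.0,
                        gt_dbk=50.0, data_rate_bps=10e3, required_ebn0_db=2.5)
print(round(margin, 1), "dB margin")
```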

  14. Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool

    Science.gov (United States)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  15. Towards the Development of a Global, Satellite-based, Terrestrial Snow Mission Planning Tool

    Science.gov (United States)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASAs Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical orbital configuration.One objective the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include:1. What observational records are needed (in space and time) to maximize terrestrial snow experimental utility?2. How might observations be coordinated (in space and time) to maximize utility? 3. What is the additional utility associated with an additional observation?4. How can future mission costs being minimized while ensuring Science requirements are fulfilled?

  16. Dividing the Concentrator Target From the Genesis Mission

    Science.gov (United States)

    Lauer, H. V., Jr.; Burkett, P. J.; Clemett, S. J.; Gonzales, C. P.; Nakamura-Messenger, K.; Rodriquez, M. C.; See, T. H.; Sutter, B.

    2014-01-01

    for wafer cool down from any possible heating via the laser. The ablated material that "stuck" in the "scribe-cut" was removed from the "cut" using an ultrasonic micro-tool. After all the ablated silicon was removed from the wafer, the wafer was repositioned in exactly the same orientation on the laser stage. The laser was focused using the bottom of the wafer channel, and the 31-line scribing pattern described above was reprogrammed using the Z position of the groove bottom as the starting Z value instead of the top wafer surface, which was used previously. Upon completion of the second set of scribes, the ultrasonic micro-tool was again used to clean out the cut. The wafer was remounted on the stage in exactly the same orientation as before. The laser was again focused on the bottom of the groove. This time, however, the laser was programmed to scribe only one line down the exact center of the channel. The final scribe line consisted of 100 passes with a Z advance of 5 microns per pass and with the laser power set at 0.5 watts. As mentioned above, the final cutting plan was practiced in two end-to-end trials using non-flight, triangular-shaped silicon wafers similar in size and orientation to the actual DOS 60000 target sample. The actual scribing of the triangular-shaped wafers required scribing two lines and cleaving (i.e. scribe-cleave, then scribe-cleave) to obtain the piece requested for allocation. Early in December 2012, after many months of experiments and practicing and perfecting the techniques and procedures, the team successfully subdivided the Genesis DOS 60000 target sample, one of the most scientifically important samples from the Genesis mission (figure 2). On December 17, 2012, the allocated piece of concentrator target sample was delivered to the requesting principal investigator. The cutting plan developed for the subdivision of this sample will be used as the model for subdividing future requested Genesis flight wafers (appropriately modified for

  17. Aero-Assisted Spacecraft Missions Using Hypersonic Waverider Aeroshells

    Science.gov (United States)

    Knittel, Jeremy

    optimized outcome. In examining an aero-capture of Mars, it was found that with a lifting body, the increased maneuverability can allow completion of multiple mission objectives along with the aero-capture, such as atmospheric profiling or up to 80 degrees of orbital plane change. Completing a combined orbital plane change and aero-capture might save as much as 4.5 km/s of velocity increment while increasing the feasible entry corridor by an order of magnitude. Analyzing a higher energy mission type, a database of maximum aero-gravity assist performance is developed at Mars, Earth and Venus. Finally, a methodology is presented for designing end-to-end interplanetary missions using aero-gravity assists. As a means of demonstrating the method, promising trajectories are propagated which reduce the time of flight of an interstellar probe mission by up to 50%.

  18. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    Science.gov (United States)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine
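
    The wrapping pattern described above can be sketched in a few lines: a legacy modeling step is exposed behind a small HTTP endpoint so that any workflow engine can chain it with the next component. The use of Flask and the endpoint, payload, and function names here are illustrative assumptions, not the OSSE's actual interfaces.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def legacy_radiative_transfer(reflectance):
    """Stand-in for a wrapped legacy component (e.g. a MATLAB or FORTRAN model)."""
    return [0.8 * r + 0.01 for r in reflectance]   # placeholder physics

@app.route("/osse/radiative_transfer", methods=["POST"])
def radiative_transfer_service():
    # An upstream workflow component POSTs a spectrum; the transformed spectrum is
    # returned so the next service in the chain (e.g. detection/retrieval) can consume it.
    spectrum = request.get_json()["reflectance"]
    return jsonify({"radiance": legacy_radiative_transfer(spectrum)})

if __name__ == "__main__":
    app.run(port=8080)
```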

  19. Social Tagging of Mission Data

    Science.gov (United States)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.

    2010-01-01

    Mars missions will generate a large amount of data in various forms, such as daily plans, images, and scientific information. Often, there is a semantic linkage between images that cannot be captured automatically. Software is needed that will provide a method for creating arbitrary tags for this mission data so that items with a similar tag can be related to each other. The tags should be visible and searchable for all users. A new routine was written to offer a new and more flexible search option over previous applications. This software allows users of the MSLICE program to apply any number of arbitrary tags to a piece of mission data through an MSLICE search interface. The application of tags creates relationships between data that did not previously exist. These tags can be easily removed and changed, and contain enough flexibility to be specifically configured for any mission. This gives users the ability to quickly recall or draw attention to particular pieces of mission data, for example: Give a semantic and meaningful description to mission data; for example, tag all images with a rock in them with the tag "rock." Rapidly recall specific and useful pieces of data; for example, tag a plan as "driving template." Call specific data to a user's attention; for example, tag a plan as "for:User." This software is part of the MSLICE release, which was written in Java. It will run on any current Windows, Macintosh, or Linux system.
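
    A minimal sketch of such a tagging layer is shown below: arbitrary string tags are attached to data-item identifiers and are searchable by tag. The class and method names are hypothetical illustrations, not MSLICE's actual API (which is written in Java).

```python
from collections import defaultdict

class TagStore:
    """Toy tag index: arbitrary tags attached to mission-data item IDs."""
    def __init__(self):
        self._items_by_tag = defaultdict(set)
        self._tags_by_item = defaultdict(set)

    def tag(self, item_id, label):
        self._items_by_tag[label].add(item_id)
        self._tags_by_item[item_id].add(label)

    def untag(self, item_id, label):
        self._items_by_tag[label].discard(item_id)
        self._tags_by_item[item_id].discard(label)

    def search(self, label):
        return sorted(self._items_by_tag[label])

store = TagStore()
store.tag("image_0042", "rock")              # semantic description of an image
store.tag("plan_sol12", "driving template")  # rapid recall of a useful plan
store.tag("plan_sol12", "for:User")          # call data to a user's attention
print(store.search("rock"))                  # -> ['image_0042']
```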

  20. Towards a Multi-Mission, Airborne Science Data System Environment

    Science.gov (United States)

    Crichton, D. J.; Hardman, S.; Law, E.; Freeborn, D.; Kay-Im, E.; Lau, G.; Oswald, J.

    2011-12-01

    NASA earth science instruments are increasingly relying on airborne missions. However, traditionally, there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructures for the science data system. Typically there is little software reuse and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is being able to provide an agile infrastructure that is architected to allow for a variety of configurations of the infrastructure from locally installed compute and storage services to provisioning those services via the "cloud" from cloud computer vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on the Apache's Object Oriented Data Technology (OODT) suite of components which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems. In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support the data processing and data management needs

  1. NASA Airborne Missions in Support of Coastal Ecosystems and Water Quality Research

    Science.gov (United States)

    Guild, L. S.; Hooker, S. B.; Kudela, R. M.; Russell, P. B.; Morrow, J. H.; Palacios, S. L.; Myers, J. S.; Livingston, J. M.; Kacenelenbogen, M. S.; Knobelspiesse, K. D.; Redemann, J.; Clinton, N. E.; Torres-Perez, J. L.; Negrey, K.

    2016-02-01

    Worldwide, coastal marine ecosystems are exposed to land-based sources of pollution and sedimentation from anthropogenic activities including agriculture and coastal development. Ocean color products from satellite sensors provide information on chlorophyll (phytoplankton pigment), sediments, and colored dissolved organic material. Further, ship-based in-water measurements and emerging airborne measurements provide in situ data for the vicarious calibration of current and next generation satellite ocean color sensors and to validate the algorithms that use the remotely sensed observations. Recent NASA airborne missions over Monterey Bay, CA, have demonstrated novel above- and in-water measurement capabilities supporting a combined airborne sensor approach (imaging spectrometer, microradiometers, and a sun photometer). The results characterize coastal atmospheric and aquatic properties of seasonal algal blooms through an end-to-end assessment of image acquisition, atmospheric correction, algorithm application, plus sea-truth observations from state-of-the-art instrument systems.

  2. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    Science.gov (United States)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and share this information through the Internet connection. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of 'citizens' observatories', the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used for example as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn serves also as the communication gateway towards the remote server using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context. This, together with an accelerometer, enables the user to better estimate one's exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol) which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
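
    The smartphone-to-server leg described above (a reading enriched with GPS location and a timestamp, forwarded as JSON over HTTP POST) might look roughly like the sketch below. The URL, field names, and JSON layout are assumptions made for illustration; the project's actual LCSP messages and server schema are not given in this record.

```python
import time
import requests

def forward_measurement(no2_ppb, o3_ppb, co_ppm, lat, lon,
                        server_url="https://example.org/citi-sense/observations"):
    """Enrich a sensor reading with location and timestamp, then POST it as JSON."""
    payload = {
        "sensor_unit": "VESNA-AQ-demo",
        "timestamp": int(time.time()),
        "location": {"lat": lat, "lon": lon},
        "readings": {"NO2_ppb": no2_ppb, "O3_ppb": o3_ppb, "CO_ppm": co_ppm},
    }
    response = requests.post(server_url, json=payload, timeout=10)
    response.raise_for_status()
    return response.status_code

# Example call with made-up values (coordinates near the Ljubljana city centre):
# forward_measurement(18.2, 31.5, 0.4, lat=46.0569, lon=14.5058)
```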

  3. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available MontyLingua, an integral part of ConceptNet which is currently the largest commonsense knowledge base, is an English text processor developed using the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper aims to review the use of and roles played by MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed a diversified use of MontyLingua in many different areas, both generic and domain-specific. Although use of the text summarizing component was not observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.

  4. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    Science.gov (United States)

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  5. Building the tree of life from scratch: an end-to-end work flow for phylogenomic studies

    Science.gov (United States)

    Whole genome sequences are rich sources of information about organisms that are superbly useful for addressing a wide variety of evolutionary questions. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understan...

  6. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    Science.gov (United States)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing the billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from its current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover high-speed communications that are meant to enable the autonomous vehicles require ultra reliable and low latency paths. This research explores security within the proposed new architectures and the cross interconnection of the highly protected assets with low cost/low security components forming the overarching 5th generation wireless infrastructure.

  7. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry

  8. End-to-end encryption in on-line payment systems: The industry reluctance and the role of laws

    OpenAIRE

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry gives its best endeavors to strengthen the technical and processing factors, while the government has been called upon to improve the legal factors. However, a breach of consumer's data and financial ...

  9. Themis-ml: A Fairness-aware Machine Learning Interface for End-to-end Discrimination Discovery and Mitigation

    OpenAIRE

    Bantilan, Niels

    2017-01-01

    As more industries integrate machine learning into socially sensitive decision processes like hiring, loan-approval, and parole-granting, we are at risk of perpetuating historical and contemporary socioeconomic disparities. This is a critical problem because on the one hand, organizations who use but do not understand the discriminatory potential of such systems will facilitate the widening of social disparities under the assumption that algorithms are categorically objective. On the other ha...

  10. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    Deployment of environmental sensor assemblies based on cheap platforms such as Raspberry Pi and Arduino has gained much attention over the past few years. While they are more attractive due to their ability to be controlled with a few programming language choices, the configuration task can become quite complex due to the need of having to learn several different proprietary data formats and protocols which constitute a bottleneck for the expansion of sensor networks. In response to this rising complexity the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS) which enables transducers to self-identify, self-describe, self-calibrate, to exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost ANY sensor and sensor platforms for sensor programming purposes, automatic calibration of sensor data, incorporation of back-end demands on data management in TEDS for automatic standard-based data storage, search and discovery purposes. These features are paramount to make data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool, namely HydroUnits, for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, that will be hosted by a MongoDB Database Server which offers more convenience for our application. We are also developing a data which will be paired with the data autoloading capability of Django and a TEDS processing script to populate the database with the incoming data. The Python WaterOneFlow Web Services developed by the Texas Water Development Board will be used to publish the data. The software suite is being tested on the Raspberry Pi as the end node and a laptop PC as the base station in a wireless setting.
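
    To make the TEDS-plus-units idea concrete, the sketch below pairs a minimal TEDS-like metadata record with an on-the-fly unit conversion helper. The field names and conversion table are illustrative assumptions; they do not follow the IEEE 1451.0 binary TEDS layout or the actual HydroUnits implementation.

```python
# Minimal TEDS-like record: enough metadata for a client to self-describe one channel.
teds = {
    "manufacturer": "demo-vendor",
    "model": "water-level-sensor",
    "channel": {"quantity": "stage", "unit": "m", "min": 0.0, "max": 5.0},
    "calibration": {"slope": 0.001, "offset": -0.02},   # raw counts -> metres
}

# On-the-fly linear unit conversions via scale factors to the SI base unit.
_TO_SI = {"m": 1.0, "cm": 0.01, "ft": 0.3048, "in": 0.0254}

def convert(values, from_unit, to_unit):
    """Convert a time series between length units via the SI base unit."""
    factor = _TO_SI[from_unit] / _TO_SI[to_unit]
    return [v * factor for v in values]

raw_counts = [1020, 1185, 1340]
cal = teds["calibration"]
stage_m = [c * cal["slope"] + cal["offset"] for c in raw_counts]   # apply TEDS calibration
print(convert(stage_m, teds["channel"]["unit"], "ft"))             # serve the data in feet on request
```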

  11. End-To-End Solution for Integrated Workload and Data Management using glideinWMS and Globus Online

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the glideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of Center for Enabling Distributed Petascale Scienc...

  12. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B [Loma Linda University Medical Center, Loma Linda, CA (United States)]

    2014-06-01

    Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc. WI, USA) inserted with GaFchromic™ film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using robotic couch with image guidance and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and relative dose distributions compared to those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the GaFchromic films irradiated at gantry 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm, 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, localization accuracy of the image guidance system, and fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal end error due to bolus and CT density, in addition to localization error. The planned dose distribution was well matched (>90% passing) to the measured values using 2%/2mm criteria. Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed-thickness compensating boluses.
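
    Assuming the three reported shifts are independent and orthogonal, the quoted 1.4 mm total deviation from the isocenter is consistent with their quadrature sum (a reading of the reported numbers, not a statement of the authors' method):

\[
\Delta r = \sqrt{0.5^2 + 0.9^2 + 1.0^2}\ \text{mm} = \sqrt{2.06}\ \text{mm} \approx 1.4\ \text{mm}
\]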

  13. Bridging automatic speech recognition and psycholinguistics: Extending Shortlist to an end-to-end model of human speech recognition (L)

    Science.gov (United States)

    Scharenborg, Odette; ten Bosch, Louis; Boves, Lou; Norris, Dennis

    2003-12-01

    This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz., Shortlist [Norris, Cognition 52, 189-234 (1994)]. Experiments based on "real-life" speech highlight critical limitations posed by some of the simplifying assumptions made in models of human speech recognition. These limitations could be overcome by avoiding hard phone decisions at the output side of the APR, and by using a match between the input and the internal lexicon that flexibly copes with deviations from canonical phonemic representations.

  14. An end-to-end model of the Earth Radiation Budget Experiment (ERBE) Earth-viewing nonscanning radiometric channels

    OpenAIRE

    Priestly, Kory James

    1993-01-01

    The Earth Radiation Budget Experiment (ERBE) active-cavity radiometers are used to measure the incoming solar, reflected solar, and emitted longwave radiation from the Earth and its atmosphere. The radiometers are carried by the National Aeronautics and Space Administration's Earth Radiation Budget Satellite (ERBS) and the National Oceanic and Atmospheric Administration's NOAA-9 and NOAA-10 spacecraft. Four Earth-viewing nonscanning active-cavity radiometers are carried by e...

  15. Bridging Automatic Speech Recognition and Psycholinguistics: Extending Shortlist to an End-to-End Model of Human Speech Recognition

    NARCIS (Netherlands)

    Scharenborg, O.E.; Bosch, L.F.M. ten; Boves, L.W.J.; Norris, D.

    2003-01-01

    This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz. Shortlist (Norris, 1994). Experiments based on 'real-life' speech highlight

  16. Results from the NASA Spacecraft Fault Management Workshop: Cost Drivers for Deep Space Missions

    Science.gov (United States)

    Newhouse, Marilyn E.; McDougal, John; Barley, Bryan; Stephens, Karen; Fesq, Lorraine M.

    2010-01-01

    Fault Management, the detection of and response to in-flight anomalies, is a critical aspect of deep-space missions. Fault management capabilities are commonly distributed across flight and ground subsystems, impacting hardware, software, and mission operations designs. The National Aeronautics and Space Administration (NASA) Discovery & New Frontiers (D&NF) Program Office at Marshall Space Flight Center (MSFC) recently studied cost overruns and schedule delays for five missions. The goal was to identify the underlying causes for the overruns and delays, and to develop practical mitigations to assist the D&NF projects in identifying potential risks and controlling the associated impacts to proposed mission costs and schedules. The study found that four out of the five missions studied had significant overruns due to underestimating the complexity and support requirements for fault management. As a result of this and other recent experiences, the NASA Science Mission Directorate (SMD) Planetary Science Division (PSD) commissioned a workshop to bring together invited participants across government, industry, and academia to assess the state of the art in fault management practice and research, identify current and potential issues, and make recommendations for addressing these issues. The workshop was held in New Orleans in April of 2008. The workshop concluded that fault management is not being limited by technology, but rather by a lack of emphasis and discipline in both the engineering and programmatic dimensions. Some of the areas cited in the findings include different, conflicting, and changing institutional goals and risk postures; unclear ownership of end-to-end fault management engineering; inadequate understanding of the impact of mission-level requirements on fault management complexity; and practices, processes, and tools that have not kept pace with the increasing complexity of mission requirements and spacecraft systems. This paper summarizes the

  17. Mission Exploitation Platform PROBA-V

    Science.gov (United States)

    Goor, Erwin

    2016-04-01

    VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data. From November 2015 an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g.:
    - A time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest.
    - Full-resolution viewing services for the complete data archive.
    - On-demand processing chains, e.g. for the calculation of N-daily composites.
    - A Virtual Machine with access to the data archive and tools to work with these data, e.g. various toolboxes and support for R and Python.
    After an initial release in January 2016, a research platform will gradually be deployed allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers a lot of technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data and hence open the large time series to a larger

  18. Insight into the Physical and Dynamical Processes that Control Rapid Increases in Total Flash Rate

    Science.gov (United States)

    Schultz, Christopher J.; Carey, Lawrence D.; Schultz, Elise V.; Blakeslee, Richard J.; Goodman, Steven J.

    2015-01-01

    Rapid increases in total lightning (also termed "lightning jumps") have been observed for many decades. Lightning jumps have been well correlated to severe and hazardous weather occurrence. The main focus of lightning jump work has been on the development of lightning algorithms to be used in real-time assessment of storm intensity. However, in these studies it is typically assumed that the updraft "increases" without direct measurements of the vertical motion, or specification of which updraft characteristic actually increases (e.g., average speed, maximum speed, or convective updraft volume). Therefore, an end-to-end physical and dynamical basis for coupling rapid increases in total flash rate to increases in updraft speed and volume must be understood in order to ultimately relate lightning occurrence to severe storm metrics. Herein, we use polarimetric, multi-Doppler, and lightning mapping array measurements to provide physical context as to why rapid increases in total lightning are closely tied to severe and hazardous weather.
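
    For readers unfamiliar with how a "lightning jump" is typically operationalized, the sketch below shows a 2-sigma-style test: the latest increase in total flash rate is flagged when it exceeds twice the standard deviation of the recent rate changes. This is a generic illustration, not the authors' operational algorithm; the flash-rate series and the 2-minute cadence are hypothetical.

```python
# Minimal sketch of a "2-sigma"-style lightning jump test: flag a jump when
# the most recent increase in total flash rate exceeds twice the standard
# deviation of the recent rate changes. Data and cadence are hypothetical.
import numpy as np

def lightning_jump(flash_rates, n_history=5, threshold_sigma=2.0):
    """flash_rates: total flash rate per 2-min period (flashes/min).
    Returns True if the most recent rate change qualifies as a jump."""
    dfrdt = np.diff(flash_rates)                  # rate of change per period
    if len(dfrdt) < n_history + 1:
        return False                              # not enough history yet
    history, current = dfrdt[-(n_history + 1):-1], dfrdt[-1]
    sigma = history.std(ddof=1)
    return current > 0 and current > threshold_sigma * sigma

rates = np.array([4.0, 5.0, 6.0, 5.0, 7.0, 8.0, 22.0])   # sudden increase at the end
print(lightning_jump(rates))                              # -> True
```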

  19. A Neptune Orbiter Mission

    Science.gov (United States)

    Wallace, R. A.; Spilker, T. R.

    1998-01-01

    This paper describes the results of new analyses and mission/system designs for a low cost Neptune Orbiter mission. Science and measurement objectives, instrumentation, and mission/system design options are described and reflect an aggressive approach to the application of new advanced technologies expected to be available and developed over the next five to ten years.

  20. Computer graphics aid mission operations. [NASA missions

    Science.gov (United States)

    Jeletic, James F.

    1990-01-01

    The application of computer graphics techniques in NASA space missions is reviewed. Telemetric monitoring of the Space Shuttle and its components is discussed, noting the use of computer graphics for real-time visualization problems in the retrieval and repair of the Solar Maximum Mission. The use of the world map display for determining a spacecraft's location above the earth and the problem of verifying the relative position and orientation of spacecraft to celestial bodies are examined. The Flight Dynamics/STS Three-dimensional Monitoring System and the Trajectory Computations and Orbital Products System world map display are described, emphasizing Space Shuttle applications. Also, consideration is given to the development of monitoring systems such as the Shuttle Payloads Mission Monitoring System and the Attitude Heads-Up Display and the use of the NASA-Goddard Two-dimensional Graphics Monitoring System during Shuttle missions and to support the Hubble Space Telescope.

  1. Reconnaissance mission planning

    Science.gov (United States)

    Fishell, Wallace G.; Fox, Alex J.

    1991-12-01

    As ATARS evolves along with its various applications, as Recce UAVs evolve to mix with manned systems, and as older systems evolve through upgrades, so should their mission planning tools evolve. To simply state that today's tactical mission planning systems will be upgraded with provisions for Reconnaissance Mission Planning completely eliminates the natural learning curve required to mature the requirements and specifications for reconnaissance planning capabilities. This paper presents MSS II lessons learned at Operation Desert Storm and briefly looks at some of the required Reconnaissance Mission Planning functions attainable through the adaptation of existing mission planning products.

  2. Mobile Ad Hoc Networks in Bandwidth-Demanding Mission-Critical Applications: Practical Implementation Insights

    KAUST Repository

    Bader, Ahmed

    2016-09-28

    There has recently been a growing trend of using live video feeds in mission-critical applications. Real-time video streaming from front-end personnel or mobile agents is believed to substantially improve situational awareness in mission-critical operations such as disaster relief, law enforcement, and emergency response. Mobile Ad Hoc Networks (MANETs) are a natural contender in such contexts. However, classical MANET routing schemes fall short in terms of scalability, bandwidth and latency, all three metrics being quite essential for mission-critical applications. As such, autonomous cooperative routing (ACR) has gained traction as the most viable MANET proposition. Nonetheless, ACR is also associated with a few implementation challenges which, if left unaddressed, would render ACR practically useless. In this paper, efficient and low-complexity remedies to those issues are presented, analyzed, and validated. The validation is based on field experiments carried out using software-defined radio (SDR) platforms. Compared to classical MANET routing schemes, ACR was shown to offer up to 2X better throughput and more than a 4X reduction in end-to-end latency, while meeting a given target of transport rate normalized to energy consumption.

  3. The STEREO Mission

    CERN Document Server

    2008-01-01

    The STEREO mission uses twin heliospheric orbiters to track solar disturbances from their initiation to 1 AU. This book documents the mission, its objectives, the spacecraft that execute it and the instruments that provide the measurements, both remote sensing and in situ. This mission promises to unlock many of the mysteries of how the Sun produces what has come to be known as space weather.

  4. Corot, une mission bien remplie

    Science.gov (United States)

    Baglin, Annie; Belkacem, Kevin; Chaintreuil, Sylviane; Deleuil, Magali; Lam-Trong, Thien

    2017-03-01

    The CoRoT mission, developed since 1993, was dedicated to ultra-high-precision measurements of the variations of stellar fluxes over long and continuous durations. Its two major objectives were the detection of stellar oscillations and of extrasolar planets. As a pioneer mission, it had to invent several methods and to build the successive steps of the data treatment; these are briefly described. The low-cost CNES "Petites Missions" programme imposed severe constraints on the instrument concept (organisation and management, choice of the detectors and of the orbit, reduction of all stray light), which have been and will remain guides for new-generation projects. Some scientific highlights are then presented for both programmes, for instance the use of the seismic tool as an indicator of the structure and evolution of the Galaxy, the first detection of a super-Earth, and the first precise characterisation of a brown dwarf. CoRoT has opened several avenues in instrumentation as well as in science. It is shown how some aspects of this heritage have been used in the design and development of its two major heirs: CHEOPS, to be launched in 2018, and the more ambitious PLATO, to be launched in 2025.

  5. Juno Mission Simulation

    Science.gov (United States)

    Lee, Meemong; Weidner, Richard J.

    2008-01-01

    The Juno spacecraft is planned to launch in August of 2012 and would arrive at Jupiter four years later. The spacecraft would spend more than one year orbiting the planet and investigating the existence of an ice-rock core; determining the amount of global water and ammonia present in the atmosphere; studying convection and deep-wind profiles in the atmosphere; investigating the origin of the Jovian magnetic field; and exploring the polar magnetosphere. Juno mission management is responsible for mission and navigation design, mission operation planning, and ground-data-system development. In order to ensure successful mission management from initial checkout to final de-orbit, it is critical to share a common vision of the entire mission operation phases with the rest of the project teams. Two major challenges are 1) how to develop a shared vision that can be appreciated by all of the project teams of diverse disciplines and expertise, and 2) how to continuously evolve a shared vision as the project lifecycle progresses from formulation phase to operation phase. The Juno mission simulation team addresses these challenges by developing agile and progressive mission models, operation simulations, and real-time visualization products. This paper presents mission simulation visualization network (MSVN) technology that has enabled a comprehensive mission simulation suite (MSVN-Juno) for the Juno project.

  6. The EXIST Mission Concept Study

    Science.gov (United States)

    Fishman, Gerald J.; Grindlay, J.; Hong, J.

    2008-01-01

    EXIST is a mission designed to find and study black holes (BHs) over a wide range of environments and masses, including: 1) BHs accreting from binary companions or dense molecular clouds throughout our Galaxy and the Local Group, 2) supermassive black holes (SMBHs) lying dormant in galaxies that reveal their existence by disrupting passing stars, 3) SMBHs that are hidden from our view at lower energies due to obscuration by the gas that they accrete, and 4) the birth of stellar-mass BHs, which is accompanied by long cosmic gamma-ray bursts (GRBs) that are seen several times a day and may be associated with the earliest stars to form in the Universe. EXIST will provide an order of magnitude increase in sensitivity and angular resolution as well as greater spectral resolution and bandwidth compared with earlier hard X-ray survey telescopes. With an onboard optical-infrared (IR) telescope, EXIST will measure the spectra and redshifts of GRBs and establish their utility as cosmological probes of the highest-z universe and the epoch of reionization. The mission would retain its primary goal of being the Black Hole Finder Probe in the Beyond Einstein Program. However, the new design for EXIST proposed to be studied here represents a significant advance from its previous incarnation as presented to BEPAC. The mission is now less than half the total mass, would be launched on the smallest EELV available (Atlas V-401) for a Medium Class mission, and most importantly includes a two-telescope complement that is ideally suited for the study of both obscured and very distant BHs. EXIST retains its very wide-field hard X-ray imaging High Energy Telescope (HET) as the primary instrument, now with improved angular and spectral resolution, and in a more compact payload that allows occasional rapid slews for immediate optical/IR imaging and spectra of GRBs and AGN as well as enhanced hard X-ray spectra and timing with pointed observations. The mission would conduct a 2-year full-sky survey in

  7. Proba-V Mission Exploitation Platform

    Science.gov (United States)

    Goor, Erwin; Dries, Jeroen

    2017-04-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015 an operational Proba-V MEP environment, as an ESA operation service, has gradually been deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016 the platform is operational and several applications have already been released to the users, e.g.:
    - A time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest.
    - Full-resolution viewing services for the complete data archive.
    - On-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites.
    - Virtual Machines with access to the data archive and tools to work with these data, e.g. various toolboxes (GDAL, QGIS, GrassGIS, SNAP toolbox, …) and support for R and Python. This allows users to immediately work with the data without having to install tools or download data, as well as to design, debug and test applications on the platform.
    - A prototype of Jupyter Notebooks, with some examples worked out to show the potential of the data.
    Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel the platform is further improved and extended. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will
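
    As a small illustration of the kind of processing chain mentioned above, the sketch below computes N-daily composites from a stack of daily rasters. It runs locally on hypothetical NumPy arrays and assumes a maximum-value compositing rule; it does not reproduce the operational PROBA-V chains or their Hadoop/Spark implementation.

```python
# Minimal local sketch of an N-daily compositing step of the kind run as an
# on-demand chain on the MEP backend. Daily rasters are hypothetical NumPy
# arrays; maximum-value compositing is an assumed rule.
import numpy as np

def n_daily_composites(daily_stack, n=10):
    """daily_stack: (days, rows, cols) array, NaN = cloud/no data.
    Returns one maximum-value composite per consecutive n-day window."""
    composites = []
    for start in range(0, daily_stack.shape[0], n):
        window = daily_stack[start:start + n]
        composites.append(np.nanmax(window, axis=0))
    return np.stack(composites)

# Hypothetical month of daily 100x100 NDVI rasters with random cloud gaps.
rng = np.random.default_rng(0)
ndvi = rng.uniform(0.0, 0.9, size=(30, 100, 100))
ndvi[rng.random(ndvi.shape) < 0.2] = np.nan      # ~20% masked pixels
dekads = n_daily_composites(ndvi, n=10)
print(dekads.shape)                              # -> (3, 100, 100)
```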

  8. Advance Approach to Concept and Design Studies for Space Missions

    Science.gov (United States)

    Deutsch, M.; Nichols, J.

    1999-01-01

    Recent automated and advanced techniques developed at JPL have created a streamlined and fast-track approach to initial mission conceptualization and system architecture design, answering the need for rapid turnaround of trade studies for potential proposers, as well as mission and instrument study groups.

  9. Bering Mission Navigation Method

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Jørgensen, Peter Siegbjørn

    2003-01-01

    "Bering", after the name of the famous Danish explorer, is a near Earth object (NEO) and main belt asteroids mapping mission envisaged by a consortium of Danish universities and research institutes. To achieve the ambitious goals set forth by this mission, while containing the costs and risks...

  10. The Pioneer Venus Missions.

    Science.gov (United States)

    National Aeronautics and Space Administration, Mountain View, CA. Ames Research Center.

    This document provides detailed information on the atmosphere and weather of Venus. This pamphlet describes the technological hardware including the probes that enter the Venusian atmosphere, the orbiter and the launch vehicle. Information is provided in lay terms on the mission profile, including details of events from launch to mission end. The…

  11. KEEL for Mission Planning

    Science.gov (United States)

    2016-10-06

    ... if the Mission Planning Software is supporting human planners. ... KEEL Operational Policy ... cognitive technology for application in automotive, industrial automation, medical, military, governmental, enterprise software and electronic gaming ... KEEL® Technology in support of Mission Planning and Execution delivering Adaptive

  12. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    Science.gov (United States)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  13. GPS Navigation for the Magnetospheric Multi-Scale Mission

    Science.gov (United States)

    Bamford, William; Mitchell, Jason; Southward, Michael; Baldwin, Philip; Winternitz, Luke; Heckler, Gregory; Kurichh, Rishi; Sirotzky, Steve

    2009-01-01

    utilizing a TDMA schedule to distribute a science-quality message to all constellation members every ten seconds. Additionally, the system generates one-way range measurements between formation members, which are used as input to the Kalman filter. In preparation for the MMS Preliminary Design Review (PDR), the Navigator was required to pass a series of Technology Readiness Level (TRL) tests to earn the necessary TRL-6 classification. The TRL-6 level is achieved by demonstrating a prototype unit in a relevant end-to-end environment. The IRAS unit was able to meet all requirements during the testing phase, and has thus been TRL-6 qualified.
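
    The role of the one-way range measurements in the filter can be illustrated with a minimal extended-Kalman-filter measurement update on a relative position/velocity state. This is a generic sketch, not the flight Navigator software; the state, covariance, and noise values are hypothetical.

```python
# Minimal sketch (not the flight Navigator filter): an extended Kalman filter
# measurement update using a one-way range between two formation members.
# State is the 3-D relative position and velocity of spacecraft B w.r.t. A.
import numpy as np

def range_update(x, P, rho_meas, sigma_rho=10.0):
    """x: 6-vector [rx, ry, rz, vx, vy, vz] (m, m/s); P: 6x6 covariance.
    rho_meas: measured inter-spacecraft range in metres."""
    pos = x[:3]
    rho_pred = np.linalg.norm(pos)
    H = np.zeros((1, 6))
    H[0, :3] = pos / rho_pred                  # d(range)/d(relative position)
    R = np.array([[sigma_rho ** 2]])
    y = np.array([rho_meas - rho_pred])        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x + (K @ y).ravel()
    P_new = (np.eye(6) - K @ H) @ P
    return x_new, P_new

# Hypothetical prior state and covariance, and one range measurement.
x0 = np.array([12000.0, -3000.0, 500.0, 1.0, 0.2, -0.1])
P0 = np.diag([200.0 ** 2] * 3 + [0.5 ** 2] * 3)
x1, P1 = range_update(x0, P0, rho_meas=12450.0)
print(np.linalg.norm(x1[:3]))                  # estimate pulled toward the measured range
```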

  14. The Science and Technology of Future Space Missions

    Science.gov (United States)

    Bonati, A.; Fusi, R.; Longoni, F.

    1999-12-01

    processing. Powerful computers with customized architectures are designed and developed. High-speed intercommunication networks are studied and tested. In parallel to the hardware research activities, software development is undertaken for several purposes: digital and video compression algorithms, payload and spacecraft control and diagnostics, scientific processing algorithms, etc. In addition, embedded Java virtual machines are studied for tele-science applications (a direct link between the scientist's console and the scientific payload). At the system engineering level, the demand for spacecraft autonomy increases for planetology missions: reliable intelligent systems that can operate for long periods without human intervention from the ground are requested and investigated. A technologically challenging but less glamorous area of development is represented by the laboratory equipment for end-to-end testing (on ground) of payload instruments. The main fields are cryogenics, laser and X-ray optics, microwave radiometry, UV and infrared testing systems.

  15. Advanced automation in space shuttle mission control

    Science.gov (United States)

    Heindel, Troy A.; Rasmussen, Arthur N.; Mcfarland, Robert Z.

    1991-01-01

    The Real Time Data System (RTDS) Project was undertaken in 1987 to introduce new concepts and technologies for advanced automation into the Mission Control Center environment at NASA's Johnson Space Center. The project's emphasis is on producing advanced near-operational prototype systems that are developed using a rapid, interactive method and are used by flight controllers during actual Shuttle missions. In most cases the prototype applications have been of such quality and utility that they have been converted to production status. A key ingredient has been an integrated team of software engineers and flight controllers working together to quickly evolve the demonstration systems.

  16. Cost-Effective Telemetry and Command Ground Systems Automation Strategy for the Soil Moisture Active Passive (SMAP) Mission

    Science.gov (United States)

    Choi, Joshua S.; Sanders, Antonio L.

    2012-01-01

    Soil Moisture Active Passive (SMAP) is an Earth-orbiting, remote-sensing NASA mission slated for launch in 2014. The ground data system (GDS) being developed for SMAP is composed of many heterogeneous subsystems, ranging from those that support planning and sequencing to those used for real-time operations, and even further to those that enable science data exchange. A full end-to-end automation of the GDS may result in cost savings during mission operations, but it would require a significant upfront investment to develop such comprehensive automation. As demonstrated by the Jason-1 and Wide-field Infrared Survey Explorer (WISE) missions, a measure of "lights-out" automation for routine, orbital pass ground operations can still reduce mission cost through smaller staffing of operators and limited work hours. The challenge, then, for the SMAP GDS engineering team is to formulate an automated operations strategy, and a corresponding system architecture, that minimizes operator intervention during operations while balancing the development cost associated with the scope and complexity of automation. This paper discusses the automated operations approach being developed for the SMAP GDS. The focus is on automating the activities involved in routine passes, which limits the scope to real-time operations. A key subsystem of the SMAP GDS, NASA's AMMOS Mission Data Processing and Control System (AMPCS), provides a set of capabilities that enable such automation. Also discussed are the lights-out pass automations of the Jason-1 and WISE missions and how they informed the automation strategy for SMAP. The paper aims to provide insights into what is necessary in automating the GDS operations for Earth satellite missions.
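
    The "lights-out" routine-pass idea can be illustrated with a simple automation loop that walks through the pass activities and pages an operator only on failure. This is a hypothetical sketch, not the SMAP GDS or AMPCS implementation; the step names, retry policy, and notification hook are placeholders.

```python
# Minimal sketch of a lights-out routine-pass loop: execute each pass step,
# retry on failure, and page an operator only when retries are exhausted.
# Step names, retry policy, and the paging hook are hypothetical.
import time

PASS_STEPS = [
    ("configure ground station", lambda: True),
    ("verify telemetry frame lock", lambda: True),
    ("radiate stored command load", lambda: True),
    ("confirm command receipt counters", lambda: True),
    ("release data to science processing", lambda: True),
]

def page_operator(step, reason):
    print(f"PAGE: step '{step}' failed ({reason}); operator intervention needed")

def run_pass(steps=PASS_STEPS, retries=2):
    for name, action in steps:
        for _attempt in range(retries + 1):
            if action():                       # placeholder always succeeds here
                print(f"ok: {name}")
                break
            time.sleep(1)                      # hypothetical retry back-off
        else:
            page_operator(name, f"failed after {retries + 1} attempts")
            return False
    return True

print("pass complete:", run_pass())
```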

  17. Uganda Mission PRS

    Data.gov (United States)

    US Agency for International Development — A web-based performance reporting system that is managed by IBI that interfaces with the Mission's GIS database that supports USAID/Uganda and its implementing...

  18. STS-83 Mission Insignia

    Science.gov (United States)

    1997-01-01

    The crew patch for NASA's STS-83 mission depicts the Space Shuttle Columbia launching into space for the first Microgravity Sciences Laboratory 1 (MSL-1) mission. MSL-1 investigated materials science, fluid dynamics, biotechnology, and combustion science in the microgravity environment of space, experiments that were conducted in the Spacelab Module in the Space Shuttle Columbia's cargo bay. The center circle symbolizes a free liquid under microgravity conditions representing various fluid and materials science experiments. Symbolic of the combustion experiments is the surrounding starburst of a blue flame burning in space. The 3-lobed shape of the outermost starburst ring traces the dot pattern of a transmission Laue photograph typical of biotechnology experiments. The numerical designation for the mission is shown at bottom center. As a forerunner to missions involving International Space Station (ISS), STS-83 represented the hope that scientific results and knowledge gained during the flight will be applied to solving problems on Earth for the benefit and advancement of humankind.

  19. Autonomous Mission Operations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future human spaceflight missions will occur with crews and spacecraft at large distances, with long communication delays, to the Earth. The one-way light-time delay...

  20. Colombia: Updating the Mission

    Science.gov (United States)

    2011-09-01

    The result was Colombia's costliest civil war, termed simply The Violence, or La Violencia. Bogota was nearly destroyed, and the bloodshed spilled into the countryside where it reached its greatest intensity. The machete ... role and its commitment to its assigned mission. Army Mission During La Violencia (1948–1962): Of course, it is the army that we are particularly ...

  1. NEEMO 7 undersea mission

    Science.gov (United States)

    Thirsk, Robert; Williams, David; Anvari, Mehran

    2007-02-01

    The NEEMO 7 mission was the seventh in a series of NASA-coordinated missions utilizing the Aquarius undersea habitat in Florida as a human space mission analog. The primary research focus of this mission was to evaluate telementoring and telerobotic surgery technologies as potential means to deliver medical care to astronauts during spaceflight. The NEEMO 7 crewmembers received minimal pre-mission training to perform selected medical and surgical procedures. These procedures included: (1) use of a portable ultrasound to locate and measure abdominal organs and structures in a crewmember subject; (2) use of a portable ultrasound to insert a small needle and drain into a fluid-filled cystic cavity in a simulated patient; (3) surgical repair of two arteries in a simulated patient; (4) cystoscopy and use of a ureteral basket to remove a renal stone in a simulated patient; and (5) laparoscopic cholecystectomy in a simulated patient. During the actual mission, the crewmembers performed the procedures without or with telementoring and telerobotic assistance from experts located in Hamilton, Ontario. The results of the NEEMO 7 medical experiments demonstrated that telehealth interventions rely heavily on a robust broadband, high data rate telecommunication link; that certain interventional procedures can be performed adequately by minimally trained individuals with telementoring assistance; and that prior clinical experience does not always correlate with better procedural performance. As space missions become longer in duration and take place further from Earth, enhancement of medical care capability and expertise will be required. The kinds of medical technologies demonstrated during the NEEMO 7 mission may play a significant role in enabling the human exploration of space beyond low earth orbit, particularly to destinations such as the Moon and Mars.

  2. Mission Possible: BioMedical Experiments on the Space Shuttle

    Science.gov (United States)

    Bopp, E.; Kreutzberg, K.

    2011-01-01

    Biomedical research, both applied and basic, was conducted on every Shuttle mission from 1981 to 2011. The Space Shuttle Program enabled NASA investigators and researchers from around the world to address fundamental issues concerning living and working effectively in space. Operationally focused occupational health investigations and tests were given priority by the Shuttle crew and Shuttle Program management for the resolution of acute health issues caused by the rigors of spaceflight. The challenges of research on the Shuttle included: limited up and return mass, limited power, limited crew time, and requirements for containment of hazards. The sheer capacity of the Shuttle for crew and equipment was unsurpassed by any other launch and entry vehicle and the Shuttle Program provided more opportunity for human research than any program before or since. To take advantage of this opportunity, life sciences research programs learned how to: streamline the complicated process of integrating experiments aboard the Shuttle, design experiments and hardware within operational constraints, and integrate requirements between different experiments and with operational countermeasures. We learned how to take advantage of commercial-off-the-shelf hardware and developed a hardware certification process with the flexibility to allow for design changes between flights. We learned the importance of end-to-end testing for experiment hardware with humans-in-the-loop. Most importantly, we learned that the Shuttle Program provided an excellent platform for conducting human research and for developing the systems that are now used to optimize research on the International Space Station. This presentation will include a review of the types of experiments and medical tests flown on the Shuttle and the processes that were used to manifest and conduct the experiments. Learning Objective: This paper provides a description of the challenges related to launching and implementing biomedical

  3. Rapid Mission Assurance Assessment via Sociotechnical Modeling and Simulation

    Science.gov (United States)

    2015-05-01

    ... of high USG official warning of dire threats to the US from "aggressor nation or extremist group" actors (Bumiller & Shanker, 2012). He is by no ... psychological effects were present, well reported, and possibly worse than they might have been. Cyber-enabled espionage can, and likely has, led to death ... their sexual partners (Williams & Johnson, 1993). A sample of SNA metrics that researchers apply to the graph models of human interactions is below

  4. Robotic Mission Simulation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies proposes a software tool to predict robotic mission performance and support supervision of robotic missions even when environments and...

  5. Precipitation Measurement Missions Data Access

    Data.gov (United States)

    National Aeronautics and Space Administration — Tropical Rainfall Measuring Mission (TRMM) data products are currently available from 1998 to the present. Global Precipitation Measurement (GPM) mission data...

  6. SCIENCE PARAMETRICS FOR MISSIONS TO SEARCH FOR EARTH-LIKE EXOPLANETS BY DIRECT IMAGING

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Robert A., E-mail: rbrown@stsci.edu [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States)

    2015-01-20

    We use N_t, the number of exoplanets observed in time t, as a science metric to study direct-search missions like Terrestrial Planet Finder. In our model, N has 27 parameters, divided into three categories: 2 astronomical, 7 instrumental, and 18 science-operational. For various "27-vectors" of those parameters chosen to explore parameter space, we compute design reference missions to estimate N_t. Our treatment includes the recovery of completeness c after a search observation, for revisits, solar and antisolar avoidance, observational overhead, and follow-on spectroscopy. Our baseline 27-vector has aperture D = 16 m, inner working angle IWA = 0.039'', mission time t = 0-5 yr, occurrence probability for Earth-like exoplanets η = 0.2, and typical values for the remaining 23 parameters. For the baseline case, a typical five-year design reference mission has an input catalog of ∼4700 stars with nonzero completeness, ∼1300 unique stars observed in ∼2600 observations, of which ∼1300 are revisits, and it produces N_1 ∼ 50 exoplanets after one year and N_5 ∼ 130 after five years. We explore offsets from the baseline for 10 parameters. We find that N depends strongly on IWA and only weakly on D. It also depends only weakly on zodiacal light for Z < 50 zodis, end-to-end efficiency for h > 0.2, and scattered starlight for ζ < 10^-10. We find that observational overheads, completeness recovery and revisits, solar and antisolar avoidance, and follow-on spectroscopy are all important factors in estimating N.
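
    The core bookkeeping behind a yield metric like N_t can be sketched compactly: the expected number of detections is the occurrence rate η times the completeness accumulated over the observations that fit in the time budget. The sketch below uses hypothetical completeness and observing-cost values and a greedy schedule; it does not reproduce the paper's 27-parameter design reference missions, revisit logic, or spectroscopy bookkeeping.

```python
# Minimal sketch of the yield bookkeeping: expected detections =
# eta * sum of completeness accumulated over the scheduled observations.
# Star list, completeness values, and time budget are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_stars = 4700
eta = 0.2                                        # occurrence rate of Earth-likes
completeness = rng.beta(2, 8, n_stars)           # per-star single-visit completeness
obs_time_days = rng.uniform(0.2, 2.0, n_stars)   # per-star observation cost

# Greedy schedule: observe stars in order of completeness per unit time
# until a one-year observing budget is spent.
order = np.argsort(-completeness / obs_time_days)
budget, total_c = 365.0, 0.0
for i in order:
    if obs_time_days[i] > budget:
        continue
    budget -= obs_time_days[i]
    total_c += completeness[i]

print(f"expected yield N ~ {eta * total_c:.0f} planets")
```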

  7. A pilot biomedical engineering course in rapid prototyping for mobile health.

    Science.gov (United States)

    Stokes, Todd H; Venugopalan, Janani; Hubbard, Elena N; Wang, May D

    2013-01-01

    Rapid prototyping of medically assistive mobile devices promises to fuel innovation and provides opportunity for hands-on engineering training in biomedical engineering curricula. This paper presents the design and outcomes of a course offered during a 16-week semester in Fall 2011 with 11 students enrolled. The syllabus covered a mobile health design process from end-to-end, including storyboarding, non-functional prototypes, integrated circuit programming, 3D modeling, 3D printing, cloud computing database programming, and developing patient engagement through animated videos describing the benefits of a new device. Most technologies presented in this class are open source and thus provide unlimited "hackability". They are also cost-effective and easily transferrable to other departments.

  8. NASA CYGNSS Tropical Cyclone Mission

    Science.gov (United States)

    Ruf, Chris; Atlas, Robert; Majumdar, Sharan; Ettammal, Suhas; Waliser, Duane

    2017-04-01

    The NASA Cyclone Global Navigation Satellite System (CYGNSS) mission consists of a constellation of eight microsatellites that were launched into low-Earth orbit on 15 December 2016. Each observatory carries a four-channel bistatic scatterometer receiver to measure near surface wind speed over the ocean. The transmitter half of the scatterometer is the constellation of GPS satellites. CYGNSS is designed to address the inadequacy in observations of the inner core of tropical cyclones (TCs) that result from two causes: 1) much of the TC inner core is obscured from conventional remote sensing instruments by intense precipitation in the eye wall and inner rain bands; and 2) the rapidly evolving (genesis and intensification) stages of the TC life cycle are poorly sampled in time by conventional polar-orbiting, wide-swath surface wind imagers. The retrieval of wind speed by CYGNSS in the presence of heavy precipitation is possible due to the long operating wavelength used by GPS (19 cm), at which scattering and attenuation by rain are negligible. Improved temporal sampling by CYGNSS is possible due to the use of eight spacecraft with 4 scatterometer channels on each one. Median and mean revisit times everywhere in the tropics are 3 and 7 hours, respectively. Wind speed referenced to 10m height above the ocean surface is retrieved from CYGNSS measurements of bistatic radar cross section in a manner roughly analogous to that of conventional ocean wind scatterometers. The technique has been demonstrated previously from space by the UK-DMC and UK-TDS missions. Wind speed is retrieved with 25 km spatial resolution and an uncertainty of 2 m/s at low wind speeds and 10% at wind speeds above 20 m/s. Extensive simulation studies conducted prior to launch indicate that there will be a significant positive impact on TC forecast skill for both track and intensity with CYGNSS measurements assimilated into HWRF numerical forecasts. Simulations of CYGNSS spatial and temporal sampling
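
    The stated retrieval uncertainty (2 m/s at low wind speeds, 10% above 20 m/s) can be written as a one-line error model; how the two regimes blend at exactly 20 m/s is an assumption of this sketch.

```python
# Minimal sketch of the stated wind-speed uncertainty: 2 m/s at low wind
# speeds and 10% of the wind speed above 20 m/s. The switch point is an
# assumption of this illustration, not a published specification.
def wind_speed_uncertainty(wind_ms: float) -> float:
    return 0.10 * wind_ms if wind_ms > 20.0 else 2.0

for w in (5.0, 15.0, 25.0, 40.0):
    print(f"{w:5.1f} m/s -> +/- {wind_speed_uncertainty(w):.1f} m/s")
```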

  9. Rapid Prototyping

    Science.gov (United States)

    1999-01-01

    Javelin, a Lone Peak Engineering Inc. company, has introduced the SteamRoller™ System as a commercial product. The system was designed by Javelin during a Phase II NASA-funded project. The purpose of the invention was to allow automated feed of flexible ceramic tapes to the Laminated Object Manufacturing rapid prototyping equipment. The ceramic material that Javelin was working with during the Phase II project is silicon nitride. This engineered ceramic material is of interest for space-based components.

  10. Teamwork Reasoning and Multi-Satellite Missions

    Science.gov (United States)

    Marsella, Stacy C.; Plaunt, Christian (Technical Monitor)

    2002-01-01

    NASA is rapidly moving towards the use of spatially distributed multiple satellites operating in near Earth orbit and Deep Space. Effective operation of such multi-satellite constellations raises many key research issues. In particular, the satellites will be required to cooperate with each other as a team that must achieve common objectives with a high degree of autonomy from ground based operations. The multi-agent research community has made considerable progress in investigating the challenges of realizing such teamwork. In this report, we discuss some of the teamwork issues that will be faced by multi-satellite operations. The basis of the discussion is a particular proposed mission, the Magnetospheric MultiScale mission to explore Earth's magnetosphere. We describe this mission and then consider how multi-agent technologies might be applied in the design and operation of these missions. We consider the potential benefits of these technologies as well as the research challenges that will be raised in applying them to NASA multi-satellite missions. We conclude with some recommendations for future work.

  11. The Hinode Mission

    CERN Document Server

    Sakurai, Takashi

    2009-01-01

    The Solar-B satellite was launched in 2006 by the Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency (ISAS/JAXA), and was renamed Hinode ('sunrise' in Japanese). Hinode carries three instruments: the X-ray telescope (XRT), the EUV imaging spectrometer (EIS), and the Solar Optical Telescope (SOT). These instruments were developed by ISAS/JAXA in cooperation with the National Astronomical Observatory of Japan as domestic partner, and NASA and the Science and Technology Facilities Council (UK) as international partners. ESA and the Norwegian Space Center have been providing a downlink station. The Hinode (Solar-B) Mission gives a comprehensive description of the Hinode mission and its instruments onboard. This book is most useful for researchers, professionals, and graduate students working in the field of solar physics, astronomy, and space instrumentation. This is the only book that carefully describes the details of the Hinode mission; it is richly illustrated with full-color ima...

  12. STS-95 Mission Insignia

    Science.gov (United States)

    1998-01-01

    The STS-95 patch, designed by the crew, is intended to reflect the scientific, engineering, and historic elements of the mission. The Space Shuttle Discovery is shown rising over the sunlit Earth limb, representing the global benefits of the mission science and the solar science objectives of the Spartan Satellite. The bold number '7' signifies the seven members of Discovery's crew and also represents a historical link to the original seven Mercury astronauts. The STS-95 crew member John Glenn's first orbital flight is represented by the Friendship 7 capsule. The rocket plumes symbolize the three major fields of science represented by the mission payloads: microgravity material science, medical research for humans on Earth and in space, and astronomy.

  13. Athena Mission Status

    Science.gov (United States)

    Lumb, D.

    2016-07-01

    Athena has been selected by ESA for its second large mission opportunity of the Cosmic Vision programme, to address the theme of the Hot and Energetic Universe. Following the submission of a proposal from the community, the technical and programmatic aspects of the mission design were reviewed in ESA's Concurrent Design Facility. The proposed concept was deemed to be technically feasible, but with potential constraints from cost and schedule. Two parallel industry study contracts have been conducted to explore these conclusions more thoroughly, with the key aim of providing consolidated inputs to a Mission Consolidation Review that was conducted in April-May 2016. This MCR has recommended a baseline design, which allows the agency to solicit proposals for a community-provided payload. Key design aspects arising from the studies are described, and the new reference design is summarised.

  14. Supporting the academic mission.

    Science.gov (United States)

    Dunnick, N Reed

    2010-03-01

    The mission of an academic radiology department includes not only high-quality patient care, but also the educating of a broad variety of health care professionals, the conducting of research to advance the field, and volunteering service to the medical center and our professional societies. While funding is available for the research and educational aspects, it is insufficient to cover the actual costs. Furthermore, it is becoming increasingly difficult to make up the deficit by using a portion of the clinical revenues. Development and revenues derived from intellectual property are becoming essential to support the academic mission.

  15. MIV Project: Mission scenario

    DEFF Research Database (Denmark)

    Ravazzotti, Mariolina T.; Jørgensen, John Leif; Thuesen, Gøsta

    1997-01-01

    Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of rendez-vous and docking of unmanned spacecraft, a mission scenario was defined. This report describes the sequence of manoeuvres and task allocations for such missions.

  16. The Euromir missions.

    Science.gov (United States)

    Andresen, R D; Domesle, R

    1996-11-01

    The 179-day flight of ESA Astronaut Thomas Reiter onboard the Russian Space Station Mir drew to a successful conclusion on 29 February 1996 with the safe landing of the Soyuz TM-22 capsule near Arkalyk in Kazakhstan. This mission, known as Euromir 95, was part of ESA's precursor flight programme for the International Space Station, and followed the equally successful Euromir 94 mission by ESA Astronaut Ulf Merbold (3 October-4 November 1994). This article discusses the objectives of the two flights and presents an overview of the experiment programme, a preliminary assessment of its results and achievements, and reviews some of the lessons learnt for future Space Station operations.

  17. Mars Stratigraphy Mission

    Science.gov (United States)

    Budney, C. J.; Miller, S. L.; Cutts, J. A.

    2000-01-01

    The Mars Stratigraphy Mission lands a rover on the surface of Mars which descends down a cliff in Valles Marineris to study the stratigraphy. The rover carries a unique complement of instruments to analyze and age-date materials encountered during descent past 2 km of strata. The science objective for the Mars Stratigraphy Mission is to identify the geologic history of the layered deposits in the Valles Marineris region of Mars. This includes constraining the time interval for formation of these deposits by measuring the ages of various layers and determining the origin of the deposits (volcanic or sedimentary) by measuring their composition and imaging their morphology.

  18. Reference mission 3B ascent trajectory. Mission planning, mission analysis and software formulation

    Science.gov (United States)

    Kuhn, A. E.

    1975-01-01

    Mission 3B is designed as a payload retrieval mission with both shuttle launch and orbiter landing to take place at the western test range. The mission is designed for direct rendezvous with a passive satellite in a 100 NMI circular orbit with an inclination of 104 degrees. The ascent portion of mission 3B is described as well as the trajectory simulation.

  19. The Gaia mission

    NARCIS (Netherlands)

    Collaboration, Gaia; Prusti, T.; de Bruijne, J. H. J.; Brown, A. G. A.; Vallenari, A.; Babusiaux, C.; Bailer-Jones, C. A. L.; Bastian, U.; Biermann, M.; Evans, D. W.; Eyer, L.; Jansen, F.; Jordi, C.; Klioner, S. A.; Lammers, U.; Lindegren, L.; Luri, X.; Mignard, F.; Milligan, D. J.; Panem, C.; Poinsignon, V.; Pourbaix, D.; Randich, S.; Sarri, G.; Sartoretti, P.; Siddiqui, H. I.; Soubiran, C.; Valette, V.; van Leeuwen, F.; Walton, N. A.; Aerts, C.; Arenou, F.; Cropper, M.; Drimmel, R.; Høg, E.; Katz, D.; Lattanzi, M. G.; O'Mullane, W.; Grebel, E. K.; Holland, A. D.; Huc, C.; Passot, X.; Bramante, L.; Cacciari, C.; Castañeda, J.; Chaoul, L.; Cheek, N.; De Angeli, F.; Fabricius, C.; Guerra, R.; Hernández, J.; Jean-Antoine-Piccolo, A.; Masana, E.; Messineo, R.; Mowlavi, N.; Nienartowicz, K.; Ordóñez-Blanco, D.; Panuzzo, P.; Portell, J.; Richards, P. J.; Riello, M.; Seabroke, G. M.; Tanga, P.; Thévenin, F.; Torra, J.; Els, S. G.; Gracia-Abril, G.; Comoretto, G.; Garcia-Reinaldos, M.; Lock, T.; Mercier, E.; Altmann, M.; Andrae, R.; Astraatmadja, T. L.; Bellas-Velidis, I.; Benson, K.; Berthier, J.; Blomme, R.; Busso, G.; Carry, B.; Cellino, A.; Clementini, G.; Cowell, S.; Creevey, O.; Cuypers, J.; Davidson, M.; De Ridder, J.; de Torres, A.; Delchambre, L.; Dell'Oro, A.; Ducourant, C.; Frémat, Y.; García-Torres, M.; Gosset, E.; Halbwachs, J. -L; Hambly, N. C.; Harrison, D. L.; Hauser, M.; Hestroffer, D.; Hodgkin, S. T.; Huckle, H. E.; Hutton, A.; Jasniewicz, G.; Jordan, S.; Kontizas, M.; Korn, A. J.; Lanzafame, A. C.; Manteiga, M.; Moitinho, A.; Muinonen, K.; Osinde, J.; Pancino, E.; Pauwels, T.; Petit, J. -M; Recio-Blanco, A.; Robin, A. C.; Sarro, L. M.; Siopis, C.; Smith, M.; Smith, K. W.; Sozzetti, A.; Thuillot, W.; van Reeven, W.; Viala, Y.; Abbas, U.; Abreu Aramburu, A.; Accart, S.; Aguado, J. J.; Allan, P. M.; Allasia, W.; Altavilla, G.; Álvarez, M. A.; Alves, J.; Anderson, R. I.; Andrei, A. H.; Anglada Varela, E.; Antiche, E.; Antoja, T.; Antón, S.; Arcay, B.; Atzei, A.; Ayache, L.; Bach, N.; Baker, S. G.; Balaguer-Núñez, L.; Barache, C.; Barata, C.; Barbier, A.; Barblan, F.; Baroni, M.; Barrado y Navascués, D.; Barros, M.; Barstow, M. A.; Becciani, U.; Bellazzini, M.; Bellei, G.; Bello García, A.; Belokurov, V.; Bendjoya, P.; Berihuete, A.; Bianchi, L.; Bienaymé, O.; Billebaud, F.; Blagorodnova, N.; Blanco-Cuaresma, S.; Boch, T.; Bombrun, A.; Borrachero, R.; Bouquillon, S.; Bourda, G.; Bouy, H.; Bragaglia, A.; Breddels, M. A.; Brouillet, N.; Brüsemeister, T.; Bucciarelli, B.; Budnik, F.; Burgess, P.; Burgon, R.; Burlacu, A.; Busonero, D.; Buzzi, R.; Caffau, E.; Cambras, J.; Campbell, H.; Cancelliere, R.; Cantat-Gaudin, T.; Carlucci, T.; Carrasco, J. M.; Castellani, M.; Charlot, P.; Charnas, J.; Charvet, P.; Chassat, F.; Chiavassa, A.; Clotet, M.; Cocozza, G.; Collins, R. S.; Collins, P.; Costigan, G.; Crifo, F.; Cross, N. J. G.; Crosta, M.; Crowley, C.; Dafonte, C.; Damerdji, Y.; Dapergolas, A.; David, P.; David, M.; De Cat, P.; de Felice, F.; de Laverny, P.; De Luise, F.; De March, R.; de Martino, D.; de Souza, R.; Debosscher, J.; del Pozo, E.; Delbo, M.; Delgado, A.; Delgado, H. E.; di Marco, F.; Di Matteo, P.; Diakite, S.; Distefano, E.; Dolding, C.; Dos Anjos, S.; Drazinos, P.; Durán, J.; Dzigan, Y.; Ecale, E.; Edvardsson, B.; Enke, H.; Erdmann, M.; Escolar, D.; Espina, M.; Evans, N. W.; Eynard Bontemps, G.; Fabre, C.; Fabrizio, M.; Faigler, S.; Falcão, A. 
J.; Farràs Casas, M.; Faye, F.; Federici, L.; Fedorets, G.; Fernández-Hernández, J.; Fernique, P.; Fienga, A.; Figueras, F.; Filippi, F.; Findeisen, K.; Fonti, A.; Fouesneau, M.; Fraile, E.; Fraser, M.; Fuchs, J.; Furnell, R.; Gai, M.; Galleti, S.; Galluccio, L.; Garabato, D.; García-Sedano, F.; Garé, P.; Garofalo, A.; Garralda, N.; Gavras, P.; Gerssen, J.; Geyer, R.; Gilmore, G.; Girona, S.; Giuffrida, G.; Gomes, M.; González-Marcos, A.; González-Núñez, J.; González-Vidal, J. J.; Granvik, M.; Guerrier, A.; Guillout, P.; Guiraud, J.; Gúrpide, A.; Gutiérrez-Sánchez, R.; Guy, L. P.; Haigron, R.; Hatzidimitriou, D.; Haywood, M.; Heiter, U.; Helmi, A.; Hobbs, D.; Hofmann, W.; Holl, B.; Holland, G.; Hunt, J. A. S.; Hypki, A.; Icardi, V.; Irwin, M.; Jevardat de Fombelle, G.; Jofré, P.; Jonker, P. G.; Jorissen, A.; Julbe, F.; Karampelas, A.; Kochoska, A.; Kohley, R.; Kolenberg, K.; Kontizas, E.; Koposov, S. E.; Kordopatis, G.; Koubsky, P.; Kowalczyk, A.; Krone-Martins, A.; Kudryashova, M.; Kull, I.; Bachchan, R. K.; Lacoste-Seris, F.; Lanza, A. F.; Lavigne, J. -B; Le Poncin-Lafitte, C.; Lebreton, Y.; Lebzelter, T.; Leccia, S.; Leclerc, N.; Lecoeur-Taibi, I.; Lemaitre, V.; Lenhardt, H.; Leroux, F.; Liao, S.; Licata, E.; Lindstrøm, H. E. P.; Lister, T. A.; Livanou, E.; Lobel, A.; Löffler, W.; López, M.; Lopez-Lozano, A.; Lorenz, D.; Loureiro, T.; MacDonald, I.; Magalhães Fernandes, T.; Managau, S.; Mann, R. G.; Mantelet, G.; Marchal, O.; Marchant, J. M.; Marconi, M.; Marie, J.; Marinoni, S.; Marrese, P. M.; Marschalkó, G.; Marshall, D. J.; Martín-Fleitas, J. M.; Martino, M.; Mary, N.; Matijevič, G.; Mazeh, T.; McMillan, P. J.; Messina, S.; Mestre, A.; Michalik, D.; Millar, N. R.; Miranda, B. M. H.; Molina, D.; Molinaro, R.; Molinaro, M.; Molnár, L.; Moniez, M.; Montegriffo, P.; Monteiro, D.; Mor, R.; Mora, A.; Morbidelli, R.; Morel, T.; Morgenthaler, S.; Morley, T.; Morris, D.; Mulone, A. F.; Muraveva, T.; Musella, I.; Narbonne, J.; Nelemans, G.; Nicastro, L.; Noval, L.; Ordénovic, C.; Ordieres-Meré, J.; Osborne, P.; Pagani, C.; Pagano, I.; Pailler, F.; Palacin, H.; Palaversa, L.; Parsons, P.; Paulsen, T.; Pecoraro, M.; Pedrosa, R.; Pentikäinen, H.; Pereira, J.; Pichon, B.; Piersimoni, A. M.; Pineau, F. -X; Plachy, E.; Plum, G.; Poujoulet, E.; Prša, A.; Pulone, L.; Ragaini, S.; Rago, S.; Rambaux, N.; Ramos-Lerate, M.; Ranalli, P.; Rauw, G.; Read, A.; Regibo, S.; Renk, F.; Reylé, C.; Ribeiro, R. A.; Rimoldini, L.; Ripepi, V.; Riva, A.; Rixon, G.; Roelens, M.; Romero-Gómez, M.; Rowell, N.; Royer, F.; Rudolph, A.; Ruiz-Dern, L.; Sadowski, G.; Sagristà Sellés, T.; Sahlmann, J.; Salgado, J.; Salguero, E.; Sarasso, M.; Savietto, H.; Schnorhk, A.; Schultheis, M.; Sciacca, E.; Segol, M.; Segovia, J. C.; Segransan, D.; Serpell, E.; Shih, I. -C; Smareglia, R.; Smart, R. L.; Smith, C.; Solano, E.; Solitro, F.; Sordo, R.; Soria Nieto, S.; Souchay, J.; Spagna, A.; Spoto, F.; Stampa, U.; Steele, I. A.; Steidelmüller, H.; Stephenson, C. A.; Stoev, H.; Suess, F. F.; Süveges, M.; Surdej, J.; Szabados, L.; Szegedi-Elek, E.; Tapiador, D.; Taris, F.; Tauran, G.; Taylor, M. B.; Teixeira, R.; Terrett, D.; Tingley, B.; Trager, S. 
C.; Turon, C.; Ulla, A.; Utrilla, E.; Valentini, G.; van Elteren, A.; Van Hemelryck, E.; van Leeuwen, M.; Varadi, M.; Vecchiato, A.; Veljanoski, J.; Via, T.; Vicente, D.; Vogt, S.; Voss, H.; Votruba, V.; Voutsinas, S.; Walmsley, G.; Weiler, M.; Weingrill, K.; Werner, D.; Wevers, T.; Whitehead, G.; Wyrzykowski, Ł.; Yoldas, A.; Žerjal, M.; Zucker, S.; Zurbach, C.; Zwitter, T.; Alecu, A.; Allen, M.; Allende Prieto, C.; Amorim, A.; Anglada-Escudé, G.; Arsenijevic, V.; Azaz, S.; Balm, P.; Beck, M.; Bernstein, H. -H; Bigot, L.; Bijaoui, A.; Blasco, C.; Bonfigli, M.; Bono, G.; Boudreault, S.; Bressan, A.; Brown, S.; Brunet, P. -M; Bunclark, P.; Buonanno, R.; Butkevich, A. G.; Carret, C.; Carrion, C.; Chemin, L.; Chéreau, F.; Corcione, L.; Darmigny, E.; de Boer, K. S.; de Teodoro, P.; de Zeeuw, P. T.; Delle Luche, C.; Domingues, C. D.; Dubath, P.; Fodor, F.; Frézouls, B.; Fries, A.; Fustes, D.; Fyfe, D.; Gallardo, E.; Gallegos, J.; Gardiol, D.; Gebran, M.; Gomboc, A.; Gómez, A.; Grux, E.; Gueguen, A.; Heyrovsky, A.; Hoar, J.; Iannicola, G.; Isasi Parache, Y.; Janotto, A. -M; Joliet, E.; Jonckheere, A.; Keil, R.; Kim, D. -W; Klagyivik, P.; Klar, J.; Knude, J.; Kochukhov, O.; Kolka, I.; Kos, J.; Kutka, A.; Lainey, V.; LeBouquin, D.; Liu, C.; Loreggia, D.; Makarov, V. V.; Marseille, M. G.; Martayan, C.; Martinez-Rubi, O.; Massart, B.; Meynadier, F.; Mignot, S.; Munari, U.; Nguyen, A. -T; Nordlander, T.; Ocvirk, P.; O'Flaherty, K. S.; Olias Sanz, A.; Ortiz, P.; Osorio, J.; Oszkiewicz, D.; Ouzounis, A.; Palmer, M.; Park, P.; Pasquato, E.; Peltzer, C.; Peralta, J.; Péturaud, F.; Pieniluoma, T.; Pigozzi, E.; Poels, J.; Prat, G.; Prod'homme, T.; Raison, F.; Rebordao, J. M.; Risquez, D.; Rocca-Volmerange, B.; Rosen, S.; Ruiz-Fuertes, M. I.; Russo, F.; Sembay, S.; Serraller Vizcaino, I.; Short, A.; Siebert, A.; Silva, H.; Sinachopoulos, D.; Slezak, E.; Soffel, M.; Sosnowska, D.; Straižys, V.; ter Linden, M.; Terrell, D.; Theil, S.; Tiede, C.; Troisi, L.; Tsalmantza, P.; Tur, D.; Vaccari, M.; Vachier, F.; Valles, P.; Van Hamme, W.; Veltz, L.; Virtanen, J.; Wallut, J. -M; Wichmann, R.; Wilkinson, M. I.; Ziaeepour, H.; Zschocke, S.

    2016-01-01

    Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by

  20. Mission from Mars

    DEFF Research Database (Denmark)

    Dindler, Christian; Eriksson, Eva; Iversen, Ole Sejer

    2005-01-01

    In this paper a particular design method is propagated as a supplement to existing descriptive approaches to current practice studies especially suitable for gathering requirements for the design of children's technology. The Mission from Mars method was applied during the design of an electronic...

  1. EOS Aura Mission Status

    Science.gov (United States)

    Guit, William J.

    2015-01-01

    This PowerPoint presentation discusses the EOS Aura mission and spacecraft subsystem summary, recent and planned activities, inclination adjust maneuvers, and the propellant usage lifetime estimate. Eric Moyer, ESMO Deputy Project Manager-Technical (code 428), reviewed and approved the slides on April 30, 2015.

  2. Robust UAV Mission Planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T; Barros, A.I.; Monsuur, H.

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  3. Robust UAV Mission Planning

    NARCIS (Netherlands)

    L. Evers (Lanah); T.A.B. Dollevoet (Twan); A.I. Barros (Ana); H. Monsuur (Herman)

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  4. Robust UAV mission planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T.; Barros, A.I.; Monsuur, H.

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a reconnaissance

  5. Robust UAV Mission Planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T.; Barros, A.I.; Monsuur, H.

    2014-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  6. The Lobster Mission

    Science.gov (United States)

    Barthelmy, Scott

    2011-01-01

    I will give an overview of the Goddard Lobster mission: the science goals, the two instruments, and the overall instrument designs, with particular attention to the wide-field X-ray instrument (WFI) using the lobster-eye-like micro-channel optics.

  7. Portable Diagnostics Technology Assessment for Space Missions. Part 1; General Technology Capabilities for NASA Exploration Missions

    Science.gov (United States)

    Nelson, Emily S.; Chait, Arnon

    2010-01-01

    The changes in the scope of NASA's mission in the coming decade are profound and demand nimble, yet insightful, responses. On-board clinical and environmental diagnostics must be available for both mid-term lunar and long-term Mars exploration missions in an environment marked by scarce resources. Miniaturization has become an obvious focus. Despite solid achievements in lab-based devices, broad-based, robust tools for application in the field are not yet on the market. The confluence of rapid, wide-ranging technology evolution and internal planning needs is the impetus behind this work. This report presents an analytical tool for the ongoing evaluation of promising technology platforms based on mission- and application-specific attributes. It is not meant to assess specific devices, but rather to provide objective guidelines for a rational down-select of general categories of technology platforms. In this study, we have employed our expertise in the microgravity operation of fluidic devices, laboratory diagnostics for space applications, and terrestrial research in biochip development. A rating of the current state of technology development is presented using the present tool. Two mission scenarios are also investigated: a 30-day lunar mission using proven, tested technology in 5 years; and a 2- to 3-year mission to Mars in 10 to 15 years.
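
    The down-select described above amounts to scoring platform categories against mission-specific attributes. The sketch below illustrates one way such a weighted-attribute matrix could be computed; the attributes, weights, scores, and platform categories are invented for illustration and are not taken from the NASA report.

        # Minimal sketch of a weighted-attribute down-select of technology platform
        # categories. All attributes, weights and scores below are illustrative only.
        weights = {"mass_volume": 0.3, "crew_time": 0.2, "maturity": 0.3, "reagent_shelf_life": 0.2}

        # Scores on a 1 (poor) to 5 (excellent) scale for each candidate platform category.
        platforms = {
            "lateral-flow assays":   {"mass_volume": 5, "crew_time": 5, "maturity": 4, "reagent_shelf_life": 3},
            "microfluidic biochips": {"mass_volume": 4, "crew_time": 4, "maturity": 2, "reagent_shelf_life": 3},
            "benchtop analyzers":    {"mass_volume": 1, "crew_time": 3, "maturity": 5, "reagent_shelf_life": 4},
        }

        def weighted_score(scores):
            """Sum of attribute scores weighted by mission-specific importance."""
            return sum(weights[attr] * value for attr, value in scores.items())

        # Rank platform categories for the assumed mission scenario.
        for name, scores in sorted(platforms.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
            print(f"{name:22s} {weighted_score(scores):.2f}")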

  8. The Double Star mission

    Directory of Open Access Journals (Sweden)

    Liu

    2005-11-01

    The Double Star Programme (DSP) was first proposed by China in March 1997 at the Fragrant Hill Workshop on Space Science, Beijing, organized by the Chinese Academy of Science. It is the first mission in collaboration between China and ESA. The mission is made of two spacecraft to investigate the magnetospheric global processes and their response to the interplanetary disturbances in conjunction with the Cluster mission. The first spacecraft, TC-1 (Tan Ce means "Explorer"), was launched on 29 December 2003, and the second one, TC-2, on 25 July 2004, on board two Chinese Long March 2C rockets. TC-1 was injected in an equatorial orbit of 570x79000 km altitude with a 28° inclination and TC-2 in a polar orbit of 560x38000 km altitude. The orbits have been designed to complement the Cluster mission by maximizing the time when both Cluster and Double Star are in the same scientific regions. The two missions allow simultaneous observations of the Earth magnetosphere from six points in space. To facilitate the comparison of data, half of the Double Star payload is made of spares or duplicates of the Cluster instruments; the other half is made of Chinese instruments. The science operations are coordinated by the Chinese DSP Scientific Operations Centre (DSOC) in Beijing and the European Payload Operations Service (EPOS) at RAL, UK. The spacecraft and ground segment operations are performed by the DSP Operations and Management Centre (DOMC) and DSOC in China, using three ground stations, in Beijing, Shanghai and Villafranca.

  9. The Mothership Mission Architecture

    Science.gov (United States)

    Ernst, S. M.; DiCorcia, J. D.; Bonin, G.; Gump, D.; Lewis, J. S.; Foulds, C.; Faber, D.

    2015-12-01

    The Mothership is considered to be a dedicated deep space carrier spacecraft. It is currently being developed by Deep Space Industries (DSI) as a mission concept that enables a broad participation in the scientific exploration of small bodies - the Mothership mission architecture. A Mothership shall deliver third-party nano-sats, experiments and instruments to Near-Earth Asteroids (NEAs), comets or moons. The Mothership service includes delivery of nano-sats, communication to Earth and visuals of the asteroid surface and surrounding area. The Mothership is designed to carry about 10 nano-sats, based upon a variation of the CubeSat standard, with some flexibility on the specific geometry. The Deep Space Nano-Sat reference design is a 14.5 cm cube, which accommodates the same volume as a traditional 3U CubeSat. To reduce cost, Mothership is designed as a secondary payload aboard launches to GTO. DSI is offering slots for nano-sats to individual customers. This enables organizations with relatively low operating budgets to closely examine an asteroid with highly specialized sensors of their own choosing and carry out experiments in the proximity of or on the surface of an asteroid, while the nano-sats can be built or commissioned by a variety of smaller institutions, companies, or agencies. While the overall Mothership mission will have a financial volume somewhere between that of a European Space Agency (ESA) S-class and M-class mission, for instance, it can be funded through a number of small and individual funding sources and programs, hence avoiding the processes associated with traditional space exploration missions. DSI has been able to identify a significant interest in the planetary science and nano-satellite communities.

  10. The OCO-3 MIssion

    Science.gov (United States)

    Eldering, A.; Kaki, S.; Crisp, D.; Gunson, M. R.

    2013-12-01

    For the OCO-3 mission, NASA has approved a proposal to install the OCO-2 flight spare instrument on the International Space Station (ISS). The OCO-3 mission on ISS will have a key role in delivering sustained, global, scientifically-based, spaceborne measurements of atmospheric CO2 to monitor natural sources and sinks as part of NASA's proposed OCO-2/OCO-3/ASCENDS mission sequence and NASA's Climate Architecture. The OCO-3 mission will contribute to understanding of the terrestrial carbon cycle through enabling flux estimates at smaller spatial scales and through fluorescence measurements that will reduce the uncertainty in terrestrial carbon flux measurements and drive bottom-up land surface models through constraining GPP. The combined nominal missions of both OCO-2 and OCO-3 will likely span a complete El Niño Southern Oscillation (ENSO) cycle, a key indicator of ocean variability. In addition, OCO-3 may allow investigation of the high-frequency and wavenumber structures suggested by eddying ocean circulation and ecosystem dynamics models. Finally, significant growth of urban agglomerations is underway and projected to continue in the coming decades. With the city mode sampling of the OCO-3 instrument on ISS we can evaluate different sampling strategies aimed at studying anthropogenic sources and demonstrate elements of a Greenhouse Gas Information system, as well as providing a gap-filler for tracking trends in the fastest-changing anthropogenic signals during the coming decade. In this presentation, we will describe our science objectives, the overall approach of utilization of the ISS for OCO-3, and the unique features of XCO2 measurements from ISS.

  11. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    Science.gov (United States)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the needed communications and networking capabilities and technologies for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration, in the vicinity of Earth, Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  12. Ames Coronagraph Experiment: Enabling Missions to Directly Image Exoplanets

    Science.gov (United States)

    Belikov, Ruslan

    2014-01-01

    Technology to find biomarkers and life on other worlds is rapidly maturing. If there is a habitable planet around the nearest star, we may be able to detect it this decade with a small satellite mission. In the 2030s, we will likely know if there is life in our Galactic neighborhood (1000 nearest stars). The Ames Coronagraph Experiment is developing coronagraphic technologies to enable such missions.

  13. NASA's Asteroid Redirect Mission (ARM)

    Science.gov (United States)

    Abell, P. A.; Mazanek, D. D.; Reeves, D. M.; Chodas, P. W.; Gates, M. M.; Johnson, L. N.; Ticker, R. L.

    2017-01-01

    Mission Description and Objectives: NASA's Asteroid Redirect Mission (ARM) consists of two mission segments: 1) the Asteroid Redirect Robotic Mission (ARRM), a robotic mission to visit a large (greater than approximately 100 meters diameter) near-Earth asteroid (NEA), collect a multi-ton boulder from its surface along with regolith samples, and return the asteroidal material to a stable orbit around the Moon; and 2) the Asteroid Redirect Crewed Mission (ARCM), in which astronauts will explore and investigate the boulder and return to Earth with samples. The ARRM is currently planned to launch at the end of 2021 and the ARCM is scheduled for late 2026.

  14. B plant mission analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Lund, D.P.

    1995-05-24

    This report further develops the mission for B Plant originally defined in WHC-EP-0722, "System Engineering Functions and Requirements for the Hanford Cleanup Mission: First Issue." The B Plant mission analysis will be the basis for a functional analysis that breaks down the B Plant mission statement into the necessary activities to accomplish the mission. These activities are the product of the functional analysis and will then be used in subsequent steps of the systems engineering process, such as identifying requirements and allocating those requirements to B Plant functions. The information in this mission analysis and the functional and requirements analysis are a part of the B Plant technical baseline.

  15. Nuclear Electric Propulsion mission operations.

    Science.gov (United States)

    Prickett, W. Z.; Spera, R. J.

    1972-01-01

    Mission operations are presented for comet rendezvous and outer planet exploration missions conducted by an unmanned Nuclear Electric Propulsion (NEP) system employing in-core thermionic reactors for electric power generation. The selected reference missions are a Comet Halley rendezvous and a Jupiter orbiter at 5.9 planet radii, the orbit of the moon Io. Mission operations and options are defined from spacecraft assembly through mission completion. Pre-launch operations and related GSE requirements are identified. Shuttle launch and subsequent injection to earth escape by the Centaur D-1T are discussed, as well as power plant startup and heliocentric mission phases.

  16. Defining Space Mission Architects for the Smaller Missions

    Science.gov (United States)

    Anderson, C.

    1999-01-01

    The definition of the Space Mission Architect (SMA) must be clear in both technical and human terms if we expect to train and/or to find people needed to architect the numbers of smaller missions expected in the future.

  17. Lynx mission concept study

    Science.gov (United States)

    Vikhlinin, Alexey

    2018-01-01

    Lynx is an observatory-class mission, featuring high throughput, exquisite angular resolution over a substantial field of view, and high spectral resolution for point and extended X-ray sources. The design requirements provide a tremendous leap in capabilities relative to missions such as Chandra and Athena. Lynx will observe the dawn of supermassive black holes through detection of very faint X-ray sources in the early universe and will reveal the "invisible drivers" of galaxy and structure formation through observations of hot, diffuse baryons in and around the galaxies. Lynx will enable breakthroughs across all of astrophysics, ranging from detailed understanding of stellar activity including effects on habitability of associated planets to population statistics of neutron stars and black holes in the Local Group galaxies, to earliest groups and clusters of galaxies, and to cosmology

  18. Towards A Shared Mission

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen; Orth Gaarn-Larsen, Carsten

    A mission shared by stakeholders, management and employees is a prerequisite for an engaging dialog about the many and substantial changes and challenges currently facing universities. Too often this essential dialog reveals mistrust and misunderstandings about the role and outcome... of the universities. The sad result is that the dialog about university development, resources, leadership, governance etc. too often ends up in rather fruitless discussions and sometimes even mutual suspicion. This paper argues for having a dialog involving both internal and external stakeholders agreeing... in the context of universities. Although the economic aspects of value are important and cannot be ignored, we argue for a much richer interpretation of value that captures the many and varied results from universities. A shared mission is a prerequisite for university management and leadership. It makes...

  19. Precursor missions to interstellar exploration.

    Science.gov (United States)

    Wallace, R. A.

    This paper summarizes material developed over a three-month period by a JPL team of mission architects/analysts and advanced technology developers for presentation to NASA Headquarters in the summer of 1998. A preliminary mission roadmap is suggested that leads to the exploration of star systems within 40 light years of our Solar System. The precursor missions include technology demonstrations as well as missions that return significant new knowledge about the space environment reached. Three propulsion technology candidates are selected on the basis of allowing eventual travel to the nearest star taking 10 years. One of the three propulsion technologies has a near-term version applicable to early missions (prior to 2010) - the solar sail. Using early sail missions, other critical supporting technologies can be developed that will later enable interstellar travel. Example precursor missions are sail demonstration missions, including a solar storm warning mission demonstrating a simple sail, a solar polar imaging mission using an intermediate sail, and a 200-AU Heliosphere Explorer mission using an advanced solar sail. Mission and technology strategy, science return, and potential mission spin-offs are described.

  20. Joint Mission Command Implementation

    Science.gov (United States)

    2016-01-22

    AIR WAR COLLEGE, AIR UNIVERSITY. Joint Mission Command Implementation, by Michael Dane Acord, COL, US Army. A research report submitted to... Assigned to the Air War College, Air University, Maxwell Air Force Base (AFB), AL. Following the Army Command and General Staff College and School..., the author holds a Bachelor's degree in Biology and two master's degrees: a Master's in Management from Troy University and a Master of Military Arts and Sciences.

  1. The INTEGRAL mission

    DEFF Research Database (Denmark)

    Winkler, C.; Courvoisier, T.J.L.; Di Cocco, G.

    2003-01-01

    -angular resolution imaging (15 keV-10 MeV). Two monitors, JEM-X (Lund et al. 2003) in the (3-35) keV X-ray band, and OMC (Mas-Hesse et al. 2003) in optical Johnson V-band complement the payload. The ground segment includes the Mission Operations Centre at ESOC, ESA and NASA ground stations, the Science Operations...

  2. Space VLBI Mission: VSOP

    Science.gov (United States)

    Murata, Yasuhiro; Hirabayashi, Hisashi; Kobayashi, Hideyuki; Shibata, Katsunori M.; Umemoto, Tomofumi; Edwards, P. G.

    2001-03-01

    We succeeded in performing space VLBI observations using the VLBI satellite HALCA (the VSOP satellite), launched in February 1997 aboard the first M-V rocket developed by ISAS. The mission is led by ISAS and NAO, with collaboration from CRL, NASA, NRAO, and other institutes and observatories in Europe, Australia, Canada, South Africa, and China. We succeeded in making many observations and in obtaining new features of active galaxies, cosmic jets, and other astronomical objects.

  3. FINCH: time-dependent simulation of nulling interferometry for the DARWIN mission

    Science.gov (United States)

    Ergenzinger, Klaus; Kersten, Michael; Sesselmann, Rainer; Schwarz, Raphael; Johann, Ulrich; Wilhelm, Rainer C.; Scales, Kevin L.; Erd, Christian

    2004-09-01

    Within the scope of the DARWIN Technology and Research Programme, the European Space Agency (ESA) initiated the development of a dynamic system simulator (called FINCH, "Fast Interferometer Characterization") for the spaceborne nulling interferometry mission DARWIN. The FINCH project is realized by two parallel activities: (1) a simulator for the Guidance, Navigation and Control (FINCH/GNC) of the free-flying satellite array, and (2) a simulator for the optical subsystems and the beam propagation within the system (FINCH/OPT). While the GNC activity is handled by EADS Astrium, France, the optical part is performed in a joint effort by EADS Astrium, Germany, and the European Southern Observatory (ESO). In this paper we focus on FINCH/OPT aspects and describe: DARWIN and the corresponding overall end-to-end simulation approach; the completed FINCH/OPT development for modelling of point sources; modelling of extended objects, exact and with suitable approximations; details of optical modelling of DARWIN configurations within FINCH; and applications of FINCH.
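
    As background for the nulling-interferometry simulation described above, the sketch below evaluates the idealized transmission pattern of a two-telescope Bracewell nuller, the basic building block that DARWIN-type configurations generalize. This is not part of FINCH; the baseline, wavelength, and angular offsets are illustrative assumptions.

        # Illustrative sketch (not part of FINCH): idealized monochromatic response of a
        # two-telescope Bracewell nuller. Baseline, wavelength and offsets are assumed values.
        import numpy as np

        def bracewell_transmission(theta_rad, baseline_m=25.0, wavelength_m=10e-6):
            """Normalized transmission of an ideal two-aperture nuller with a pi phase
            shift between the arms, for a source at angular offset theta_rad."""
            return np.sin(np.pi * baseline_m * theta_rad / wavelength_m) ** 2

        arcsec = np.pi / (180.0 * 3600.0)  # radians per arcsecond
        # On-axis starlight is suppressed, while off-axis sources can be transmitted.
        for theta in (0.0, 0.05 * arcsec, 0.10 * arcsec):
            t = bracewell_transmission(theta)
            print(f"offset {theta / arcsec:4.2f} arcsec -> transmission {t:.3f}")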

  4. Image Processor Electronics (IPE): The High-Performance Computing System for NASA SWIFT Mission

    Science.gov (United States)

    Nguyen, Quang H.; Settles, Beverly A.

    2003-01-01

    Gamma Ray Bursts (GRBs) are believed to be the most powerful explosions that have occurred in the Universe since the Big Bang and are a mystery to the scientific community. Swift, a NASA mission that includes international participation, was designed and built in preparation for a 2003 launch to help to determine the origin of Gamma Ray Bursts. Locating the position in the sky where a burst originates requires intensive computing, because the duration of a GRB can range between a few milliseconds and approximately a minute. The instrument data system must constantly accept multiple images representing large regions of the sky that are generated by sixteen gamma ray detectors operating in parallel. It then must process the received images very quickly in order to determine the existence of possible gamma ray bursts and their locations. The high-performance instrument data computing system that accomplishes this is called the Image Processor Electronics (IPE). The IPE was designed, built and tested by NASA Goddard Space Flight Center (GSFC) in order to meet these challenging requirements. The IPE is a small-size, low-power, high-performance computing system for space applications. This paper addresses the system implementation and the system hardware architecture of the IPE. The paper concludes with the IPE system performance that was measured during end-to-end system testing.
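
    To illustrate the kind of on-board computation described above, the sketch below flags a candidate burst as a statistically significant excess of counts over the expected background in a simulated sky image. It is an illustration only, not the IPE flight software; the image size, background rate, and 6-sigma threshold are assumptions.

        # Illustrative sketch only (not the IPE flight software): flag candidate bursts
        # as significant count excesses over the expected background in a sky image.
        import numpy as np

        def find_candidates(sky_image, background_rate, sigma_threshold=6.0):
            """Return (row, col) pixels whose counts exceed the background by more than
            sigma_threshold, using a Gaussian approximation to Poisson fluctuations."""
            significance = (sky_image - background_rate) / np.sqrt(background_rate)
            return np.argwhere(significance > sigma_threshold)

        rng = np.random.default_rng(0)
        image = rng.poisson(lam=100.0, size=(64, 64)).astype(float)  # background-only image
        image[20, 40] += 120.0                                       # injected transient
        print("candidate burst pixels (row, col):", find_candidates(image, background_rate=100.0))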

  5. Mars Exploration Rover mission

    Science.gov (United States)

    Crisp, Joy A.; Adler, Mark; Matijevic, Jacob R.; Squyres, Steven W.; Arvidson, Raymond E.; Kass, David M.

    2003-10-01

    In January 2004 the Mars Exploration Rover mission will land two rovers at two different landing sites that show possible evidence for past liquid-water activity. The spacecraft design is based on the Mars Pathfinder configuration for cruise and entry, descent, and landing. Each of the identical rovers is equipped with a science payload of two remote-sensing instruments that will view the surrounding terrain from the top of a mast, a robotic arm that can place three instruments and a rock abrasion tool on selected rock and soil samples, and several onboard magnets and calibration targets. Engineering sensors and components useful for science investigations include stereo navigation cameras, stereo hazard cameras in front and rear, wheel motors, wheel motor current and voltage, the wheels themselves for digging, gyros, accelerometers, and reference solar cell readings. Mission operations will allow commanding of the rover each Martian day, or sol, on the basis of the previous sol's data. Over a 90-sol mission lifetime, the rovers are expected to drive hundreds of meters while carrying out field geology investigations, exploration, and atmospheric characterization. The data products will be delivered to the Planetary Data System as integrated batch archives.

  6. From rankings to mission.

    Science.gov (United States)

    Kirch, Darrell G; Prescott, John E

    2013-08-01

    Since the 1980s, school ranking systems have been a topic of discussion among leaders of higher education. Various ranking systems are based on inadequate data that fail to illustrate the complex nature and special contributions of the institutions they purport to rank, including U.S. medical schools, each of which contributes uniquely to meeting national health care needs. A study by Tancredi and colleagues in this issue of Academic Medicine illustrates the limitations of rankings specific to primary care training programs. This commentary discusses, first, how each school's mission and strengths, as well as the impact it has on the community it serves, are distinct, and, second, how these schools, which are each unique, are poorly represented by overly subjective ranking methodologies. Because academic leaders need data that are more objective to guide institutional development, the Association of American Medical Colleges (AAMC) has been developing tools to provide valid data that are applicable to each medical school. Specifically, the AAMC's Medical School Admissions Requirements and its Missions Management Tool each provide a comprehensive assessment of medical schools that leaders are using to drive institutional capacity building. This commentary affirms the importance of mission while challenging the leaders of medical schools, teaching hospitals, and universities to use reliable data to continually improve the quality of their training programs to improve the health of all.

  7. Nanosatellite missions - the future

    Science.gov (United States)

    Koudelka, O.; Kuschnig, R.; Wenger, M.; Romano, P.

    2017-09-01

    In the beginning, nanosatellite projects were focused on educational aspects. In the meantime, the technology has matured and now allows new systems, operational procedures and services to be tested, demonstrated and validated in space at low cost and within much shorter timescales than traditional space endeavors. The number of spacecraft developed and launched has been increasing exponentially in recent years. The constellation of BRITE nanosatellites is demonstrating impressively that demanding scientific requirements can be met with small, low-cost satellites. Industry and space agencies are now embracing small satellite technology. Particularly in the USA, companies have been established to provide commercial services based on CubeSats. The approach is in general different from traditional space projects with their strict product/quality assurance and documentation requirements. The paper gives an overview of nanosatellite missions in different areas of application. Based on lessons learnt from the BRITE mission and recent developments at TU Graz (in particular the implementation of the OPS-SAT nanosatellite for ESA), enhanced technical possibilities for a future astronomy mission after BRITE will be discussed. Powerful on-board computers will allow on-board data pre-processing. A state-of-the-art telemetry system with high data rates would facilitate interference-free operations and increase science data return.

  8. The LISA Pathfinder Mission

    Science.gov (United States)

    Thorpe, James; McNamara, P. W.

    2011-01-01

    LISA Pathfinder is a dedicated technology demonstration space mission for the Laser Interferometer Space Antenna (LISA), a NASA/ESA collaboration to operate a space-based observatory for gravitational waves in the milli-Hertz band. Although the formal partnership between the agencies was dissolved in the Spring of 2011, both agencies are actively pursuing concepts for LISA-like gravitational wave observatories. These concepts take advantage of the significant technology development efforts that have already been made, especially those of the LISA Pathfinder mission. LISA Pathfinder, which is in the late stages of implementation, will place two test masses in drag-free flight and measure the relative acceleration between them. This measurement will validate a number of technologies that are critical to LISA-like gravitational wave instruments including sensing and control of the test masses, drag-free control laws, microNewton thrusters, and picometer-level laser metrology. We will present the current status of the LISA Pathfinder mission and associated activities.

  9. Bion-11 Spaceflight Mission

    Science.gov (United States)

    Skidmore, M.

    1999-01-01

    The Sensors 2000! Program, in support of the Space Life Sciences Payloads Office at NASA Ames Research Center, developed a suite of bioinstrumentation hardware for use on the Joint US/Russian Bion 11 Biosatellite Mission (December 24, 1996 - January 7, 1997). This spaceflight included 20 separate experiments that were organized into a complementary and interrelated whole, and performed by teams of US, Russian, and French investigators. Over 40 separate parameters were recorded in-flight on both analog and digital recording media for later analysis. These parameters included Electromyogram (7 ch), Electrogastrogram, Electrooculogram (2 ch), ECG/EKG, Electroencephalogram (2 ch), single-fiber firing of neurovestibular afferent nerves (7 ch), Tendon Force, Head Motion Velocity (pitch & yaw), PO2 (in vivo & ambient), temperature (deep body, skin, & ambient), and multiple animal and spacecraft performance parameters, for a total of 45 channels of recorded data. Building on the close cooperation of previous missions, US and Russian engineers jointly developed, integrated, and tested the physiologic instrumentation and data recording system. For the first time, US-developed hardware replaced elements of the Russian systems, resulting in a US/Russian hybrid instrumentation and data system that functioned flawlessly during the 14-day mission.

  10. Landsat Data Continuity Mission

    Science.gov (United States)

    ,

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is a partnership formed between the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) to place the next Landsat satellite in orbit in January 2013. The Landsat era that began in 1972 will become a nearly 41-year global land record with the successful launch and operation of the LDCM. The LDCM will continue the acquisition, archiving, and distribution of multispectral imagery affording global, synoptic, and repetitive coverage of the Earth's land surfaces at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. The mission objectives of the LDCM are to (1) collect and archive medium resolution (30-meter spatial resolution) multispectral image data affording seasonal coverage of the global landmasses for a period of no less than 5 years; (2) ensure that LDCM data are sufficiently consistent with data from the earlier Landsat missions in terms of acquisition geometry, calibration, coverage characteristics, spectral characteristics, output product quality, and data availability to permit studies of landcover and land-use change over time; and (3) distribute LDCM data products to the general public on a nondiscriminatory basis at no cost to the user.

  11. Agile Science Planning: Rapid Response Re-planning Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop autonomous rapid response to science observations in missions targeting small bodies in fly-by mode where observing and reaction time is precious.

  12. Multi-Mission SDR Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Wireless transceivers used for NASA space missions have traditionally been highly custom and mission specific. Programs such as the GRC Space Transceiver Radio...

  13. Mission Critical Occupation (MCO) Charts

    Data.gov (United States)

    Office of Personnel Management — Agencies report resource data and targets for government-wide mission critical occupations and agency specific mission critical and/or high risk occupations. These...

  14. Exomars Mission Achievements

    Science.gov (United States)

    Lecomte, J.; Juillet, J. J.

    2016-12-01

    ExoMars is the first step of the European Space Agency's Aurora Exploration Programme. Comprising two missions, the first launched in 2016 and the second to be launched in 2020, ExoMars is a programme developed in broad ESA-Roscosmos co-operation, with significant contribution from NASA, that addresses the scientific question of whether life ever existed on Mars and demonstrates key technologies for entry, descent, landing, drilling and roving on the Martian surface. Thales Alenia Space is the overall prime contractor of the ExoMars programme, leading a large industrial team. The Spacecraft Composite (SCC), consisting of a Trace Gas Orbiter (TGO) and an EDL (Entry, Descent and Landing) Demonstrator Module (EDM) named Schiaparelli, was launched on 14 March 2016 from the Baikonur Cosmodrome by a Proton launcher. The two modules will separate on 16 October 2016 after a 7-month cruise. The TGO will search for evidence of methane and other atmospheric gases that could be signatures of active biological or geological processes on Mars and will provide communications relay for the 2020 surface assets. The Schiaparelli module will prove the technologies required to safely land a payload on the surface of Mars, with a package of sensors aimed at supporting the reconstruction of the flown trajectory and the assessment of the performance of the EDL subsystems. For the second ExoMars mission, a space vehicle composed of a Carrier Module (CM) and a Descent Module (DM), whose Landing Platform (LP) will house a Rover, will begin a 7-month-long trip to Mars in August 2020. In 2021 the Descent Module will be separated from the Carrier to carry out the entry into the planet's atmosphere and subsequently make the Landing Platform and the Rover land gently on the surface of Mars. While the LP will continue to measure the environmental parameters of the landing site, the Rover will begin exploration of the surface, which is expected to last 218 Martian days (approx. 230 Earth

  15. Solar sail mission design

    Energy Technology Data Exchange (ETDEWEB)

    Leipold, M.

    2000-02-01

    The main subject of this work is the design and detailed orbit transfer analysis of space flight missions with solar sails utilizing solar pressure for primary propulsion. Such a sailcraft requires ultra-lightweight, gossamer-like deployable structures and materials in order to effectively utilize the transfer of momentum of solar photons. Different design concepts as well as technological elements for solar sails are considered, and an innovative design of a deployable sail structure including new methods for sail folding and unfolding is presented. The main focus of this report is on trajectory analysis, simulation and optimization of planetocentric as well as heliocentric low-thrust orbit transfers with solar sails. In a parametric analysis, geocentric escape spiral trajectories are simulated and corresponding flight times are determined. In interplanetary space, solar sail missions to all planets in our solar system as well as selected minor bodies are included in the analysis. Comparisons to mission concepts utilizing chemical propulsion as well as ion propulsion are included in order to assess whether solar sailing could possibly enhance or even enable these missions. The emphasis in the interplanetary mission analysis is on novel concepts: a unique method to realize a sun-synchronous Mercury orbiter, fast missions to the outer planets and the outer heliosphere applying a "solar photonic assist", rendezvous and sample return missions to asteroids and comets, as well as innovative concepts to reach unique vantage points for solar observation ("Solar Polar Orbiter" and "Solar Probe"). Finally, a propellant-less sailcraft attitude control concept using an external torque due to solar pressure is analyzed. Examples for sail navigation and control in circular Earth orbit applying a PD-control algorithm are shown, illustrating the maneuverability of a sailcraft. (orig.)
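
    The propellant-less attitude control mentioned above applies a PD law through the bounded torque available from solar pressure. The sketch below shows a minimal single-axis PD loop in that spirit; the inertia, gains, torque limit, and time step are illustrative assumptions, not values from the report.

        # Minimal single-axis PD attitude-control sketch. Inertia, gains, torque limit
        # and time step are illustrative assumptions, not values from the cited work.
        import math

        I = 5000.0            # moment of inertia about the control axis, kg m^2 (assumed)
        KP, KD = 0.02, 20.0   # proportional and derivative gains (assumed)
        TORQUE_MAX = 0.01     # bound on the solar-pressure control torque, N m (assumed)
        DT = 1.0              # integration step, s

        theta = math.radians(10.0)  # initial attitude error
        omega = 0.0                 # initial body rate

        for step in range(7201):    # two hours of simulated time
            torque = -KP * theta - KD * omega                    # PD law toward zero attitude
            torque = max(-TORQUE_MAX, min(TORQUE_MAX, torque))   # actuator saturation
            omega += (torque / I) * DT                           # explicit Euler integration
            theta += omega * DT
            if step % 1800 == 0:
                print(f"t = {step * DT:6.0f} s   attitude = {math.degrees(theta):7.3f} deg")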

  16. Solar Maximum Mission - A systems overview

    Science.gov (United States)

    Guha, A. K.

    1981-01-01

    The Solar Maximum Mission (SMM), the central effort of the Solar Maximum Year research endeavor, is discussed. The mission's attempt to exploit the synergistic advantages of correlated data to obtain a complete picture of solar phenomena is stressed, as is the coordination provided by a world-wide network of ground-based observations. The prominent features of the SMM observatory, including the payload module and the solar-array system, are shown diagrammatically and the science instruments (coronagraph/polarimeter, ultraviolet spectrometer/polarimeter, soft X-ray polychromator) are discussed. Descriptions of the spacecraft's electrical power system, attitude determination and control systems and communications systems are also included. The Experiment Operations Facility, which provides quick response to rapidly changing solar conditions and permits coordination with a multitude of ground observatories and coordinated experiments, is described.

  17. Climate Benchmark Missions: CLARREO

    Science.gov (United States)

    Wielicki, Bruce A.; Young, David F.

    2010-01-01

    CLARREO (Climate Absolute Radiance and Refractivity Observatory) is one of the four Tier 1 missions recommended by the recent NRC decadal survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to rigorously observe climate change on decade time scales and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4). A rigorously known accuracy of both decadal change observations as well as climate projections is critical in order to enable sound policy decisions. The CLARREO mission accomplishes this critical objective through highly accurate and SI traceable decadal change observations sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. The same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. The CLARREO breakthrough in decadal climate change observations is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. These accuracy levels are determined both by the projected decadal changes as well as by the background natural variability that such signals must be detected against. The accuracy for decadal change traceability to SI standards includes uncertainties of calibration, sampling, and analysis methods. Unlike most other missions, all of the CLARREO requirements are judged not by instantaneous accuracy, but instead by accuracy in large time/space scale average decadal changes. Given the focus on decadal climate change, the NRC Decadal Survey concluded that the single most critical issue for decadal change observations was their lack of accuracy and low confidence in

  18. Enabling the human mission

    Science.gov (United States)

    Bosley, John

    The duplication of earth conditions aboard a spacecraft or planetary surface habitat requires 60 lb/day/person of food, potable and hygiene water, and oxygen. A 1000-day mission to Mars would therefore require 30 tons of such supplies per crew member in the absence of a closed-cycle, or regenerative, life-support system. An account is given of the development status of regenerative life-support systems, as well as of the requisite radiation protection and EVA systems, the health-maintenance and medical care facilities, zero-gravity deconditioning measures, and planetary surface conditions protection.

  19. SOFIA mission operations

    Science.gov (United States)

    Waddell, Patrick G.; Davidson, Jacqueline A.

    2002-02-01

    The SOFIA Airborne Observatory will operate a 2.5 m aperture telescope with the goal of obtaining over 960 successful science hours per year at a nominal altitude of 12.5 km, covering a wavelength range from 0.3 μm to 1.6 mm. The observatory platform comprises a Boeing 747SP with numerous significant modifications. The ground and flight mission operations architectures and plans are tailored to keep the telescope emissivity low and achieve high observing efficiency.

  20. The ARTEMIS mission

    CERN Document Server

    Angelopoulos, Vassilis

    2014-01-01

    The ARTEMIS mission was initiated by skillfully moving the two outermost Earth-orbiting THEMIS spacecraft into lunar orbit to conduct unprecedented dual spacecraft observations of the lunar environment. ARTEMIS stands for Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun. Indeed, this volume discusses initial findings related to the Moon’s magnetic and plasma environments and the electrical conductivity of the lunar interior. This work is aimed at researchers and graduate students in both heliophysics and planetary physics. Originally published in Space Science Reviews, Vol. 165/1-4, 2011.

  1. EU Universities’ Mission Statements

    Directory of Open Access Journals (Sweden)

    Liudmila Arcimaviciene

    2015-04-01

    In the last 10 years, a highly productive space of metaphor analysis has been established in the discourse studies of media, politics, business, and education. In the theoretical framework of Conceptual Metaphor Theory and Critical Discourse Analysis, the restored metaphorical patterns are especially valued for their implied ideological value as realized both conceptually and linguistically. By using the analytical framework of Critical Metaphor Analysis and procedurally employing Pragglejaz Group’s Metaphor Identification Procedure, this study aims at analyzing the implied value of the evoked metaphors in the mission statements of the first 20 European Universities, according to the Webometrics ranking. In this article, it is proposed that Universities’ mission statements are based on the positive evaluation of the COMMERCE metaphor, which does not fully correlate with the ideological framework of sustainability education but is rather oriented toward consumerism in both education and society. Despite this overall trend, there are some traceable features of the conceptualization reflecting the sustainability approach to higher education, as related to freedom of speech, tolerance, and environmental concerns. Nonetheless, these are suppressed by the metaphoric usages evoking traditional dogmas of the conservative ideology grounded in the concepts of the transactional approach to relationship, competitiveness for superiority, the importance of self-interest and strength, and quantifiable quality.

  2. Apollo 11 Mission Commemorated

    Science.gov (United States)

    Showstack, Randy

    2009-07-01

    On 24 July 1969, 4 days after Apollo 11 Mission Commander Neil Armstrong and Lunar Module Eagle Pilot Edwin “Buzz” Aldrin had become the first people to walk on the Moon, they and Apollo 11 Command Module Pilot Michael Collins peered through a window of the Mobile Quarantine Facility on board the U.S.S. Hornet following splashdown of the command module in the central Pacific as U.S. President Richard Nixon told them, “This is the greatest week in the history of the world since the creation.” Forty years later, the Apollo 11 crew and other Apollo-era astronauts gathered at several events in Washington, D. C., to commemorate and reflect on the Apollo program, that mission, and the future of manned spaceflight. “I don’t know what the greatest week in history is,” Aldrin told Eos. “But it was certainly a pioneering opening the door. With the door open when we touched down on the Moon, that was what enabled humans to put many more footprints on the surface of the Moon.”

  3. The Global Precipitation Mission

    Science.gov (United States)

    Braun, Scott; Kummerow, Christian

    2000-01-01

    The Global Precipitation Mission (GPM), expected to begin around 2006, is a follow-up to the Tropical Rainfall Measuring Mission (TRMM). Unlike TRMM, which primarily samples the tropics, GPM will sample both the tropics and mid-latitudes. The primary, or core, satellite will be a single, enhanced TRMM satellite that can quantify the 3-D spatial distributions of precipitation and its associated latent heat release. The core satellite will be complemented by a constellation of very small and inexpensive drones with passive microwave instruments that will sample the rainfall with sufficient frequency to be not only of climate interest, but also have local, short-term impacts by providing global rainfall coverage at approx. 3 h intervals. The data is expected to have substantial impact upon quantitative precipitation estimation/forecasting and data assimilation into global and mesoscale numerical models. Based upon previous studies of rainfall data assimilation, GPM is expected to lead to significant improvements in forecasts of extratropical and tropical cyclones. For example, GPM rainfall data can provide improved initialization of frontal systems over the Pacific and Atlantic Oceans. The purpose of this talk is to provide information about GPM to the USWRP (U.S. Weather Research Program) community and to discuss impacts on quantitative precipitation estimation/forecasting and data assimilation.

  4. General Mission Analysis Tool (GMAT): Mission, Vision, and Business Case

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open-source software system: free for anyone to use in development of new mission concepts or to improve current missions, and freely available in source code form for enhancement or future technology development.

  5. Venus Aerobot Multisonde Mission

    Science.gov (United States)

    Cutts, James A.; Kerzhanovich, Viktor; Balaram, J. Bob; Campbell, Bruce; Gershaman, Robert; Greeley, Ronald; Hall, Jeffery L.; Cameron, Jonathan; Klaasen, Kenneth; Hansen, David M.

    1999-01-01

    Robotic exploration of Venus presents many challenges because of the thick atmosphere and the high surface temperatures. The Venus Aerobot Multisonde mission concept addresses these challenges by using a robotic balloon or aerobot to deploy a number of short lifetime probes or sondes to acquire images of the surface. A Venus aerobot is not only a good platform for precision deployment of sondes but is very effective at recovering high rate data. This paper describes the Venus Aerobot Multisonde concept and discusses a proposal to NASA's Discovery program using the concept for a Venus Exploration of Volcanoes and Atmosphere (VEVA). The status of the balloon deployment and inflation, balloon envelope, communications, thermal control and sonde deployment technologies are also reviewed.

  6. InterDomain-QOSM: The NSIS QOS Model for Inter-domain Signaling to Enable End-to-End QoS Provisioning Over Heterogeneous Network Domains

    NARCIS (Netherlands)

    Zhang, J.; Monteiro, E.; Mendes, P.; Karagiannis, Georgios; Andres-Colas, J.

    2006-01-01

    This document has three goals. First of all, it presents our analysis of how to use the NSIS signaling (inter-domain QOSM and intra-domain QOSM) to fulfill the QoS control in accord with the ITU-T RACF functional architecture. For this goal, we discuss how the ITU-T RACF entities in the ITU-T RACF

  7. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

    Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements are that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to “…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers”. There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna and to assist in this process we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been obtained. The key to quality taxonomic work on macrofaunal animals from the tropical abyss is careful extraction of the animals (in cold, filtered seawater), microscopic observation and preservation of live specimens, from a variety of sampling devices by experienced zoologists at sea. Essential to the long-term iterative building of taxonomic knowledge from the CCZ is an “end-to-end” methodology to the taxonomic science that takes into account careful sampling design, at-sea taxonomic identification and fixation, post-cruise laboratory work with both DNA and morphology and finally a careful sample and data management pipeline that results in specimens and data in accessible open museum collections and online repositories.
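
    The end-to-end workflow described above is, in data-management terms, a sample-and-data pipeline with a fixed ordering of stages. The sketch below models one specimen record moving through such a pipeline; the stage names, fields, and identifiers are invented for illustration and are not the ABYSSLINE data model.

        # Illustrative sketch only: one specimen record moving through an ordered
        # sample-to-repository workflow. Stage names, fields and identifiers are invented.
        from dataclasses import dataclass, field

        STAGES = ["collected_at_sea", "imaged_and_fixed", "dna_sequenced",
                  "morphology_described", "accessioned_in_museum", "published_online"]

        @dataclass
        class SpecimenRecord:
            field_id: str
            taxon_hypothesis: str
            completed: list = field(default_factory=list)

            def advance(self, stage):
                # Enforce the workflow order so no specimen skips a step.
                expected = STAGES[len(self.completed)]
                if stage != expected:
                    raise ValueError(f"expected stage '{expected}', got '{stage}'")
                self.completed.append(stage)

        specimen = SpecimenRecord("ABYSS-0001", "Polychaeta sp. A")
        for stage in STAGES[:3]:
            specimen.advance(stage)
        print(specimen.field_id, "has reached:", specimen.completed[-1])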

  8. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement Part I—Temporal Alignment

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part II, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. In contrast to the

  9. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement : Part II – Perceptual Model

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part I, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. This paper describes the

  10. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    The research results describe how operators can successfully make use of a UAS-based solution together with the developed software to improve their efficiency in oil palm plantation management.

  11. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    Science.gov (United States)

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-06-16

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.
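
    To make the idea of QoC-annotated context data concrete, the sketch below attaches quality indicators such as accuracy and freshness to a context observation and applies a consumer-side freshness check, loosely following the urban pollution scenario. The class and field names are invented for illustration and are not the QoCIM framework's actual API.

        # Illustrative sketch only: attaching quality-of-context (QoC) indicators to a
        # context observation. Class and field names are invented, not the QoCIM API.
        import time
        from dataclasses import dataclass, field

        @dataclass
        class QoCIndicator:
            name: str      # e.g. "accuracy_ugm3"
            value: float

        @dataclass
        class ContextObservation:
            entity: str        # what is observed, e.g. a pollution sensor
            observable: str    # which phenomenon, e.g. "NO2"
            value: float
            timestamp: float = field(default_factory=time.time)
            qoc: list = field(default_factory=list)

        def fresh_enough(obs, max_age_s):
            """Consumer-side QoC check: keep only observations younger than max_age_s seconds."""
            return (time.time() - obs.timestamp) <= max_age_s

        obs = ContextObservation("sensor-42", "NO2", 61.3,
                                 qoc=[QoCIndicator("accuracy_ugm3", 5.0)])
        print("usable for the pollution scenario:", fresh_enough(obs, max_age_s=300))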

  12. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    Energy Technology Data Exchange (ETDEWEB)

    Lin, M; Feigenberg, S [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface images to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D surface image guided BH procedures were implemented and evaluated: normal-BH, taking BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 Normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercialized 3D-surface-tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were conducted. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process based on the information provided by the 3D-surface-tracking system for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For >90% of fractions, based on the setup deltas from the 3D-surface-tracking system, adjustments of patient setup were needed after the initial setup using lasers. 3D-surface-guided setup accuracy is comparable to CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) is 40% (Normal-BH) / 91% (DIBH) of treatments for the first 5 fractions and then drops to 16% (Normal-BH) / 46% (DIBH). The necessity of re-setup is highly patient-specific for Normal-BH but highly random among patients for DIBH. Overall, an accuracy of −0.8±2.4 mm for the anterior pericardial shadow position was achieved. Conclusion: 3D-surface-image technology provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this would be the patient group who will definitely benefit from the extra information of 3D surface setup.
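
    The intervention logic above is essentially a threshold check on the setup deltas reported by the surface-tracking system. The sketch below illustrates such an action-level check; the 3 mm and 5 mm thresholds and the decision labels are illustrative assumptions, not the clinical action levels used in the study.

        # Minimal sketch of an action-level check on surface-tracking setup deltas.
        # The thresholds and decision labels are illustrative assumptions only.
        from math import sqrt

        RESETUP_MM = 3.0   # re-setup the patient above this 3D offset
        RECOACH_MM = 5.0   # re-coach the breath-hold above this 3D offset

        def intervention(dx_mm, dy_mm, dz_mm):
            delta = sqrt(dx_mm**2 + dy_mm**2 + dz_mm**2)  # magnitude of the reported offset
            if delta > RECOACH_MM:
                return f"delta {delta:.1f} mm -> re-coach breath-hold"
            if delta > RESETUP_MM:
                return f"delta {delta:.1f} mm -> re-setup patient"
            return f"delta {delta:.1f} mm -> proceed with treatment"

        for offsets in [(0.5, 1.0, 0.8), (2.0, 2.5, 1.5), (4.0, 3.0, 2.5)]:
            print(intervention(*offsets))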

  13. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Hudgins, Andrew P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Carrillo, Ismael M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jin, Xin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Simmins, John [Electric Power Research Institute (EPRI)

    2018-02-21

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to manage distribution grid events by coordinating traditional distribution grid devices with high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open-source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the reporting resources into a collection of capabilities, rather than requiring knowledge of specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.
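
    The report's key architectural idea is that the ADMS addresses resources by capability rather than by device type. A minimal sketch of that pattern (hypothetical class and resource names; not the EPRI software, Volttron, or CIM/OpenADR messaging) might look like this:

```python
# Sketch of capability-based dispatch: the ADMS asks for a capability
# (e.g. "var_support" or "load_shed") and does not care whether the
# responder is a capacitor bank, a PV inverter, or a demand-response group.

class Resource:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = capabilities   # e.g. {"var_support": kvar, "load_shed": kW}

    def dispatch(self, capability, amount):
        granted = min(amount, self.capabilities.get(capability, 0.0))
        print(f"{self.name}: providing {granted} of {capability}")
        return granted

class CapabilityRegistry:
    def __init__(self):
        self._resources = []

    def register(self, resource):
        self._resources.append(resource)

    def request(self, capability, amount):
        """Spread a grid-event request across whatever can serve it."""
        remaining = amount
        for r in self._resources:
            if remaining <= 0:
                break
            remaining -= r.dispatch(capability, remaining)
        return amount - remaining          # total actually served

registry = CapabilityRegistry()
registry.register(Resource("capacitor_bank_1", {"var_support": 300.0}))
registry.register(Resource("pv_inverter_7",   {"var_support": 150.0, "load_shed": 50.0}))
registry.register(Resource("dr_group_west",   {"load_shed": 400.0}))

registry.request("var_support", 400.0)     # response to a high/low-voltage event
registry.request("load_shed", 200.0)
```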

  14. Web-based bioinformatics workflows for end-to-end RNA-seq data computation and analysis in agricultural animal species

    Science.gov (United States)

    Remarkable advances in next-generation sequencing (NGS) technologies, bioinformatics algorithms, and computational technologies have significantly accelerated genomic research. However, complicated NGS data analysis still remains a major bottleneck. RNA-seq, as one of the major areas in the NGS fi...

  15. Ex vivo proof-of-concept of end-to-end scaffold-enhanced laser-assisted vascular anastomosis of porcine arteries

    NARCIS (Netherlands)

    Pabittei, Dara R.; Heger, Michal; van Tuijl, Sjoerd; Simonet, Marc; de Boon, Wadim; van der Wal, Allard C.; Balm, Ron; de Mol, Bas A.

    2015-01-01

    The low welding strength of laser-assisted vascular anastomosis (LAVA) has hampered the clinical application of LAVA as an alternative to suture anastomosis. To improve welding strength, LAVA in combination with solder and polymeric scaffolds (ssLAVA) has been optimized in vitro. Currently, ssLAVA

  16. Ex vivo proof-of-concept of end-to-end scaffold-enhanced laser-assisted vascular anastomosis of porcine arteries.

    Science.gov (United States)

    Pabittei, Dara R; Heger, Michal; van Tuijl, Sjoerd; Simonet, Marc; de Boon, Wadim; van der Wal, Allard C; Balm, Ron; de Mol, Bas A

    2015-07-01

    The low welding strength of laser-assisted vascular anastomosis (LAVA) has hampered the clinical application of LAVA as an alternative to suture anastomosis. To improve welding strength, LAVA in combination with solder and polymeric scaffolds (ssLAVA) has been optimized in vitro. Currently, ssLAVA requires proof-of-concept in a physiologically representative ex vivo model before advancing to in vivo studies. This study therefore investigated the feasibility of ex vivo ssLAVA in medium-sized porcine arteries. Scaffolds composed of poly(ε-caprolactone) (PCL) or poly(lactic-co-glycolic acid) (PLGA) were impregnated with semisolid solder and placed over coapted aortic segments. ssLAVA was performed with a 670-nm diode laser. In the first substudy, the optimum number of laser spots was determined by bursting pressure analysis. The second substudy investigated the resilience of the welds in a Langendorf-type pulsatile pressure setup, monitoring the number of failed vessels. The type of failure (cohesive vs adhesive) was confirmed by electron microscopy, and thermal damage was assessed histologically. The third substudy compared breaking strength of aortic repairs made with PLGA and semisolid genipin solder (ssLAVR) to repairs made with BioGlue. ssLAVA with 11 lasing spots and PLGA scaffold yielded the highest bursting pressure (923 ± 56 mm Hg vs 703 ± 96 mm Hg with PCL ssLAVA; P = .0002) and exhibited the fewest failures (20% vs 70% for PCL ssLAVA; P = .0218). The two failed PLGA ssLAVA arteries leaked at 19 and 22 hours, whereas the seven failed PCL ssLAVA arteries burst between 12 and 23 hours. PLGA anastomoses broke adhesively, whereas PCL welds failed cohesively. Both modalities exhibited full-thickness thermal damage. Repairs with PLGA scaffold yielded higher breaking strength than BioGlue repairs (323 ± 28 N/cm² vs 25 ± 4 N/cm², respectively; P = .0003). PLGA ssLAVA yields greater anastomotic strength and fewer anastomotic failures than PCL ssLAVA. Aortic repairs with BioGlue were inferior to those produced with PLGA ssLAVR. The results demonstrate the feasibility of ssLAVA/R as an alternative method to suture anastomosis or tissue sealant. Further studies should focus on reducing thermal damage. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  17. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Muñóz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesús; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaró, Miquel; Pérez-Neira, Ana; Casellas, Ramon; Martínez, Ricardo; Núñez-Martínez, Jose; Manuel Requena Esteso, Manuel; Pubill, David; Font-Batch, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  18. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    Science.gov (United States)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary for modeling in that regime.
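
    The paper's TIR equations are not reproduced in this abstract. As a reminder of the core radiometry any such model rests on, here is a small sketch of band-integrated at-aperture radiance built on the Planck function. It is deliberately simplified (graybody surface, constant band-averaged emissivity, transmittance and path radiance, no reflected downwelling term), and the emissivity, transmittance and path-radiance values are illustrative assumptions, not PICASSO's.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / np.expm1(H * C / (wavelength_m * K * temp_k))

def at_aperture_radiance(band_um, temp_k, emissivity=0.97,
                         transmittance=0.8, path_radiance=0.4):
    """Toy band-averaged TIR radiance reaching the sensor (W m^-2 sr^-1 um^-1).

    Simplifications: constant emissivity/transmittance across the band and
    no reflected downwelling term; the numerical values are illustrative only.
    """
    wl = np.linspace(band_um[0], band_um[1], 200) * 1e-6          # m
    surface = emissivity * planck_radiance(wl, temp_k) * 1e-6     # per micron
    return transmittance * surface.mean() + path_radiance

# A 10.5-11.5 um band viewing a 300 K scene:
print(f"{at_aperture_radiance((10.5, 11.5), 300.0):.2f} W m^-2 sr^-1 um^-1")
```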

  19. Perspectives with the GCT end-to-end prototype of the small-sized telescope proposed for the Cherenkov telescope array

    Science.gov (United States)

    Costantini, H.; Dournaux, J.-L.; Ernenwein, J.-P.; Laporte, P.; Sol, H.

    2017-01-01

    In the framework of the Cherenkov Telescope Array (CTA), the GCT (Gamma-ray Cherenkov Telescope) team is building a dual-mirror telescope as one of the proposed prototypes for the CTA small size class of telescopes. The telescope is based on a Schwarzschild-Couder (SC) optical design, an innovative solution for ground-based Cherenkov astronomy, which allows a compact telescope structure, a lightweight large Field of View (FoV) camera and enables good angular resolution across the entire FoV. We review the different mechanical and optical components of the telescope. In order to characterise them, the Paris prototype will be operated during several weeks in 2016. In this framework, an estimate of the expected performance of this prototype has been made, based on Monte Carlo simulations. In particular the observability of the Crab Nebula in the context of high Night Sky Background (NSB) is presented.

  20. An Intrinsic TE Approach for End-to-End QoS Provisioning in OBS Networks Using Static Load-Balanced Routing Strategies

    Directory of Open Access Journals (Sweden)

    Alvaro L. Barradas

    2010-10-01

    Full Text Available Optical burst switching provides a feasible paradigm for next-generation IP over optical backbones. However, its burst loss performance can be highly affected by burst contention. In this paper we discuss traffic engineering approaches for path selection with the objective of minimizing contention using only topological information. The discussed strategies are based on balancing the traffic across the network in order to reduce congestion without incurring link-state protocol penalties. The routing strategies are evaluated by simulation on an optical burst switching model specifically developed for the purpose with OMNeT++. Results show that our strategies outperform the traditionally used shortest-path routing to an extent that depends on the network connectivity.
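
    The paper's strategies are evaluated in a purpose-built OMNeT++ OBS model, which is not shown here. The sketch below (using networkx, with an illustrative balancing rule, a toy topology, and k=3 candidate paths, none of which are the authors' exact strategies) conveys the basic idea of static, topology-only path selection: among a few candidate shortest paths per source-destination pair, pick the one that keeps the most-loaded link as lightly loaded as possible.

```python
import itertools
import networkx as nx

def balanced_paths(graph, k=3):
    """Assign one path per ordered node pair, balancing expected link usage.

    Uses only topological information: candidate paths are the k shortest
    simple paths, and the chosen path minimises the current maximum load
    on its links (ties broken by hop count).
    """
    load = {tuple(sorted(e)): 0 for e in graph.edges}
    routing = {}
    for src, dst in itertools.permutations(graph.nodes, 2):
        gen = nx.shortest_simple_paths(graph, src, dst)
        candidates = list(itertools.islice(gen, k))

        def cost(path):
            links = [tuple(sorted(e)) for e in zip(path, path[1:])]
            return (max(load[l] for l in links), len(path))

        best = min(candidates, key=cost)
        for l in (tuple(sorted(e)) for e in zip(best, best[1:])):
            load[l] += 1
        routing[(src, dst)] = best
    return routing, load

G = nx.cycle_graph(6)                    # toy topology
G.add_edges_from([(0, 3), (1, 4)])       # add two chords
paths, link_load = balanced_paths(G)
print(max(link_load.values()), "flows on the busiest link")
```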

  1. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management

    Directory of Open Access Journals (Sweden)

    Pierrick Marie

    2015-06-01

    Full Text Available Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  2. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    Full Text Available Palm oil represents the most efficient oilseed crop in the world, but its production involves plantation operations in one of the most fragile environments - the tropical lowlands. Deforestation, the drying-out of swampy lowlands and chemical fertilizers lead to environmental problems that are putting pressure on this industry. Unmanned aircraft systems (UAS), together with the latest photogrammetric processing and image analysis capabilities, represent an emerging technology identified as suitable for optimizing oil palm plantation operations. This paper focuses on two key elements of a UAS-based oil palm monitoring system. The first is the accuracy of the acquired data that is necessary to achieve meaningful results in later analysis steps. High-performance GNSS technology was utilized to achieve those accuracies while decreasing the demand for cost-intensive GCP measurements. The second key topic is the analysis of the resulting data in order to optimize plantation operations. By automatically extracting information on a block level as well as on a single-tree level, operators can use the developed application to increase their productivity. The research results describe how operators can successfully combine a UAS-based data acquisition workflow with the developed software to improve their efficiency in oil palm plantation management.
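
    The block- and single-tree-level analysis software is proprietary and not described in detail. A common, simplified approach to the single-tree step is local-maximum detection on a canopy height model derived from the UAS photogrammetry, sketched below; the crown diameter, minimum height and pixel size are placeholder parameters, not the authors' values.

```python
import numpy as np
from scipy.ndimage import maximum_filter, gaussian_filter

def detect_palms(chm, pixel_size_m=0.25, crown_diameter_m=8.0, min_height_m=2.0):
    """Return (row, col) positions of local canopy-height maxima.

    chm -- 2D canopy height model (metres above ground), e.g. a photogrammetric
           surface model minus a terrain model.
    """
    smoothed = gaussian_filter(chm, sigma=1.0)                 # suppress noise
    win = max(3, int(crown_diameter_m / pixel_size_m) | 1)     # odd window ~ crown size
    peaks = (smoothed == maximum_filter(smoothed, size=win)) & (smoothed > min_height_m)
    return np.argwhere(peaks)

# Toy CHM with two synthetic crowns:
yy, xx = np.mgrid[0:80, 0:80]
chm = 9 * np.exp(-((xx - 20)**2 + (yy - 25)**2) / 60.0) \
    + 8 * np.exp(-((xx - 55)**2 + (yy - 50)**2) / 60.0)
print(detect_palms(chm))   # approximately [[25 20], [50 55]]
```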

  3. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    Energy Technology Data Exchange (ETDEWEB)

    Huang, L [Huntsman Cancer Hospital, Salt Lake City, UT (United States); Sarkar, V [University of Utah Hospitals, Salt Lake City, UT (United States); Spiessens, S [Varian Medical Systems France, Buc Cedex (France); Rassiah-Szegedi, P; Huang, Y; Salter, B [University Utah, Salt Lake City, UT (United States); Zhao, H [University of Utah, Salt Lake City, UT (United States); Szegedi, M [Huntsman Cancer Hospital, The University of Utah, Salt Lake City, UT (United States)

    2014-06-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening-filter-free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high-dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Methods: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG 0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7 MV FFF beam and required to satisfy RTOG 0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were straightforwardly developed, in timely fashion, without challenge or inefficiency using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3 mm) gamma pass-rate for all arcs was 98.5 ± 1.1%, demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5 ± 1.7 min) were 50-70% lower than for the SGIMRT plans (26 ± 2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.
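
    The Delta4 analysis itself is vendor software and is not shown here. As a reminder of what a 3%/3 mm gamma pass-rate measures, here is a deliberately simplified 1D global-gamma sketch (brute-force search, dose difference normalized to the reference maximum, no interpolation; real phantom analyses are 3D and considerably more careful). The toy dose profiles are invented for illustration.

```python
import numpy as np

def gamma_pass_rate(ref_dose, meas_dose, positions_mm,
                    dose_crit=0.03, dist_crit_mm=3.0):
    """Simplified 1D global gamma analysis (brute force, no interpolation)."""
    norm = ref_dose.max()                              # global normalisation
    pass_count = 0
    for xm, dm in zip(positions_mm, meas_dose):
        dd = (dm - ref_dose) / (norm * dose_crit)      # dose-difference term
        dta = (xm - positions_mm) / dist_crit_mm       # distance-to-agreement term
        gamma = np.sqrt(dd**2 + dta**2).min()
        pass_count += gamma <= 1.0
    return 100.0 * pass_count / len(meas_dose)

x = np.linspace(-50, 50, 201)                            # mm
reference = np.exp(-x**2 / (2 * 20.0**2))                # toy dose profile
measured = 1.02 * np.exp(-(x - 1.0)**2 / (2 * 20.0**2))  # 2% high, 1 mm shifted
print(f"gamma pass rate: {gamma_pass_rate(reference, measured, x):.1f}%")
```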

  4. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    Science.gov (United States)

    2012-04-24

    ...emergency planning. "Public health departments are suffering from brain drain." "For planning purposes, the assumption is that 60% of volunteers will..." ...concerto practice: "In a concerto, a musician practices the difficult parts one at a time, then they put the whole process together." POD elements...

  5. Design and implementation of a secure and user-friendly broker platform supporting the end-to-end provisioning of e-homecare services.

    Science.gov (United States)

    Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart

    2010-01-01

    We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.
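
    The broker's dynamic service selection is described only at a high level. A toy sketch of the idea (hypothetical service records and scoring weights, not the authors' platform or its web-service interfaces) is to rank candidate service instances by the quality attributes the abstract mentions: security, availability and cost.

```python
# Toy broker-side service selection: rank candidate e-homecare service
# instances by weighted security / availability / cost scores.
# All records and weights below are hypothetical.

CANDIDATES = [
    {"name": "telemonitoring-A", "security": 0.9, "availability": 0.990, "cost": 0.30},
    {"name": "telemonitoring-B", "security": 0.7, "availability": 0.995, "cost": 0.10},
    {"name": "telemonitoring-C", "security": 0.9, "availability": 0.900, "cost": 0.05},
]

def select_service(candidates, w_sec=0.5, w_avail=0.3, w_cost=0.2):
    """Return the candidate with the best weighted score (higher is better)."""
    def score(c):
        return w_sec * c["security"] + w_avail * c["availability"] - w_cost * c["cost"]
    return max(candidates, key=score)

print(select_service(CANDIDATES)["name"])
```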

  6. Operating performance of the gamma-ray Cherenkov telescope: An end-to-end Schwarzschild–Couder telescope prototype for the Cherenkov Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Dournaux, J.L., E-mail: jean-laurent.dournaux@obspm.fr [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); De Franco, A. [Department of Physics, University of Oxford, Keble Road, Oxford OX1 3RH (United Kingdom); Laporte, P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); White, R. [Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany); Greenshaw, T. [University of Liverpool, Oliver Lodge Laboratory, P.O. Box 147, Oxford Street, Liverpool L69 3BX (United Kingdom); Sol, H. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Abchiche, A. [CNRS, Division technique DT-INSU, 1 Place Aristide Briand, 92190 Meudon (France); Allan, D. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Amans, J.P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Armstrong, T.P. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Balzer, A.; Berge, D. [GRAPPA, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Boisson, C. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); and others

    2017-02-11

    The Cherenkov Telescope Array (CTA) consortium aims to build the next-generation ground-based very-high-energy gamma-ray observatory. The array will feature different sizes of telescopes allowing it to cover a wide gamma-ray energy band from about 20 GeV to above 100 TeV. The highest energies, above 5 TeV, will be covered by a large number of Small-Sized Telescopes (SSTs) with a field-of-view of around 9°. The Gamma-ray Cherenkov Telescope (GCT), based on Schwarzschild–Couder dual-mirror optics, is one of the three proposed SST designs. The GCT is described in this contribution and the first images of Cherenkov showers obtained using the telescope and its camera are presented. These were obtained in November 2015 in Meudon, France.

  7. Hayabusa2 Mission Overview

    Science.gov (United States)

    Watanabe, Sei-ichiro; Tsuda, Yuichi; Yoshikawa, Makoto; Tanaka, Satoshi; Saiki, Takanao; Nakazawa, Satoru

    2017-07-01

    The Hayabusa2 mission journeys to C-type near-Earth asteroid (162173) Ryugu (1999 JU3) to observe and explore the 900 m-sized object, as well as return samples collected from the surface layer. The Hayabusa2 spacecraft, developed by the Japan Aerospace Exploration Agency (JAXA), was successfully launched on December 3, 2014 by an H-IIA launch vehicle and performed an Earth swing-by on December 3, 2015 to set it on a course toward its target, Ryugu. Hayabusa2 aims at increasing our knowledge of the early history and transfer processes of the solar system through deciphering memories recorded on Ryugu, especially about the origin of water and organic materials transferred to the Earth's region. Hayabusa2 carries four remote-sensing instruments: a telescopic optical camera with seven colors (ONC-T), a laser altimeter (LIDAR), a near-infrared spectrometer covering the 3-μm absorption band (NIRS3), and a thermal infrared imager (TIR). It also carries the three small MINERVA-II rovers and the small lander MASCOT (Mobile Asteroid Surface Scout), developed by the German Aerospace Center (DLR) in cooperation with the French space agency CNES. MASCOT has a wide-angle imager (MasCam), a 6-band thermal infrared radiometer (MARA), a 3-axis magnetometer (MasMag), and a hyperspectral infrared microscope (MicrOmega). Further, Hayabusa2 has a sampling device (SMP) and impact experiment devices consisting of a small carry-on impactor (SCI) and a deployable camera (DCAM3). The interdisciplinary research using the data from these onboard and lander instruments, together with the analyses of returned samples, is the key to the success of the mission.

  8. The Messenger Mission to Mercury

    CERN Document Server

    Domingue, D. L

    2007-01-01

    NASA’s MESSENGER mission, launched on 3 August 2004, is the seventh mission in the Discovery series. MESSENGER encounters the planet Mercury four times, culminating with an insertion into orbit on 18 March 2011. It carries a comprehensive package of geophysical, geological, geochemical, and space environment experiments to complete the complex investigations of this solar-system end member, which began with Mariner 10. The articles in this book, written by the experts in each area of the MESSENGER mission, describe the mission, spacecraft, scientific objectives, and payload. The book is of interest to all potential users of the data returned by the MESSENGER mission, to those studying the nature of the planet Mercury, and to all those interested in the design and implementation of planetary exploration missions.

  9. NASA's Asteroid Redirect Mission (ARM)

    Science.gov (United States)

    Abell, Paul; Mazanek, Dan; Reeves, David; Naasz, Bo; Cichy, Benjamin

    2015-11-01

    The National Aeronautics and Space Administration (NASA) is developing a robotic mission to visit a large near-Earth asteroid (NEA), collect a multi-ton boulder from its surface, and redirect it into a stable orbit around the Moon. Once returned to cislunar space in the mid-2020s, astronauts will explore the boulder and return to Earth with samples. This Asteroid Redirect Mission (ARM) is part of NASA’s plan to advance the technologies, capabilities, and spaceflight experience needed for a human mission to the Martian system in the 2030s. Subsequent human and robotic missions to the asteroidal material would also be facilitated by its return to cislunar space. Although ARM is primarily a capability demonstration mission (i.e., technologies and associated operations), there exist significant opportunities to advance our knowledge of small bodies in the synergistic areas of science, planetary defense, asteroidal resources and in-situ resource utilization (ISRU), and capability and technology demonstrations. In order to maximize the knowledge return from the mission, NASA is organizing an ARM Investigation Team, which is being preceded by the Formulation Assessment and Support Team. These teams will be comprised of scientists, technologists, and other qualified and interested individuals to help plan the implementation and execution of ARM. An overview of robotic and crewed segments of ARM, including the mission requirements, NEA targets, and mission operations, will be provided along with a discussion of the potential opportunities associated with the mission.

  10. Simulation of Mission Phases

    Science.gov (United States)

    Carlstrom, Nicholas Mercury

    2016-01-01

    This position with the Simulation and Graphics Branch (ER7) at Johnson Space Center (JSC) provided an introduction to vehicle hardware, mission planning, and simulation design. ER7 supports engineering analysis and flight crew training by providing high-fidelity, real-time graphical simulations in the Systems Engineering Simulator (SES) lab. The primary project assigned by NASA mentor and SES lab manager, Meghan Daley, was to develop a graphical simulation of the rendezvous, proximity operations, and docking (RPOD) phases of flight. The simulation is to include a generic crew/cargo transportation vehicle and a target object in low-Earth orbit (LEO). Various capsule, winged, and lifting body vehicles as well as historical RPOD methods were evaluated during the project analysis phase. JSC core mission to support the International Space Station (ISS), Commercial Crew Program (CCP), and Human Space Flight (HSF) influenced the project specifications. The simulation is characterized as a 30 meter +V Bar and/or -R Bar approach to the target object's docking station. The ISS was selected as the target object and the international Low Impact Docking System (iLIDS) was selected as the docking mechanism. The location of the target object's docking station corresponds with the RPOD methods identified. The simulation design focuses on Guidance, Navigation, and Control (GNC) system architecture models with station keeping and telemetry data processing capabilities. The optical and inertial sensors, reaction control system thrusters, and the docking mechanism selected were based on CCP vehicle manufacturer's current and proposed technologies. A significant amount of independent study and tutorial completion was required for this project. Multiple primary source materials were accessed using the NASA Technical Report Server (NTRS) and reference textbooks were borrowed from the JSC Main Library and International Space Station Library. The Trick Simulation Environment and User
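
    The SES simulation itself uses NASA's Trick environment and detailed GNC models, none of which are reproduced here. As a minimal stand-in for the relative-motion part of an RPOD scenario, the sketch below propagates the Clohessy-Wiltshire (Hill) equations for a chaser offset 30 m from an ISS-like target; the orbit altitude, initial offsets and use of scipy are assumptions of this sketch, not the project's models. It also illustrates a classic design point: a chaser parked on the V-bar is drift-free in the linear model, while the same offset on the R-bar drifts away.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 3.986004418e14           # Earth gravitational parameter, m^3/s^2
R_ORBIT = 6.778e6             # ~400 km altitude circular orbit radius, m
N = np.sqrt(MU / R_ORBIT**3)  # mean motion, rad/s

def cw_rhs(t, s):
    """Clohessy-Wiltshire equations: x radial, y along-track, z cross-track."""
    x, y, z, vx, vy, vz = s
    return [vx, vy, vz,
            3 * N**2 * x + 2 * N * vy,
            -2 * N * vx,
            -N**2 * z]

def propagate(state0, minutes=30):
    sol = solve_ivp(cw_rhs, (0.0, minutes * 60.0), state0, max_step=10.0)
    return sol.y[:3, -1]       # final relative position, m

# Chaser parked 30 m ahead on the +V-bar: stays put in the linear model.
print("V-bar start:", propagate([0.0, 30.0, 0.0, 0.0, 0.0, 0.0]))
# The same 30 m offset on the +R-bar drifts away within half an orbit.
print("R-bar start:", propagate([30.0, 0.0, 0.0, 0.0, 0.0, 0.0]))
```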

  11. Hipparcos: mission accomplished

    Science.gov (United States)

    1993-08-01

    During the last few months of its life, as the high radiation environment to which the satellite was exposed took its toll on the on-board system, Hipparcos was operated with only two of the three gyroscopes normally required for such a satellite, following an ambitious redesign of the on-board and on-ground systems. Plans were in hand to operate the satellite without gyroscopes at all, and the first such "gyro-less" data had been acquired, when communication failure with the on-board computers on 24 June 1993 put an end to the relentless flow of 24000 bits of data that have been sent down from the satellite each second, since launch. Further attempts to continue operations proved unsuccessful, and after a short series of sub-systems tests, operations were terminated four years and a week after launch. An enormous wealth of scientific data was gathered by Hipparcos. Even though data analysis by the scientific teams involved in the programme is not yet completed, it is clear that the mission has been an overwhelming success. "The ESA advisory bodies took a calculated risk in selecting this complex but fundamental programme" said Dr. Roger Bonnet, ESA's Director of Science, "and we are delighted to have been able to bring it to a highly successful conclusion, and to have contributed unique information that will take a prominent place in the history and development of astrophysics". Extremely accurate positions of more than one hundred thousand stars, precise distance measurements (in most cases for the first time), and accurate determinations of the stars' velocity through space have been derived. The resulting HIPPARCOS Star Catalogue, expected to be completed in 1996, will be of unprecedented accuracy, achieving results some 10-100 times more accurate than those routinely determined from ground-based astronomical observatories. A further star catalogue, the Tycho Star Catalogue of more than a million stars, is being compiled from additional data accumulated by the

  12. The AGILE Mission

    CERN Document Server

    Tavani, M.; Argan, A.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.W.; Chen, A.W.; Cocco, V.; Costa, E.; D'Ammando, F.; Del Monte, E.; De Paris, G.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Ferrari, A.; Fiorini, M.; Fornari, F.; Fuschino, F.; Froysland, T.; Frutti, M.; Galli, M.; Gianotti, F.; Giuliani, A.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Liello, F.; Lipari, P.; Longo, F.; Mattaini, E.; Marisaldi, M.; Mastropietro, M.; Mauri, A.; Mauri, F.; Mereghetti, S.; Morelli, E.; Morselli, A.; Pacciani, L.; Pellizzoni, A.; Perotti, F.; Piano, G.; Picozza, P.; Pontoni, C.; Porrovecchio, G.; Prest, M.; Pucella, G.; Rapisarda, M.; Rappoldi, A.; Rossi, E.; Rubini, A.; Soffitta, P.; Traci, A.; Trifoglio, M.; Trois, A.; Vallazza, E.; Vercellone, S.; Vittorini, V.; Zambra, A.; Zanello, D.; Pittori, C.; Preger, B.; Santolamazza, P.; Verrecchia, F.; Giommi, P.; Colafrancesco, S.; Antonelli, A.; Cutini, S.; Gasparrini, D.; Stellato, S.; Fanari, G.; Primavera, R.; Tamburelli, F.; Viola, F.; Guarrera, G.; Salotti, L.; D'Amico, F.; Marchetti, E.; Crisconio, M.; Sabatini, P.; Annoni, G.; Alia, S.; Longoni, A.; Sanquerin, R.; Battilana, M.; Concari, P.; Dessimone, E.; Grossi, R.; Parise, A.; Monzani, F.; Artina, E.; Pavesi, R.; Marseguerra, G.; Nicolini, L.; Scandelli, L.; Soli, L.; Vettorello, V.; Zardetto, E.; Bonati, A.; Maltecca, L.; D'Alba, E.; Patane, M.; Babini, G.; Onorati, F.; Acquaroli, L.; Angelucci, M.; Morelli, B.; Agostara, C.; Cerone, M.; Michetti, A.; Tempesta, P.; D'Eramo, S.; Rocca, F.; Giannini, F.; Borghi, G.; Garavelli, B.; Conte, M.; Balasini, M.; Ferrario, I.; Vanotti, M.; Collavo, E.; Giacomazzo, M.

    2008-01-01

    AGILE is an Italian Space Agency mission dedicated to the observation of the gamma-ray Universe. The AGILE very innovative instrumentation combines for the first time a gamma-ray imager (sensitive in the energy range 30 MeV - 50 GeV), a hard X-ray imager (sensitive in the range 18-60 keV) together with a Calorimeter (sensitive in the range 300 keV - 100 MeV) and an anticoincidence system. AGILE was successfully launched on April 23, 2007 from the Indian base of Sriharikota and was inserted in an equatorial orbit with a very low particle background. AGILE provides crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, pulsars, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. An optimal angular resolution (reaching 0.1-0.2 degrees in gamma-rays, 1-2 arcminutes in hard X-rays) and very large fields of view (2.5 sr and 1 sr, respectively) are obtained by the use of Silicon detectors integrated in a very compa...

  13. The PLATO Mission

    Science.gov (United States)

    Rauer, Heike

    2017-04-01

    PLATO (PLAnetary Transits and Oscillations of stars) has been selected for ESA's M3 launch opportunity end 2025. PLATO will carry out high-precision, long-term photometric and asteroseismic monitoring of a large number of stars. It will provide a large sample of small planets around bright stars, including terrestrial planets in the habitable zone of solar-like stars. PLATO will characterize planets for their radius, mass, and age with high accuracy. PLATO will provide the first large-scale catalogue of well-characterized small planets at intermediate orbital periods, which will be an important constraint to planet formation theories and will provide targets for future atmosphere spectroscopy follow-up observations. This data base of bulk characterized small planets will form a solid basis to put the Solar System into a wider context and allow for comparative exo-planetology. In addition, the precise stellar parameters obtained by asteroseismic studies will open new doors to better understand stellar interiors and allow us to constrain poorly-understood physical processes, like convection, improve our understanding of stellar evolution, and determine precise ages of stars and planetary systems. The talk will provide an overview of the current status of the PLATO mission and focus on its science goals.

  14. Urinary albumin in space missions

    DEFF Research Database (Denmark)

    Cirillo, Massimo; De Santo, Natale G; Heer, Martina

    2002-01-01

    Proteinuria was hypothesized for space mission but research data are missing. Urinary albumin, as index of proteinuria, was analyzed in frozen urine samples collected by astronauts during space missions onboard MIR station and on ground (control). Urinary albumin was measured by a double antibody...

  15. The future of NASA's missions

    Science.gov (United States)

    A'Hearn, Michael F.

    2017-04-01

    Can the recent Discovery mission selections be used as tea leaves to understand the future directions of NASA? In an age when many programmes are used to advance administrative and programmatic goals, Discovery appears to be driven almost entirely by science and by NASA's goal of cheaper missions.

  16. Pastoral ministry in a missional age: Towards a practical theological understanding of missional pastoral care

    Directory of Open Access Journals (Sweden)

    Guillaume H. Smit

    2015-03-01

    Full Text Available This article concerns itself with the development of a missional ecclesiology and the practices that may accept the challenge of conducting pastoral ministry in the context of South African, middle-class congregations adapting to a rapidly changing, post-apartheid environment. Some practical theological perspectives on pastoral counselling are investigated, whilst Narrative Therapy is explored as an emerging theory of deconstruction to enable the facilitation of congregational change towards a missional understanding of church life in local communities. Subsequently, the theological paradigm of missional ecclesiology is investigated before drawing the broad lines of a theory for pastoral ministry within missional ecclesiology. Intradisciplinary and/or interdisciplinary implications: In this article, a missional base theory is proposed for pastoral counselling, consisting of interdisciplinary insights gained from the fields of Missiology, Practical Theology, Narrative Therapy and Cognitive Behaviour Therapy. The implications of this proposal for the development of a missional pastoral theory focus on the following three aspects: re-establishing pastoral identity (exploring Christ); pastoral development (intentional faith formation); and pastoral ministry (enabling Christ-centred lives). In such a missional pastoral theory four practices should be operationalised: first of all, a cognitive approach to increasing knowledge of the biblical narrative is necessary. This provides the hermeneutical skills necessary to enable people to internalise the biblical ethics and character traits ascribed to the Christian life. Secondly, a pastoral theory needs to pay close attention to the development of emotional intelligence. Thirdly, this should be done in the context of small groups, where the focus falls on the personality development of members. Finally, missional pastoral theory should also include the acquisition of life coaching skills, where leaders can be

  17. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and its pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end-to-end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the

  18. Bringing the Ocean into Finer Focus through the NASA COAST, HyspIRI, and OCEANIA Suborbital Missions

    Science.gov (United States)

    Palacios, S. L.; Guild, L. S.; Kudela, R. M.; Hooker, S. B.; Morrow, J. H.; Russell, P. B.; Livingston, J. M.; Negrey, K.; Torres-Perez, J. L.; Kacenelenbogen, M. S.

    2014-12-01

    High-quality ocean color measurements are needed to characterize water quality and phytoplankton functional types in the coastal zone. Accurate ocean color retrievals are often confounded by inadequacies in atmospheric correction. The recent NASA COAST, HyspIRI, and OCEANIA suborbital missions over Monterey Bay, CA have used novel instruments in a multi-sensor, multi-platform approach to collect above- and in-water measurements to better characterize ocean color through improvements in instrument dynamic range and attention to atmospheric correction. High-level objectives of these missions are to characterize the coastal ocean through end-to-end assessment of image acquisition, atmospheric correction, algorithm application, and sea-truth observations to improve vicarious calibration and validation of satellite ocean color products. We present results from COAST, HyspIRI, and OCEANIA to demonstrate the importance of coincident atmospheric and sea-truth measurements to improve atmospheric correction. Our specific objective was to conduct a sensitivity analysis of the atmospheric correction algorithm, Tafkaa, on Headwall Imaging Spectrometer data using input parameters of atmospheric aerosol optical depth spectra and column water vapor obtained from the Ames Airborne Tracking Sunphotometer (AATS-14) collected on the CIRPAS Twin Otter during COAST (2011). Use of the high dynamic-range, in-water Compact-Optical Profiling System (C-OPS) and above-water Coastal Airborne In-situ Radiometers (C-AIR) with matched wavelength channels enabled accurate observations of exact water-leaving radiance to use in validating imagery. Results from HyspIRI and OCEANIA (October 2013) flown on the NASA ER-2 and CIRPAS Twin Otter will be presented. Knowledge gained from these missions will improve vicarious calibration and validation of legacy (MODIS) and future (PACE & GEO-CAPE) satellite sensors to better characterize coastal ecosystems using ocean color observations.
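
    Tafkaa itself is a full radiative-transfer-based correction code and is not reproduced here. The shape of the sensitivity question, though, can be shown with a deliberately toy correction in which the aerosol path radiance scales linearly with the sunphotometer-derived aerosol optical depth (AOD), and the retrieved water-leaving radiance is swept over the AOD uncertainty. Every number below (radiances, AOD, scaling coefficient) is an invented placeholder; the point is only that small atmospheric errors matter because water-leaving radiance is a small residual of the total signal.

```python
def retrieved_lw(l_total, l_rayleigh, aod, k_aerosol=2.0):
    """Toy ocean-colour correction: water-leaving radiance after removing
    Rayleigh scattering and an aerosol path-radiance term proportional to AOD.
    All radiances are in arbitrary units; k_aerosol is illustrative only."""
    return l_total - l_rayleigh - k_aerosol * aod

# Hypothetical blue-band values and a sunphotometer AOD with its uncertainty.
l_total, l_rayleigh = 10.0, 9.0
aod, aod_sigma = 0.12, 0.02

for tau in (aod - aod_sigma, aod, aod + aod_sigma):
    print(f"AOD={tau:.2f} -> Lw={retrieved_lw(l_total, l_rayleigh, tau):.3f}")

base = retrieved_lw(l_total, l_rayleigh, aod)
spread = retrieved_lw(l_total, l_rayleigh, aod - aod_sigma) - \
         retrieved_lw(l_total, l_rayleigh, aod + aod_sigma)
print(f"~{100 * spread / base:.1f}% swing in Lw across the AOD uncertainty")
```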

  19. The Asteroid Redirect Mission (ARM)

    Science.gov (United States)

    Abell, P. A.; Mazanek, D. D.; Reeves, D. M.; Chodas, P. W.; Gates, M. M.; Johnson, L. N.; Ticker, R. L.

    2016-01-01

    To achieve its long-term goal of sending humans to Mars, the National Aeronautics and Space Administration (NASA) plans to proceed in a series of incrementally more complex human spaceflight missions. Today, human flight experience extends only to Low-Earth Orbit (LEO), and should problems arise during a mission, the crew can return to Earth in a matter of minutes to hours. The next logical step for human spaceflight is to gain flight experience in the vicinity of the Moon. These cis-lunar missions provide a "proving ground" for the testing of systems and operations while still accommodating an emergency return path to the Earth that would last only several days. Cis-lunar mission experience will be essential for more ambitious human missions beyond the Earth- Moon system, which will require weeks, months, or even years of transit time.

  20. Multi-mission Satellite Management

    Science.gov (United States)

    Jamilkowski, M. L.; Teter, M. A.; Grant, K. D.; Dougherty, B.; Cochran, S.

    2015-12-01

    NOAA's next-generation environmental satellite, the Joint Polar Satellite System (JPSS), replaces the current Polar-orbiting Operational Environmental Satellites (POES). JPSS satellites carry sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The first JPSS satellite was launched in 2011 and is currently NOAA's primary operational polar satellite. The JPSS ground system, the Common Ground System (CGS), provides command, control, and communications (C3) and data processing (DP). A multi-mission system, CGS provides combinations of C3/DP for numerous NASA, NOAA, DoD, and international missions. In preparation for the next JPSS satellite, CGS improved its multi-mission capabilities to enhance mission operations for larger constellations of earth-observing satellites, with the added benefit of streamlining mission operations for other NOAA missions. CGS's multi-mission capabilities allow management of all assets as a single enterprise, using ground resources and personnel more efficiently and consolidating multiple ground systems into one. Sophisticated scheduling algorithms compare mission priorities and constraints across all ground stations, creating an enterprise schedule optimized to mission needs, which CGS executes to acquire the satellite link, uplink commands, downlink and route data to the operations and data processing facilities, and generate the final products for delivery to downstream users. This paper illustrates CGS's ability to manage multiple, enterprise-wide polar-orbiting missions by demonstrating resource modeling and tasking, production of enterprise contact schedules for NOAA's Fairbanks ground station (using both standing and ad hoc requests), deconflicting resources due to ground outages, and updating resource allocations through dynamic priority definitions.
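
    The CGS scheduling algorithms are proprietary, but the enterprise-scheduling idea can be illustrated with a toy greedy scheduler (one antenna, hypothetical contact requests) that sorts requests by mission priority and grants non-overlapping passes, deferring lower-priority conflicts.

```python
# Toy enterprise contact scheduler: one ground antenna, contacts sorted by
# mission priority then earliest start; overlapping lower-priority requests
# are deferred. Illustrative only -- not the CGS algorithms.

def schedule(requests):
    """requests: dicts with mission, priority (1 = highest), start, end (minutes)."""
    granted, deferred = [], []
    for req in sorted(requests, key=lambda r: (r["priority"], r["start"])):
        if all(req["end"] <= g["start"] or req["start"] >= g["end"] for g in granted):
            granted.append(req)
        else:
            deferred.append(req)
    return sorted(granted, key=lambda r: r["start"]), deferred

requests = [
    {"mission": "JPSS-1",   "priority": 1, "start": 10, "end": 22},
    {"mission": "Metop-B",  "priority": 2, "start": 18, "end": 30},   # conflicts with JPSS-1
    {"mission": "DMSP-F18", "priority": 3, "start": 35, "end": 47},
]
plan, bumped = schedule(requests)
print([r["mission"] for r in plan], "| deferred:", [r["mission"] for r in bumped])
```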

  1. Trajectories for a Near Term Mission to the Interstellar Medium

    Science.gov (United States)

    Arora, Nitin; Strange, Nathan; Alkalai, Leon

    2015-01-01

    Trajectories for rapid access to the interstellar medium (ISM) with a Kuiper Belt Object (KBO) flyby, launching between 2022 and 2030, are described. An impulsive-patched-conic broad search algorithm combined with a local optimizer is used for the trajectory computations. Two classes of trajectories, (1) with a powered Jupiter flyby and (2) with a perihelion maneuver, are studied and compared. Planetary flybys combined with leveraging maneuvers reduce launch C3 requirements (by a factor of 2 or more) and help satisfy mission-phasing constraints. Low launch C3 combined with leveraging and a perihelion maneuver is found to be enabling for a potential near-term mission to the ISM.
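
    The broad-search algorithm itself is not given in the abstract, but the benefit of the perihelion-maneuver option comes from the Oberth effect. A quick sketch of the underlying two-body arithmetic shows how a modest burn applied deep in the Sun's gravity well buys a disproportionately large hyperbolic excess speed; the 0.2 AU perihelion, 5 AU aphelion and delta-v values are illustrative choices of this sketch, not the paper's trajectories.

```python
import numpy as np

MU_SUN = 1.32712440018e20      # m^3 s^-2
AU = 1.495978707e11            # m

def v_infinity_after_perihelion_burn(q_au, aphelion_au, dv_kms):
    """Hyperbolic excess speed (km/s) after an impulsive prograde burn at perihelion.

    The spacecraft arrives on a bound ellipse (perihelion q, aphelion Q),
    burns dv at perihelion, and escapes the Sun with
    v_inf = sqrt((v_p + dv)^2 - v_esc^2)   (Oberth effect).
    Illustrative two-body arithmetic only.
    """
    q, a = q_au * AU, 0.5 * (q_au + aphelion_au) * AU
    v_p = np.sqrt(MU_SUN * (2.0 / q - 1.0 / a))        # vis-viva speed at perihelion
    v_esc = np.sqrt(2.0 * MU_SUN / q)                  # local solar escape speed
    v_after = v_p + dv_kms * 1e3
    return np.sqrt(max(v_after**2 - v_esc**2, 0.0)) / 1e3

for dv in (2.0, 3.0, 4.0):
    vinf = v_infinity_after_perihelion_burn(q_au=0.2, aphelion_au=5.0, dv_kms=dv)
    print(f"dv = {dv:.0f} km/s at 0.2 AU perihelion -> v_inf ~ {vinf:.1f} km/s "
          f"(~{vinf / 4.74:.1f} AU/yr)")
```

    The nonlinear growth of v_inf with dv in the printout is the reason a perihelion maneuver can substitute for a much larger launch C3.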

  2. Oceanography from the SWOT Mission: Opportunities and Challenges

    Science.gov (United States)

    Fu, L. L.; Morrow, R.

    2016-12-01

    The Surface Water and Ocean Topography (SWOT) is a joint mission of NASA and the French Space Agency CNES, with contributions from Canada and UK. The primary instrument of SWOT is a Ka-band radar interferometer for measuring the elevation of water surface over land and ocean. The oceanographic objectives of the mission are to observe sea surface height (SSH) at scales approaching 15 km, depending on the sea state. SWOT will make SSH measurement over a swath of 120 km with a nadir gap of 20 km in a 21-day repeat orbit to map the entire ocean with minimal gaps. A conventional radar altimeter will provide measurement along the nadir gap. There will be a 90-day calibration/validation phase with a 1-day repeat orbit to assess the mission performance and study the high-frequency oceanographic processes. The coverage and resolution of the mission will represent significant improvement over conventional altimetry. This is an exploratory mission with applications in oceanography and hydrology. The increased spatial resolution offers an unprecedented opportunity to study finer-scale 2D ocean surface height processes and to address important questions about the ocean circulation, and its interaction with ocean tides. However, the limited temporal sampling poses challenges to map the evolution of the ocean variability that changes rapidly at small scales. The measurement technique and the development of the mission will be presented with an emphasis on its science program with an outlook on the opportunities and challenges.

  3. The Science of Mission Assurance

    Directory of Open Access Journals (Sweden)

    Kamal Jabbour

    2011-01-01

    Full Text Available The intent of this article is to describe—and prescribe—a scientific framework for assuring mission essential functions in a contested cyber environment. Such a framework has profound national security implications as the American military increasingly depends on cyberspace to execute critical mission sets. In setting forth this prescribed course of action, the article will first decompose information systems into atomic processes that manipulate information at all six phases of the information lifecycle, then systematically define the mathematical rules that govern mission assurance.

  4. High-efficiency UV/optical/NIR detectors for large aperture telescopes and UV explorer missions: development of and field observations with delta-doped arrays

    Science.gov (United States)

    Nikzad, Shouleh; Jewell, April D.; Hoenk, Michael E.; Jones, Todd J.; Hennessy, John; Goodsall, Tim; Carver, Alexander G.; Shapiro, Charles; Cheng, Samuel R.; Hamden, Erika T.; Kyne, Gillian; Martin, D. Christopher; Schiminovich, David; Scowen, Paul; France, Kevin; McCandliss, Stephan; Lupu, Roxana E.

    2017-07-01

    Exciting concepts are under development for flagship, probe class, explorer class, and suborbital class NASA missions in the ultraviolet/optical spectral range. These missions will depend on high-performance silicon detector arrays being delivered affordably and in high numbers. To that end, we have advanced delta-doping technology to high-throughput and high-yield wafer-scale processing, encompassing a multitude of state-of-the-art silicon-based detector formats and designs. We have embarked on a number of field observations, instrument integrations, and independent evaluations of delta-doped arrays. We present recent data and innovations from JPL's Advanced Detectors and Systems Program, including two-dimensional doping technology, JPL's end-to-end postfabrication processing of high-performance UV/optical/NIR arrays and advanced coatings for detectors. While this paper is primarily intended to provide an overview of past work, developments are identified and discussed throughout. Additionally, we present examples of past, in-progress, and planned observations and deployments of delta-doped arrays.

  5. 75 FR 6178 - Mission Statement

    Science.gov (United States)

    2010-02-08

    .... companies that are experienced exporters enter Indonesia for the first time in support of creating green... significant resource potential and its desire to invest in cutting-edge clean energy technologies. Mission...

  6. Mission Level Autonomy for USSV

    Science.gov (United States)

    Huntsberger, Terry; Stirb, Robert C.; Brizzolara, Robert

    2011-01-01

    On-water demonstration of a wide range of mission-proven, advanced technologies at TRL 5+ that provide a total integrated, modular approach to effectively address the majority of the key needs for full mission-level autonomous, cross-platform control of USVs. A wide-baseline stereo system mounted on the ONR USSV was shown to be an effective sensing modality for tracking of dynamic contacts as a first step to automated retrieval operations. The CASPER onboard planner/replanner successfully demonstrated real-time, on-water resource-based analysis for mission-level goal achievement and on-the-fly opportunistic replanning. Full mixed-mode autonomy was demonstrated on-water with a seamless transition between operator override and return to the current mission plan. Autonomous cooperative operations for fixed asset protection and High Value Unit escort using 2 USVs (AMN1 & 14m RHIB) were demonstrated during Trident Warrior 2010 in JUN 2010

  7. Introduction to mission data system

    Science.gov (United States)

    Krasner, S.; Rasmussen, R.

    2001-01-01

    MDS state-based architecture. A system comprises project assets in the context of an external environment that influences them. The function of mission software is to monitor and control a system to meet operators' intents.

  8. Mission Critical: Preventing Antibiotic Resistance

    Science.gov (United States)

    ... Can you ... spp. So, what can we do to prevent antibiotic resistance in healthcare settings? Patients, healthcare providers, healthcare facility ...

  9. Lunam 2000 (Lunar Atmosphere Mission)

    Science.gov (United States)

    Barbieri, Cesare; Fornasier, Sonia; Lazzarin, Monica; Marchi, Simone; Rampazzi, Francesca; Verani, Stefano; Cremonese, Gabriele; Ragazzoni, Roberto; Dolci, Mauro; Benn, Chris R.; Mendillo, Michael; Baumgardner, Jeff; Chakrabarti, Supriya; Wilson, Jody

    LUNAM 2000 is a small mission dedicated to coronagraphic imaging in the Na yellow doublet and to UV spectroscopy of the lunar atmosphere in the range 2800-3400 Å. These studies are possible only from space. The scientific return of LUNAM 2000 has wider appeal for the study of transient atmospheres of other celestial bodies, in particular Mercury. The mission is in low Earth orbit (about 350 km); sun-synchronous and other orbits are under investigation. The payload has very low mass, small dimensions and modest power requirements, and is essentially made with off-the-shelf components. It can be built and launched in less than 3 years from approval. This time frame nicely overlaps that of the European technological mission SMART-1 and can greatly add to its scientific return. Furthermore, LUNAM 2000 can provide very important information for defining a mission to Mercury such as Bepi Colombo.

  10. Aeromobile forces in missions abroa

    Directory of Open Access Journals (Sweden)

    Kaja WYMYSŁOWSKA

    2014-12-01

    Full Text Available This article examines the role of aeromobile forces during missions abroad. It first explains how the concept of "aeromobile forces" should be understood, followed by a short classification of missions abroad. The main part of the article introduces the use of Mi-17 and Mi-24 helicopters through examples from three different missions, in Ethiopia, Chad and Afghanistan, by describing their main tasks. The analysis included in this article should help in estimating the capability of older types of helicopters, and it raises the issue of methods for dealing with resistance. The last part discusses helicopter operating costs. The conclusions contain the lessons learned from all missions mentioned in the article and some prospective solutions.

  11. Mission Readiness Measurement Aid (MIRMAID)

    National Research Council Canada - National Science Library

    Bowden, Tim

    2001-01-01

    .... The tool we have designed is intended to combine automated and observed measures of performance to provide the Commanding Officer feedback regarding the readiness of his unit to perform key missions...

  12. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open-source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  13. Starshade Rendezvous Mission Probe Concept

    Science.gov (United States)

    Seager, Sara; Kasdin, Jeremy; Starshade Rendezvous Probe Team

    2018-01-01

    The Starshade Rendezvous Probe mission concept is a starshade that works with the WFIRST mission but is built and launched separately, with a rendezvous on orbit. The 2015 Exo-S report first detailed the mission concept. In the current study we develop a new scientific vision for WFIRST exoplanet discovery and characterization, using the complementary coronagraph and starshade to execute the most sensitive and thorough direct imaging campaign ever attempted. The overarching goal of our proposal is to carry out the first “deep dive” direct imaging exploration of planetary systems orbiting the nearest sun-like stars in a search for Earth-like planets, using only a fraction of the WFIRST telescope time. The study aims to improve on the 2015 Exo-S report with an updated study of the key spacecraft and starshade technology development issues, as they relate to WFIRST design changes since 2015 that make the timely implementation of such a mission possible.

  14. Requirements for Common Bomber Mission Planning Environment

    National Research Council Canada - National Science Library

    White, III, Samuel G

    2006-01-01

    ...) level mission planning as a whole. Unfortunately, many of these initiatives have fallen short of seamlessly connecting the tactical level mission planning processes with the operational level or providing the unit-level mission...

  15. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  16. EOS Terra: Mission Status Constellation MOWG

    Science.gov (United States)

    Mantziaras, Dimitrios

    2016-01-01

    This EOS Terra mission status briefing to the Constellation MOWG will discuss the mission summary; spacecraft subsystems summary; recent and planned activities; inclination adjust maneuvers; conjunction history; propellant usage and lifetime estimate; and the end-of-mission plan.

  17. Overview of the Gaia Mission

    Science.gov (United States)

    Perryman, M. A. C.

    2005-10-01

    The overall goals and organisation of the Gaia mission are described: the role of the scientific community in the project; the organisation, structure, and goals of the scientific working groups; their interaction and influence on the satellite and payload design; the overall project schedule; the organisation and overall approach to the challenges of the data analysis; and the mission data products and their estimated release dates. Some of the potential for education and outreach activities are noted.

  18. Urinary albumin in space missions

    DEFF Research Database (Denmark)

    Cirillo, Massimo; De Santo, Natale G; Heer, Martina

    2002-01-01

    Proteinuria has been hypothesized to occur during space missions, but research data are missing. Urinary albumin, as an index of proteinuria, was analyzed in frozen urine samples collected by astronauts during space missions onboard the MIR station and on the ground (control). Urinary albumin was measured by a double antibody... radioimmunoassay. On average, 24-h urinary albumin was 27.4% lower in space than on the ground; the difference was statistically significant. Low urinary albumin excretion could be another effect of exposure to weightlessness (microgravity)....

  19. Resident Participation in International Surgical Missions is Predictive of Future Volunteerism in Practice

    Directory of Open Access Journals (Sweden)

    Shruti Chudasama Tannan

    2015-03-01

    Full Text Available Background: Interest in global health and international mission trips among medical student and resident trainees is growing rapidly. How these electives and international mission experiences affect future practice is still being elucidated. No study has identified whether participation in international surgical missions during residency is a predictor of participation in international surgical missions in practice after training completion. Methods: All trainees of our plastic surgery residency program from 1990 to 2011, during the implementation of optional annual international surgical missions, were surveyed to determine if the graduate had gone on a mission as a resident and as a plastic surgeon. Data were compared between graduates who participated in missions as residents and graduates who did not, from 1990 to 2011 and 1990 to 2007. Results: Of plastic surgery graduates from 1990 to 2011 who participated in international missions as residents, 60% participated in missions when in practice, versus 5.9% of graduates participating in missions in practice but not residency (P<0.0001). When excluding the last 5 years, the share of graduates participating in international missions in practice after doing so as residents increases to 85.7%, versus 7.41% who participate in practice but not residency (P<0.002). Conclusions: Results reveal that plastic surgeons who participate in international surgical missions as residents participate in international surgical missions in practice at higher rates than graduates who did not participate in missions during residency. International missions have significant intrinsic value both to trainees and to the international communities served, and this opportunity should be readily and easily accessible to all plastic surgery residents nationwide.

  20. NASA COAST and OCEANIA Airborne Missions Support Ecosystem and Water Quality Research in the Coastal Zone

    Science.gov (United States)

    Guild, Liane; Kudela, Raphael; Hooker, Stanford; Morrow, John; Russell, Philip; Palacios, Sherry; Livingston, John M.; Negrey, Kendra; Torres-Perez, Juan; Broughton, Jennifer

    2014-01-01

    NASA has a continuing requirement to collect high-quality in situ data for the vicarious calibration of current and next generation ocean color satellite sensors and to validate the algorithms that use the remotely sensed observations. Recent NASA airborne missions over Monterey Bay, CA, have demonstrated novel above- and in-water measurement capabilities supporting a combined airborne sensor approach (imaging spectrometer, microradiometers, and a sun photometer). The results characterize coastal atmospheric and aquatic properties through an end-to-end assessment of image acquisition, atmospheric correction, algorithm application, plus sea-truth observations from state-of-the-art instrument systems. The primary goal is to demonstrate the following in support of calibration and validation exercises for satellite coastal ocean color products: 1) the utility of a multi-sensor airborne instrument suite to assess the bio-optical properties of coastal California, including water quality; and 2) the importance of contemporaneous atmospheric measurements to improve atmospheric correction in the coastal zone. The imaging spectrometer (Headwall) is optimized in the blue spectral domain to emphasize remote sensing of marine and freshwater ecosystems. The novel airborne instrument, Coastal Airborne In-situ Radiometers (C-AIR), provides measurements of apparent optical properties with high dynamic range and fidelity for deriving exact water leaving radiances at the land-ocean boundary, including radiometrically shallow aquatic ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data are accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Flight operations are presented for the instrument payloads using the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter flown over Monterey Bay during the seasonal fall algal bloom in 2011 (COAST) and 2013 (OCEANIA) to support bio-optical measurements of phytoplankton for coastal zone research.

  1. NASA COAST and OCEANIA Airborne Missions Support Ecosystem and Water Quality Research in the Coastal Zone

    Science.gov (United States)

    Guild, L. S.; Kudela, R. M.; Hooker, S. B.; Morrow, J. H.; Russell, P. B.; Palacios, S. L.; Livingston, J. M.; Negrey, K.; Torres-Perez, J. L.; Broughton, J.

    2014-12-01

    NASA has a continuing requirement to collect high-quality in situ data for the vicarious calibration of current and next generation ocean color satellite sensors and to validate the algorithms that use the remotely sensed observations. Recent NASA airborne missions over Monterey Bay, CA, have demonstrated novel above- and in-water measurement capabilities supporting a combined airborne sensor approach (imaging spectrometer, microradiometers, and a sun photometer). The results characterize coastal atmospheric and aquatic properties through an end-to-end assessment of image acquisition, atmospheric correction, algorithm application, plus sea-truth observations from state-of-the-art instrument systems. The primary goal is to demonstrate the following in support of calibration and validation exercises for satellite coastal ocean color products: 1) the utility of a multi-sensor airborne instrument suite to assess the bio-optical properties of coastal California, including water quality; and 2) the importance of contemporaneous atmospheric measurements to improve atmospheric correction in the coastal zone. The imaging spectrometer (Headwall) is optimized in the blue spectral domain to emphasize remote sensing of marine and freshwater ecosystems. The novel airborne instrument, Coastal Airborne In-situ Radiometers (C-AIR) provides measurements of apparent optical properties with high dynamic range and fidelity for deriving exact water leaving radiances at the land-ocean boundary, including radiometrically shallow aquatic ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data are accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Flight operations are presented for the instrument payloads using the CIRPAS Twin Otter flown over Monterey Bay during the seasonal fall algal bloom in 2011 (COAST) and 2013 (OCEANIA) to support bio-optical measurements of phytoplankton for coastal zone research.

  2. Link Analysis in the Mission Planning Lab

    Science.gov (United States)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
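
    As an illustration of the kind of link budget such a tool automates, the sketch below (a minimal Python example, not MPL code; all numbers are hypothetical) computes received signal strength from the standard free-space link equation and compares it against a thermal noise floor to obtain a signal-to-noise ratio.

        import math

        def free_space_path_loss_db(distance_km, freq_mhz):
            # Friis free-space path loss in dB (distance in km, frequency in MHz)
            return 32.44 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

        def received_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                               distance_km, freq_mhz, misc_losses_db=0.0):
            # Basic link equation: Pr = Pt + Gt + Gr - FSPL - losses
            return (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                    - free_space_path_loss_db(distance_km, freq_mhz) - misc_losses_db)

        def snr_db(received_dbm, noise_figure_db, bandwidth_hz, temp_k=290.0):
            # Thermal noise floor kTB in dBm, plus the receiver noise figure
            k_dbm = -198.6  # Boltzmann constant expressed in dBm / (K * Hz)
            noise_dbm = (k_dbm + 10 * math.log10(temp_k)
                         + 10 * math.log10(bandwidth_hz) + noise_figure_db)
            return received_dbm - noise_dbm

        # Hypothetical S-band telemetry link from a launch vehicle to a ground asset
        pr = received_power_dbm(tx_power_dbm=33.0, tx_gain_dbi=0.0, rx_gain_dbi=35.0,
                                distance_km=400.0, freq_mhz=2250.0, misc_losses_db=3.0)
        print(f"Received power: {pr:.1f} dBm")
        print(f"SNR: {snr_db(pr, noise_figure_db=2.0, bandwidth_hz=2e6):.1f} dB")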

  3. Cost overruns will affect Galileo mission

    Science.gov (United States)

    Bell, Peter M.

    Recent news of the cost overruns in the development of the shuttle's upper stages that will affect launching of the proposed Galileo mission prompted the following statement from Robert A. Frosch, on the eve of his resignation from the post of NASA Administrator: You know that we have been carrying out a concentrated study of Shuttle upper stages for 2 1/2 months now. This study was initiated in early November when we became concerned with the continued rapid escalation of estimated costs for the three-stage IUS (inertial upper stage). We have decided on the best course of action for the future, and I want to outline for you how I believe the nation should proceed.

  4. Rapid Active Sampling Package

    Science.gov (United States)

    Peters, Gregory

    2010-01-01

    A field-deployable, battery-powered Rapid Active Sampling Package (RASP), originally designed for sampling strong materials during lunar and planetary missions, shows strong utility for terrestrial geological use. The technology is proving to be simple and effective for sampling and processing high-strength materials. Although it was originally intended for planetary and lunar applications, the RASP is very useful as a powered hand tool for geologists and the mining industry to quickly sample and process rocks in the field on Earth. The RASP allows geologists to surgically acquire samples of rock for later laboratory analysis. This tool, roughly the size of a wrench, allows the user to cut away swaths of weathering rinds, revealing pristine rock surfaces for observation and subsequent sampling with the same tool. RASPing deeper (≈3.5 cm) exposes single rock strata in situ. Where a geologist's hammer can only expose unweathered layers of rock, the RASP can do the same, and then has the added ability to capture and process samples into powder with particle sizes less than 150 microns, making it easier for XRD/XRF (x-ray diffraction/x-ray fluorescence). The tool uses a rotating rasp bit (or two counter-rotating bits) that resides inside or above the catch container. The container has an open slot to allow the bit to extend outside the container and to allow cuttings to enter and be caught. When the slot and rasp bit are in contact with a substrate, the bit is plunged into it in a matter of seconds to reach pristine rock. A user in the field may sample a rock multiple times at multiple depths in minutes, instead of having to cut out huge, heavy rock samples for transport back to a lab for analysis. Because of the speed and accuracy of the RASP, hundreds of samples can be taken in one day. RASP-acquired samples are small and easily carried. A user can characterize more area in less time than by using conventional methods. The field-deployable RASP used a Ni

  5. The Asteroid Redirect Mission (ARM)

    Science.gov (United States)

    Abell, Paul; Gates, Michele; Johnson, Lindley; Chodas, Paul; Mazanek, Dan; Reeves, David; Ticker, Ronald

    2016-07-01

    To achieve its long-term goal of sending humans to Mars, the National Aeronautics and Space Administration (NASA) plans to proceed in a series of incrementally more complex human spaceflight missions. Today, human flight experience extends only to Low-Earth Orbit (LEO), and should problems arise during a mission, the crew can return to Earth in a matter of minutes to hours. The next logical step for human spaceflight is to gain flight experience in the vicinity of the Moon. These cis-lunar missions provide a "proving ground" for the testing of systems and operations while still accommodating an emergency return path to the Earth that would last only several days. Cis-lunar mission experience will be essential for more ambitious human missions beyond the Earth-Moon system, which will require weeks, months, or even years of transit time. In addition, NASA has been given a Grand Challenge to find all asteroid threats to human populations and know what to do about them. Obtaining knowledge of asteroid physical properties combined with performing technology demonstrations for planetary defense provide much needed information to address the issue of future asteroid impacts on Earth. Hence the combined objectives of human exploration and planetary defense give a rationale for the Asteroid Re-direct Mission (ARM). Mission Description: NASA's ARM consists of two mission segments: 1) the Asteroid Redirect Robotic Mission (ARRM), the first robotic mission to visit a large (greater than ~100 m diameter) near-Earth asteroid (NEA), collect a multi-ton boulder from its surface along with regolith samples, demonstrate a planetary defense technique, and return the asteroidal material to a stable orbit around the Moon; and 2) the Asteroid Redirect Crewed Mission (ARCM), in which astronauts will take the Orion capsule to rendezvous and dock with the robotic vehicle, conduct multiple extravehicular activities to explore the boulder, and return to Earth with samples. NASA's proposed

  6. A mission planning concept and mission planning system for future manned space missions

    Science.gov (United States)

    Wickler, Martin

    1994-01-01

    The international character of future manned space missions will compel the involvement of several international space agencies in mission planning tasks. Additionally, the community of users requires a higher degree of freedom for experiment planning. Both of these problems can be solved by a decentralized mission planning concept using the so-called 'envelope method,' by which resources are allocated to users by distributing resource profiles ('envelopes') which define resource availabilities at specified times. The users are essentially free to plan their activities independently of each other, provided that they stay within their envelopes. The new developments were aimed at refining the existing vague envelope concept into a practical method for decentralized planning. Selected critical functions were exercised by planning an example, founded on experience acquired by the MSCC during the Spacelab missions D-1 and D-2. The main activity regarding future mission planning tasks was to improve the existing MSCC mission planning system, using new techniques. An electronic interface was developed to collect all formalized user inputs more effectively, along with an 'envelope generator' for generation and manipulation of the resource envelopes. The existing scheduler and its database were successfully replaced by an artificial intelligence scheduler. This scheduler is not only capable of handling resource envelopes, but also uses a new technology based on neural networks. Therefore, it is very well suited to solve the future scheduling problems more efficiently. This prototype mission planning system was used to gain new practical experience with decentralized mission planning, using the envelope method. In future steps, software tools will be optimized, and all data management planning activities will be embedded into the scheduler.
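
    A minimal sketch of the envelope idea follows (illustrative Python only, not the MSCC system; the activities and envelope values are invented): each user checks that the summed resource demand of their planned activities never exceeds the resource profile allocated to them.

        from dataclasses import dataclass

        @dataclass
        class Activity:
            name: str
            start: int       # first time step the activity is active
            duration: int    # number of consecutive time steps
            power_w: float   # power drawn while active

        def envelope_violations(activities, envelope):
            # Sum the planned power demand per time step and flag any step
            # where it exceeds the allocated envelope profile.
            usage = [0.0] * len(envelope)
            for act in activities:
                for t in range(act.start, min(act.start + act.duration, len(envelope))):
                    usage[t] += act.power_w
            return [(t, usage[t], envelope[t])
                    for t in range(len(envelope)) if usage[t] > envelope[t]]

        plan = [Activity("heat sample", 2, 3, 120.0), Activity("camera", 3, 2, 60.0)]
        allocated = [100.0, 100.0, 200.0, 200.0, 150.0, 150.0]
        print(envelope_violations(plan, allocated))   # -> [(4, 180.0, 150.0)]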

  7. Missional theological curricula and institutions

    Directory of Open Access Journals (Sweden)

    Kruger P. du Preez

    2014-01-01

    Full Text Available The article argues in favour of an all-embracing missional framework for curriculum development for theological institutions. When the curriculum of a subject such as ecclesiology has a missional hermeneutic, it will naturally lead to missional congregations. The authors use issues raised by the Network for African Congregational Theology (NetACT) institutions and the decisions of the Third Lausanne Congress in Cape Town (2010) as reference points in this article. They argue for a broad understanding of the concept 'missional' and are of the opinion that curricula that are integrative, normative, contextual and missional will lead to spiritual maturity and will result in a positive impact on church and society as a whole. The missio Dei as the work of the Trinitarian God is seen as being God's initiative. The incarnational model of Jesus Christ forms the basis for a theology and missiology where humility, vulnerability and servanthood play a pivotal role in curricula. An appeal is made for holistic missions with a strong emphasis on social engagement and the inclusion of community development. The Holy Spirit is seen as the empowering presence of the missio Dei, and the role of pneumatology in missional curriculum development is underscored. Theological institutes should become 'proclamation' institutions. Curricula should be ecumenical by nature and should include reaching the unreached and unengaged people groups. Theological education by extension is presented as an alternative way of decentralised theological education. Intradisciplinary and/or interdisciplinary implications: The article calls for theology to be done with a missional hermeneutic, both intradisciplinarily and interdisciplinarily. The article involves theology and education and calls for all disciplines dealing with community development to collaborate.

  8. Mission and system concepts for Mars robotic precursor missions

    Science.gov (United States)

    Scoon, George E. N.; Hechler, Martin

    1993-01-01

    Mission and system design concepts reflecting the status at about the midpoint of the Marsnet phase A study are reported. The objective of Marsnet is to place three to four small stations (approximately 80 kg) on the surface of Mars to perform scientific measurements in the areas of geophysics (seismology), geology, geochemistry, mineralogy, meteorology, and exobiology. The ESA Landers will constitute part of a global network to which NASA is planning to contribute up to 16 other stations. In terms of its scientific measurements, the Mars Global Network may be seen as a precursor to the exploration of Mars by mobile vehicles. Some aspects of the mission and system design addressed here may also be applicable to more complex robotic missions to Mars, for example: the development and testing of feasible probe delivery concepts; the design of low-mass, low-power components and solar arrays suited for the Mars environment; and the development of a low-complexity mobile instrument deployment device.

  9. A Failing Mission in Afghanistan: Salvation is Possible

    Science.gov (United States)

    2010-05-13

    situation and resulted in hasty planning. The rapid fall of the Taliban, coupled with limited and shifting political objectives, put the mission in a...

  10. Flora: A Proposed Hyperspectral Mission

    Science.gov (United States)

    Ungar, Stephen; Asner, Gregory; Green, Robert; Knox, Robert

    2006-01-01

    In early 2004, one of the authors (Stephen Ungar, NASA GSFC) presented a mission concept called "Spectrasat" at the AVIRIS Workshop in Pasadena, CA. This mission concept grew out of the lessons learned from the Earth Observing-One (EO-1) Hyperion Imaging Spectrometer and was structured to more effectively accomplish the types of studies conducted with Hyperion. The Spectrasat concept represented an evolution of the technologies and operation strategies employed on EO-1. The Spectrasat concept had been preceded by two community-based missions proposed by Susan Ustin, UC Davis, and Robert Green, NASA JPL. As a result of community participation, starting at this AVIRIS Workshop, the Spectrasat proposal evolved into the Flora concept, which now represents the combined visions of Gregory Asner (Carnegie Institute), Stephen Ungar, Robert Green, and Robert Knox, NASA GSFC. Flora is a proposed imaging spectrometer mission designed to address global carbon cycle science issues. This mission centers on measuring ecological disturbance for purposes of ascertaining changes in global carbon stocks and draws heavily on experience gained through AVIRIS airborne flights and Hyperion spaceborne flights. The observing strategy exploits the improved ability of imaging spectrometers, as compared with multi-spectral observing systems, to identify vegetation functional groups, detect ecosystem response to disturbance, and assess the related recovery. Flora will be placed in a sun-synchronous orbit, with a 45 meter pixel size, a 90 km swath width and a 31 day repeat cycle. It covers the spectral range from 0.4 to 2.5 micrometers with a spectral sampling interval of 10 nm. These specifications meet the needs of the Flora science team under the leadership of Gregory Asner. Robert Green has introduced a spectrometer design for Flora which is expected to have a SNR of 600:1 in the VNIR and 450:1 in the SWIR. The mission team at NASA GSFC is designing an Intelligent Payload Module (IPM

  11. KEPLER RAPIDLY ROTATING GIANT STARS

    Energy Technology Data Exchange (ETDEWEB)

    Costa, A. D.; Martins, B. L. Canto; Bravo, J. P.; Paz-Chinchón, F.; Chagas, M. L. das; Leão, I. C.; Oliveira, G. Pereira de; Silva, R. Rodrigues da; Roque, S.; Oliveira, L. L. A. de; Silva, D. Freire da; De Medeiros, J. R., E-mail: renan@dfte.ufrn.br [Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Campus Universitário, Natal RN (Brazil)

    2015-07-10

    Rapidly rotating giant stars are relatively rare and may represent important stages of stellar evolution, resulting from stellar coalescence of close binary systems or accretion of substellar companions by their host stars. In the present Letter, we report 17 giant stars, observed within the scope of the Kepler space mission, that exhibit rapid rotation behavior. For the first time, the abnormal rotational behavior of this puzzling family of stars is revealed by direct measurements of rotation, namely from photometric rotation periods, which are very short, with values ranging from 13 to 55 days. This finding points to remarkable surface rotation rates, up to 18 times the rotation of the Sun. These giants are combined with six others recently listed in the literature for mid-infrared (IR) diagnostics based on Wide-field Infrared Survey Explorer information, from which a trend for an IR excess is revealed for at least one-half of the stars, but at a level far lower than the dust excess emission shown by planet-bearing main-sequence stars.
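
    The quoted factor refers to surface (equatorial) rotation rate rather than period alone; the back-of-the-envelope Python check below, which assumes a representative giant radius of 10 solar radii (my assumption, not a value from the Letter), shows how a 13-day period can correspond to a surface velocity roughly 20 times the Sun's, of the same order as the ~18x quoted.

        import math

        R_SUN_KM = 6.957e5
        DAY_S = 86400.0

        def equatorial_velocity_kms(period_days, radius_rsun):
            # v = 2 * pi * R / P
            return 2.0 * math.pi * radius_rsun * R_SUN_KM / (period_days * DAY_S)

        v_sun = equatorial_velocity_kms(25.4, 1.0)     # ~2 km/s for the Sun
        v_giant = equatorial_velocity_kms(13.0, 10.0)  # assumed 10 R_sun giant, 13-day period
        print(f"Sun ~{v_sun:.1f} km/s, giant ~{v_giant:.1f} km/s, ratio ~{v_giant / v_sun:.0f}x")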

  12. Extended mission life support systems

    Science.gov (United States)

    Quattrone, P. D.

    1985-01-01

    Extended manned space missions which include interplanetary missions require regenerative life support systems. Manned mission life support considerations are placed in perspective, and previous manned space life support system technology, activities, and accomplishments in current supporting research and technology (SR&T) programs are reviewed. The life support subsystem/system technologies required for an enhanced duration orbiter (EDO) and a space operations center (SOC), regenerative life support functions and technology required for manned interplanetary flight vehicles, and future development requirements are outlined. The Space Shuttle Orbiter's (Space Transportation System) cabin atmosphere is maintained at the Earth-ambient pressure of 14.7 psia (20% O2 and 80% N2). The early Shuttle flights will be seven-day flights, and the life support system flight hardware will still utilize expendables.

  13. MISSION AMONG THE JEWS 1. INTRODUCTION

    African Journals Online (AJOL)

    The author discusses whether the issue of mission among the Jews deals with the basic question of mission or whether it is the core of the Christian faith. Although both Jews and Christians reject the idea and (more so) mission among the Jews, the author strongly argues for its need, for mission is not the expansion of ideas ...

  14. The Asteroid Impact Mission (AIM)

    Science.gov (United States)

    Küppers, M.; Carnelli, I.; Galvez, A.; Mellab, K.; Michel, P.; AIM Team

    2015-10-01

    The Asteroid Impact Mission (AIM) is ESA's contribution to an international cooperation targeting the demonstration of deflection of a hazardous near-Earth asteroid as well as the first in-depth investigation of a binary asteroid. After launch in 2020, AIM will rendezvous with the binary near-Earth asteroid (65803) Didymos in 2022 and observe the system before, during, and after the impact of NASA's Double Asteroid Redirection Test (DART) spacecraft. The AIM mission will test new technologies like optical telecommunications by laser and Cubesats with nano-payloads, and will perform scientific measurements at the asteroid system.

  15. NASA Facts, The Viking Mission.

    Science.gov (United States)

    National Aeronautics and Space Administration, Washington, DC. Educational Programs Div.

    Presented is one of a series of publications of National Aeronautics and Space Administration (NASA) facts about the exploration of Mars. The Viking mission to Mars, consisting of two unmanned NASA spacecraft launched in August and September, 1975, is described. A description of the spacecraft and their paths is given. A diagram identifying the…

  16. Gravitational-wave Mission Study

    Science.gov (United States)

    Mcnamara, Paul; Jennrich, Oliver; Stebbins, Robin T.

    2014-01-01

    In November 2013, ESA selected the science theme, the "Gravitational Universe," for its third large mission opportunity, known as L3, under its Cosmic Vision Programme. The planned launch date is 2034. ESA is considering a 20% participation by an international partner, and NASA's Astrophysics Division has indicated an interest in participating. We have studied the design consequences of a NASA contribution, evaluated the science benefits, and identified the technology requirements for hardware that could be delivered by NASA. The European community proposed a strawman mission concept, called eLISA, having two measurement arms, derived from the well-studied LISA (Laser Interferometer Space Antenna) concept. The US community is promoting a mission concept known as SGO Mid (Space-based Gravitational-wave Observatory Mid-sized), a three-arm LISA-like concept. If NASA were to partner with ESA, the eLISA concept could be transformed to SGO Mid by the addition of a third arm, augmenting science, reducing risk, and reducing non-recurring engineering costs. The characteristics of the mission concepts and the relative science performance of eLISA, SGO Mid, and LISA are described. Note that all results are based on models, methods, and assumptions used in NASA studies.

  17. The Europa Ocean Discovery mission

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, B.C. [Los Alamos National Lab., NM (United States)]; Chyba, C.F. [Univ. of Arizona, Tucson, AZ (United States)]; Abshire, J.B. [National Aeronautics and Space Administration, Greenbelt, MD (United States). Goddard Space Flight Center]; [and others]

    1997-06-01

    Since it was first proposed that tidal heating of Europa by Jupiter might lead to liquid water oceans below Europa's ice cover, there has been speculation over the possible exobiological implications of such an ocean. Liquid water is the essential ingredient for life as it is known, and the existence of a second water ocean in the Solar System would be of paramount importance for seeking the origin and existence of life beyond Earth. The authors present here a Discovery-class mission concept (Europa Ocean Discovery) to determine the existence of a liquid water ocean on Europa and to characterize Europa's surface structure. The technical goal of the Europa Ocean Discovery mission is to study Europa with an orbiting spacecraft. This goal is challenging but entirely feasible within the Discovery envelope. There are four key challenges: entering Europan orbit, generating power, surviving long enough in the radiation environment to return valuable science, and completing the mission within the Discovery program's launch vehicle and budget constraints. The authors will present here a viable mission that meets these challenges.

  18. Modelling of the MICROSCOPE Mission

    Science.gov (United States)

    Bremer, Stefanie; List, Meike

    2010-03-01

    The French space mission MICROSCOPE aims at testing the Equivalence Principle (EP) up to an accuracy of 10^-15. The experiment will be carried out on a satellite which is developed and produced within the CNES Myriade series. The measuring accuracy will be achieved by means of two high-precision capacitive differential accelerometers that are built by the French institute ONERA, see Touboul and Rodrigues (Class. Quantum Gravity 18:2487-2498, 2001). At ZARM, which is a member of the science team, the data evaluation process is prepared. Therefore, a comprehensive simulation of the real system, including the science signal and all error sources, is built for the development and testing of data reduction and data analysis algorithms to extract the EP violation signal. Currently, the ZARM Drag-Free simulator, a tool to support mission modelling, is being adapted for the MICROSCOPE mission in order to simulate test mass and satellite dynamics. Models of environmental disturbances like solar radiation pressure are also considered. Additionally, detailed modelling of the on-board capacitive sensors is done. The actual status of the mission modelling will be presented. Particularly, the modelling of disturbance forces will be discussed in detail.

  19. Mission and ethics in Galatians

    African Journals Online (AJOL)

    Test

    2010-09-30

    Sep 30, 2010 ... 10.4102/hts.v67i1.896. In this article, it is investigated how the concepts identity, ethics and ethos interrelate, and ... who speak of mission in Galatians, should speak about the role of identity, ethics and ethos in the letter. ...... the perspective of speech act theory, it could be argued that Paul not only spoke.

  20. SpinSat Mission Overview

    Science.gov (United States)

    2013-09-01

    of surface treatments. The exterior of each spacecraft has on it a beach-ball-type pattern consisting of gold iridite and black anodize type 2 class...cargo allotment on the SpaceX Dragon spacecraft launched by the SpaceX Falcon 9 two-stage-to-orbit launch vehicle during the SPX-4 resupply mission

  1. The Europa Clipper Mission Concept

    Science.gov (United States)

    Pappalardo, Robert; Goldstein, Barry; Magner, Thomas; Prockter, Louise; Senske, David; Paczkowski, Brian; Cooke, Brian; Vance, Steve; Wes Patterson, G.; Craft, Kate

    2014-05-01

    A NASA-appointed Science Definition Team (SDT), working closely with a technical team from the Jet Propulsion Laboratory (JPL) and the Applied Physics Laboratory (APL), recently considered options for a future strategic mission to Europa, with the stated science goal: Explore Europa to investigate its habitability. The group considered several mission options, which were fully technically developed, then costed and reviewed by technical review boards and planetary science community groups. There was strong convergence on a favored architecture consisting of a spacecraft in Jupiter orbit making many close flybys of Europa, concentrating on remote sensing to explore the moon. Innovative mission design would use gravitational perturbations of the spacecraft trajectory to permit flybys at a wide variety of latitudes and longitudes, enabling globally distributed regional coverage of the moon's surface, with nominally 45 close flybys at altitudes from 25 to 100 km. We will present the science and reconnaissance goals and objectives, a mission design overview, and the notional spacecraft for this concept, which has become known as the Europa Clipper. The Europa Clipper concept provides a cost-efficient means to explore Europa and investigate its habitability, through understanding the satellite's ice and ocean, composition, and geology. The set of investigations derived from the Europa Clipper science objectives traces to a notional payload for science, consisting of: Ice Penetrating Radar (for sounding of ice-water interfaces within and beneath the ice shell), Topographical Imager (for stereo imaging of the surface), ShortWave Infrared Spectrometer (for surface composition), Neutral Mass Spectrometer (for atmospheric composition), Magnetometer and Langmuir Probes (for inferring the satellite's induction field to characterize an ocean), and Gravity Science (to confirm an ocean). The mission would also include the capability to perform reconnaissance for a future lander.

  2. Introduction: The Cluster mission

    Directory of Open Access Journals (Sweden)

    C. P. Escoubet

    2001-09-01

    Full Text Available The Cluster mission, ESA’s first cornerstone project, together with the SOHO mission, dating back to the first proposals in 1982, was finally launched in the summer of 2000. On 16 July and 9 August, respectively, two Russian Soyuz rockets blasted off from the Russian cosmodrome in Baikonour to deliver two Cluster spacecraft, each into their proper orbit. By the end of August 2000, the four Cluster satellites had reached their final tetrahedral constellation. The commissioning of 44 instruments, both individually and as an ensemble of complementary tools, was completed five months later to ensure the optimal use of their combined observational potential. On 1 February 2001, the mission was declared operational. The main goal of the Cluster mission is to study the small-scale plasma structures in three dimensions in key plasma regions, such as the solar wind, bow shock, magnetopause, polar cusps, magnetotail and the auroral zones. With its unique capabilities of three-dimensional spatial resolution, Cluster plays a major role in the International Solar Terrestrial Program (ISTP), where Cluster and the Solar and Heliospheric Observatory (SOHO) are the European contributions. Cluster’s payload consists of state-of-the-art plasma instrumentation to measure electric and magnetic fields from the quasi-static up to high frequencies, and electron and ion distribution functions from energies of nearly 0 eV to a few MeV. The science operations are coordinated by the Joint Science Operations Centre (JSOC), at the Rutherford Appleton Laboratory (UK), and implemented by the European Space Operations Centre (ESOC), in Darmstadt, Germany. A network of eight national data centres has been set up for raw data processing, for the production of physical parameters, and their distribution to end users all over the world. The latest information on the Cluster mission can be found at http://sci.esa.int/cluster/.

  3. Rapid Prototyping Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The ARDEC Rapid Prototyping (RP) Laboratory was established in December 1992 to provide low cost RP capabilities to the ARDEC engineering community. The Stratasys,...

  4. Bomber Deterrence Missions: Criteria To Evaluate Mission Effectiveness

    Science.gov (United States)

    2016-02-16

    International Relations from Brigham Young University and a Master of Science degree in Aviation Safety Management from the University of Central...Africa.2 The next year, General LeMay sent “twenty one B-47 wings on practice missions over the North Pole : eight million combat-capable miles made...effective way to persuade the enemy into believing an attack will be unsuccessful. A review of two case studies, North Korea in 2013 and Russia’s

  5. Towards the Development of a Defensive Cyber Damage and Mission Impact Methodology

    Science.gov (United States)

    2007-03-01

    understand the impact and make the right decisions for recovery and mission operations (Lala and Panda 2000, p. 300). Mission success can, and often does...assessment concerned primarily with rapid system restoration issues (Lala and Panda 2000, p. 300). The Air Force Computer Emergency Response Team (AFCERT...networks. More importantly for the goal of this research, a central part of the IBRM approach focuses on return on IT investment, with securing

  6. Emirates Mars Mission (EMM) Overview

    Science.gov (United States)

    Sharaf, Omran; Amiri, Sarah; AlMheiri, Suhail; Alrais, Adnan; Wali, Mohammad; AlShamsi, Zakareyya; AlQasim, Ibrahim; AlHarmoodi, Khuloud; AlTeneiji, Nour; Almatroushi, Hessa; AlShamsi, Maryam; AlAwadhi, Mohsen; McGrath, Michael; Withnell, Pete; Ferrington, Nicolas; Reed, Heather; Landin, Brett; Ryan, Sean; Pramann, Brian

    2017-04-01

    The United Arab Emirates (UAE) has entered the space exploration race with the announcement of the Emirates Mars Mission (EMM), the first Arab Islamic mission to another planet, in 2014. Through this mission, the UAE will send an unmanned probe, called the Hope probe, to be launched in summer 2020 and reach Mars by 2021 to coincide with the UAE's 50th anniversary. Through a sequence of subsequent maneuvers, the spacecraft will enter a large science orbit that has a periapsis altitude of 20,000 km, an apoapsis altitude of 43,000 km, and an inclination of 25 degrees. The mission is designed to (1) characterize the state of the Martian lower atmosphere on global scales and its geographic, diurnal and seasonal variability, (2) correlate rates of thermal and photochemical atmospheric escape with conditions in the collisional Martian atmosphere, and (3) characterize the spatial structure and variability of key constituents in the Martian exosphere. These objectives will be met by four investigations, resolving diurnal variability on sub-seasonal timescales: (1) determining the three-dimensional thermal state of the lower atmosphere, (2) determining the geographic and diurnal distribution of key constituents in the lower atmosphere, (3) determining the abundance and spatial variability of key neutral species in the thermosphere, and (4) determining the three-dimensional structure and variability of key species in the exosphere. EMM will collect this information about Martian atmospheric circulation and connections through a combination of three distinct instruments that image Mars at visible, thermal infrared and ultraviolet wavelengths: the Emirates eXploration Imager (EXI), the Emirates Mars InfraRed Spectrometer (EMIRS), and the EMM Mars Ultraviolet Spectrometer (EMUS). EMM has passed its Mission Concept Review (MCR), System Requirements Review (SRR), System Design Review (SDR), and Preliminary Design Review (PDR) phases. The mission is led by Emiratis from Mohammed
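
    For reference, the quoted 20,000 km by 43,000 km science orbit can be turned into orbital elements with basic two-body relations; the short Python sketch below uses standard values for Mars' radius and gravitational parameter (constants I supply for illustration, not values from the mission description).

        import math

        MU_MARS = 4.2828e4   # km^3/s^2, standard gravitational parameter of Mars
        R_MARS = 3396.2      # km, equatorial radius

        def science_orbit(peri_alt_km, apo_alt_km):
            rp, ra = R_MARS + peri_alt_km, R_MARS + apo_alt_km
            a = 0.5 * (rp + ra)                # semi-major axis
            e = (ra - rp) / (ra + rp)          # eccentricity
            period_h = 2.0 * math.pi * math.sqrt(a**3 / MU_MARS) / 3600.0
            return a, e, period_h

        a, e, period_h = science_orbit(20000.0, 43000.0)
        print(f"a = {a:.0f} km, e = {e:.2f}, period = {period_h:.0f} h")  # roughly a 55-hour orbit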

  7. ESA's GOCE gravity gradiometer mission

    Science.gov (United States)

    Touboul, Pierre

    2010-02-01

    In the present decade, three space gravity missions, CHAMP, GRACE and GOCE, have provided unique information about mass and mass redistribution in the Earth system, with a wide range of scientific returns such as global ocean circulation, ice mass balance, glacial isostatic adjustment, and continental ground water storage. On board the four satellites of these missions, similar electrostatic space inertial sensors have delivered continuously, for nearly nine years on the oldest, the accurate acceleration data needed for the missions. The sensor operation relies on the six-axis electrostatic suspension of one solid metallic proof mass, which is servo-controlled motionless at the centre of a highly stable set of gold-coated silica electrode plates. All degrees of freedom are measured with very sensitive capacitive sensors, down to a few picometres, and the applied electrostatic forces down to piconewtons. With similar sensor design and technologies, full-scale range and resolution can be adjusted according to the satellite environment and the mission requirements. The CHAMP and GRACE accelerometers have demonstrated their in-orbit performance. They provide measurements of the non-gravitational surface forces on the satellite, like atmospheric drag and radiation pressure, in order to extract the effects of gravity anomalies from the measured fluctuations of the satellite's orbital position and velocity. The six GOCE accelerometers compose the three-axis gradiometer, combined with high-low satellite-to-satellite GPS tracking to provide higher precision and resolution of the Earth's static field. They also contribute to the satellite attitude control and drag compensation system, allowing the sun-synchronous orbit at the very low altitude of 260 km. The accelerometers are therefore designed to exhibit a full range of 6.5 x 10^-6 m s^-2 and a resolution of 2 x 10^-12 m s^-2 Hz^-1/2. Since the gradiometer was switched on in April 2009, it has delivered data leading to the components of the gravity gradient tensor. The main characteristics of the GOCE accelerometers and
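
    The gradiometer principle reduces to differencing accelerometer readings over a known baseline; the short sketch below (illustrative Python only, with an invented differential signal and an assumed half-metre baseline) converts a differential acceleration into a gravity gradient expressed in Eotvos units.

        def gravity_gradient_eotvos(delta_accel_ms2, baseline_m):
            # One gradient component from a differenced accelerometer pair:
            # gradient = delta_a / baseline, expressed in Eotvos (1 E = 1e-9 s^-2).
            return (delta_accel_ms2 / baseline_m) / 1e-9

        # Invented example: a 5e-10 m/s^2 differential signal across a 0.5 m baseline
        print(gravity_gradient_eotvos(5e-10, 0.5))   # -> 1.0 E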

  8. White Label Space GLXP Mission

    Science.gov (United States)

    Barton, A.

    2012-09-01

    This poster presents a lunar surface mission concept and corresponding financing approach developed by the White Label Space team, an official competitor in the Google Lunar X PRIZE. The White Label Space team's origins were in the European Space Agency's ESTEC facility in the Netherlands. Accordingly the team's technical headquarters are located just outside ESTEC in the Space Business Park. The team has active partners in Europe, Japan and Australia. The team's goal is to provide a unique publicity opportunity for global brands to land on the moon and win the prestigious Google Lunar X PRIZE. The poster presents the main steps to achieve this goal, the cost estimates for the mission, describes the benefits to the potential sponsors and supporters, and details the progress achieved to date.

  9. Java Mission Evaluation Workstation System

    Science.gov (United States)

    Pettinger, Ross; Watlington, Tim; Ryley, Richard; Harbour, Jeff

    2006-01-01

    The Java Mission Evaluation Workstation System (JMEWS) is a collection of applications designed to retrieve, display, and analyze both real-time and recorded telemetry data. This software is currently being used by both the Space Shuttle Program (SSP) and the International Space Station (ISS) program. JMEWS was written in the Java programming language to satisfy the requirement of platform independence. An object-oriented design was used to satisfy additional requirements and to make the software easily extendable. By virtue of its platform independence, JMEWS can be used on the UNIX workstations in the Mission Control Center (MCC) and on office computers. JMEWS includes an interactive editor that allows users to easily develop displays that meet their specific needs. The displays can be developed and modified while viewing data. By simply selecting a data source, the user can view real-time, recorded, or test data.

  10. LISA Pathfinder: mission and status

    Energy Technology Data Exchange (ETDEWEB)

    Antonucci, F; Cavalleri, A; Congedo, G [Dipartimento di Fisica, Universita di Trento and INFN, Gruppo Collegato di Trento, 38050 Povo, Trento (Italy); Armano, M [European Space Astronomy Centre, European Space Agency, Villanueva de la Canada, 28692 Madrid (Spain); Audley, H; Bogenstahl, J; Danzmann, K [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik und Universitaet Hannover, 30167 Hannover (Germany); Auger, G; Binetruy, P [APC UMR7164, Universite Paris Diderot, Paris (France); Benedetti, M [Dipartimento di Ingegneria dei Materiali e Tecnologie Industriali, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Boatella, C [CNES, DCT/AQ/EC, 18 Avenue Edouard Belin, 31401 Toulouse, Cedex 9 (France); Bortoluzzi, D; Bosetti, P; Cristofolini, I [Dipartimento di Ingegneria Meccanica e Strutturale, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Caleno, M; Cesa, M [European Space Technology Centre, European Space Agency, Keplerlaan 1, 2200 AG Noordwijk (Netherlands); Chmeissani, M [IFAE, Universitat Autonoma de Barcelona, E-08193 Bellaterra, Barcelona (Spain); Ciani, G [Department of Physics, University of Florida, Gainesville, FL 32611-8440 (United States); Conchillo, A [ICE-CSIC/IEEC, Facultat de Ciencies, E-08193 Bellaterra, Barcelona (Spain); Cruise, M, E-mail: Paul.McNamara@esa.int [Department of Physics and Astronomy, University of Birmingham, Birmingham (United Kingdom)

    2011-05-07

    LISA Pathfinder, the second of the European Space Agency's Small Missions for Advanced Research in Technology (SMART), is a dedicated technology demonstrator for the joint ESA/NASA Laser Interferometer Space Antenna (LISA) mission. The technologies required for LISA are many and extremely challenging. This, coupled with the fact that some flight hardware cannot be fully tested on the ground due to Earth-induced noise, led to the implementation of the LISA Pathfinder mission to test the critical LISA technologies in a flight environment. LISA Pathfinder essentially mimics one arm of the LISA constellation by shrinking the 5 million kilometre armlength down to a few tens of centimetres, giving up the sensitivity to gravitational waves, but keeping the measurement technology: the distance between the two test masses is measured using a laser interferometric technique similar to one aspect of the LISA interferometry system. The scientific objective of the LISA Pathfinder mission thus consists of the first in-flight test of low-frequency gravitational wave detection metrology. LISA Pathfinder is due to be launched in 2013 on-board a dedicated small launch vehicle (VEGA). After a series of apogee raising manoeuvres using an expendable propulsion module, LISA Pathfinder will enter a transfer orbit towards the first Sun-Earth Lagrange point (L1). After separation from the propulsion module, the LPF spacecraft will be stabilized using the micro-Newton thrusters, entering a 500 000 km by 800 000 km Lissajous orbit around L1. Science results will be available approximately 2 months after launch.

  11. The Van Allen Probes mission

    CERN Document Server

    Burch, James

    2014-01-01

    This collection of articles provides broad and detailed information about NASA’s Van Allen Probes (formerly known as the Radiation Belt Storm Probes) twin-spacecraft Earth-orbiting mission. The mission has the objective of achieving predictive understanding of the dynamic, intense, energetic, dangerous, and presently unpredictable belts of energetic particles that are magnetically trapped in Earth’s space environment above the atmosphere. It documents the science of the radiation belts and the societal benefits of achieving predictive understanding. Detailed information is provided about the Van Allen Probes mission design, the spacecraft, the science investigations, and the onboard instrumentation that must all work together to make unprecedented measurements within a most unforgiving environment, the core of Earth’s most intense radiation regions.
 This volume is aimed at graduate students and researchers active in space science, solar-terrestrial interactions and studies of the up...

  12. ACTS FOR TODAY'S MISSIONAL CHURCH

    African Journals Online (AJOL)

    2010-01-30

    Jan 30, 2010 ... Keywords: Acts; Bevans and Schroeder; contextualisation; discernment; globalisation; hospitality to strangers; inclusivity; missio Dei; missional church; missional listening; prophetic dialogue; resolving conflict. Dates: Received: 26 Aug. 2009.

  13. Low Energy Mission Planning Toolbox Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Low Energy Mission Planning Toolbox is designed to significantly reduce the resources and time spent on designing missions in multi-body gravitational...

  14. Apollo 13 Facts [Post Mission Honorary Ceremony

    Science.gov (United States)

    2001-01-01

    The Apollo 13 astronauts, James Lovell, Jr., John Swigert, Jr., and Fred Haise, Jr., are seen during this post mission honorary ceremony, led by President Richard Nixon. Lovell is shown during an interview, answering questions about the mission.

  15. UAV Mission Planning: From Robust to Agile

    NARCIS (Netherlands)

    Evers, L.; Barros, A.I.; Monsuur, H.; Wagelmans, A.

    2015-01-01

    Unmanned Aerial Vehicles (UAVs) are important assets for information gathering in Intelligence Surveillance and Reconnaissance (ISR) missions. Depending on the uncertainty in the planning parameters, the complexity of the mission and its constraints and requirements, different planning methods might

  16. SWARM - An earth Observation Mission investigating Geospace

    DEFF Research Database (Denmark)

    Friis-Christensen, Eigil; Lühr, H.; Knudsen, D.

    2008-01-01

    The Swarm mission was selected as the 5th mission in ESA's Earth Explorer Programme in 2004. This mission aims at measuring the Earth's magnetic field with unprecedented accuracy. This will be done by a constellation of three satellites, where two will fly at lower altitude, measuring the gradient...... of the Swarm science objectives, the mission concept, the scientific instrumentation, and the expected contribution to the ILWS programme will be summarized. (C) 2007 Published by Elsevier Ltd on behalf of COSPAR....

  17. CHEOPS: A transit photometry mission for ESA's small mission programme

    Directory of Open Access Journals (Sweden)

    Queloz D.

    2013-04-01

    Full Text Available Ground-based radial velocity (RV) searches continue to discover exoplanets below Neptune mass down to Earth mass. Furthermore, ground-based transit searches now reach milli-mag photometric precision and can discover Neptune-size planets around bright stars. These searches will find exoplanets around bright stars anywhere on the sky, their discoveries representing prime science targets for further study due to the proximity and brightness of their host stars. A mission for transit follow-up measurements of these prime targets is currently lacking. The first ESA S-class mission CHEOPS (CHaracterizing ExoPlanet Satellite) will fill this gap. It will perform ultra-high precision photometric monitoring of selected bright target stars almost anywhere on the sky with sufficient precision to detect Earth-sized transits. It will be able to detect transits of RV-planets by photometric monitoring if the geometric configuration results in a transit. For Hot Neptunes discovered from the ground, CHEOPS will be able to improve the transit light curve so that the radius can be determined precisely. Because of the host stars' brightness, high precision RV measurements will be possible for all targets. All planets observed in transit by CHEOPS will be validated and their masses will be known. This will provide valuable data for constraining the mass-radius relation of exoplanets, especially in the Neptune-mass regime. During the planned 3.5 year mission, about 500 targets will be observed. There will be 20% of open time available for the community to develop new science programmes.
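
    The photometric precision requirement follows directly from the transit depth, (Rp/Rstar)^2; the small Python example below (standard planetary and solar radii, my own illustration rather than a CHEOPS calculation) shows why Earth-sized transits of Sun-like stars demand precision at the level of tens of parts per million.

        R_SUN_KM, R_EARTH_KM, R_NEPTUNE_KM = 695700.0, 6371.0, 24622.0

        def transit_depth_ppm(planet_radius_km, star_radius_km=R_SUN_KM):
            # Fractional flux drop during a central transit: (Rp / Rstar)^2
            return (planet_radius_km / star_radius_km) ** 2 * 1e6

        print(f"Earth-size planet:   {transit_depth_ppm(R_EARTH_KM):.0f} ppm")    # ~84 ppm
        print(f"Neptune-size planet: {transit_depth_ppm(R_NEPTUNE_KM):.0f} ppm")  # ~1250 ppm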

  18. Rapid Airplane Parametric Input Design (RAPID)

    Science.gov (United States)

    Smith, Robert E.

    1995-01-01

    RAPID is a methodology and software system to define a class of airplane configurations and directly evaluate surface grids, volume grids, and grid sensitivity on and about the configurations. A distinguishing characteristic which separates RAPID from other airplane surface modellers is that the output grids and grid sensitivity are directly applicable in CFD analysis. A small set of design parameters and grid control parameters govern the process, which is incorporated into interactive software for 'real time' visual analysis and into batch software for the application of optimization technology. The computed surface grids and volume grids are suitable for a wide range of Computational Fluid Dynamics (CFD) simulations. The general airplane configuration has wing, fuselage, horizontal tail, and vertical tail components. The double-delta wing and tail components are manifested by solving a fourth-order partial differential equation (PDE) subject to Dirichlet and Neumann boundary conditions. The design parameters are incorporated into the boundary conditions and therefore govern the shapes of the surfaces. The PDE solution yields a smooth transition between boundaries. Surface grids suitable for CFD calculation are created by establishing an H-type topology about the configuration and incorporating grid spacing functions in the PDE for the lifting components and the fuselage definition equations. User-specified grid parameters govern the location and degree of grid concentration. A two-block volume grid about a configuration is calculated using the Control Point Form (CPF) technique. The interactive software, which runs on Silicon Graphics IRIS workstations, allows design parameters to be continuously varied and the resulting surface grid to be observed in real time. The batch software computes both the surface and volume grids and also computes the sensitivity of the output grid with respect to the input design parameters by applying the precompiler tool
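
    To illustrate the idea of shaping a surface by imposing design parameters as boundary conditions on a fourth-order PDE, the sketch below solves the 1D analogue u'''' = 0 with Dirichlet (value) and Neumann (slope) data at each end. This is a simplified stand-in for the concept, not the RAPID formulation itself, and the profile values are invented.

        import numpy as np

        def biharmonic_1d_profile(y0, s0, y1, s1, n=21):
            # Solve u'''' = 0 on [0, 1] with Dirichlet data (u = y0, y1) and Neumann
            # data (u' = s0, s1) at the two ends.  The solution is the cubic Hermite
            # interpolant, so the boundary 'design parameters' fully shape the curve.
            x = np.linspace(0.0, 1.0, n)
            h00 = 2 * x**3 - 3 * x**2 + 1
            h10 = x**3 - 2 * x**2 + x
            h01 = -2 * x**3 + 3 * x**2
            h11 = x**3 - x**2
            return x, y0 * h00 + s0 * h10 + y1 * h01 + s1 * h11

        # Invented profile: zero value with positive slope at one end, a plateau at the other
        x, u = biharmonic_1d_profile(y0=0.0, s0=0.3, y1=0.06, s1=0.0)
        print(np.round(u, 4))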

  19. Telecentre Network Startup : Bangladesh - Mission 2011 | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The second generation of telecentres has seen the emergence of national-level networks in various parts of the world, including the Ugandan Telecentre Network, Mission 2007 in India and Mission Swaabhimaan in Nepal. Telecentre stakeholders in Bangladesh would like to replicate the methodology used in Mission 2007, ...

  20. Kepler's Third Law and NASA's "Kepler Mission"

    Science.gov (United States)

    Gould, Alan; Komatsu, Toshi; DeVore, Edna; Harman, Pamela; Koch, David

    2015-01-01

    NASA's "Kepler Mission" has been wildly successful in discovering exoplanets. This paper summarizes the mission goals, briefly explains the transit method of finding exoplanets and design of the mission, provides some key findings, and describes useful education materials available at the "Kepler" website.

  1. Telecentre Network Startup : Bangladesh - Mission 2011 | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The second generation of telecentres has seen the emergence of national-level networks in various parts of the world, including the Ugandan Telecentre Network, Mission 2007 in India and Mission Swaabhimaan in Nepal. Telecentre stakeholders in Bangladesh would like to replicate the methodology used in Mission 2007, ...

  2. Rapid shallow breathing

    Science.gov (United States)

    ... the smallest air passages of the lungs in children (bronchiolitis); pneumonia or other lung infection; transient tachypnea of the newborn; anxiety and panic; other serious lung disease. Home Care: Rapid, shallow breathing should not be treated at home. It is ...

  3. Rapid Strep Test

    Science.gov (United States)

    ... worse than normal. Your first thoughts turn to strep throat. A rapid strep test in your doctor’s office ... your suspicions. Viruses cause most sore throats. However, strep throat is an infection caused by the Group A ...

  4. Overview of EXIST mission science and implementation

    Science.gov (United States)

    Grindlay, J.; Gehrels, N.; Bloom, J.; Coppi, P.; Soderberg, Al.; Hong, J.; Allen, B.; Barthelmy, S.; Tagliaferri, G.; Moseley, H.; Kutyrev, A.; Fabbiano, G.; Fishman, G.; Ramsey, B.; Della Ceca, R.; Natalucci, L.; Ubertini, P., III

    2010-07-01

    The Energetic X-ray Imaging Survey Telescope (EXIST) is designed to i) use the birth of stellar mass black holes, as revealed by cosmic Gamma-Ray Bursts (GRBs), as probes of the very first stars and galaxies to exist in the Universe. Both their extreme luminosity (~10^4 times larger than the most luminous quasars) and their hard X-ray detectability over the full sky with wide-field imaging make them ideal "back-lights" to measure cosmic structure with X-ray, optical and near-IR (nIR) spectra over many sight lines to high redshift. The full-sky imaging detection and rapid follow-up narrow-field imaging and spectroscopy allow two additional primary science objectives: ii) novel surveys of supermassive black holes (SMBHs) accreting as very luminous but rare quasars, which can trace the birth and growth of the first SMBHs as well as quiescent SMBHs (non-accreting) which reveal their presence by X-ray flares from the tidal disruption of passing field stars; and iii) a multiwavelength Time Domain Astrophysics (TDA) survey to measure the temporal variability and physics of a wide range of objects, from birth to death of stars and from the thermal to non-thermal Universe. These science objectives are achieved with the telescopes and mission as proposed for EXIST described here.

  5. RAPID3? Aptly named!

    Science.gov (United States)

    Berthelot, J-M

    2014-01-01

    The RAPID3 score is the sum of three 0-10 patient self-report scores: pain, functional impairment on the MDHAQ, and patient global estimate. It requires 5 seconds for scoring and can be used in all rheumatologic conditions, although it has mostly been used in rheumatoid arthritis, where cutoffs for low disease activity (12/30) have been set. A RAPID3 score of ≤ 3/30 with 1 or 0 swollen joints (RAPID3 ≤ 3 + ≤ SJ1) provides remission criteria comparable to Boolean, SDAI, CDAI, and DAS28 remission criteria, in far less time than a formal joint count. RAPID3 performs as well as the DAS28 in separating active drugs from placebos in clinical trials. RAPID3 also predicts subsequent structural disease progression. RAPID3 can be determined at short intervals at home, allowing determination of the area under the curve of disease activity between two visits and detection of flares. However, RAPID3 should not be seen as a substitute for the DAS28 and face-to-face visits in routine care. Monitoring patient status with only self-report information, without a rheumatologist's input (including joint and physical examination and consideration of imaging and laboratory tests), may be as undesirable for most patients as joint examination without a patient questionnaire. Conversely, combining the RAPID3 and the DAS28 may provide faster or more sensitive confirmation that a medication is effective. Similarly, better enquiry into patients' most important concerns (pain, functional status, and overall opinion of their disorder) should reinforce patients' confidence in their rheumatologist and treatments.
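
    As a minimal sketch of the arithmetic described above (the score is simply the sum of three 0-10 self-report components, with the quoted near-remission criterion of ≤ 3/30 plus 0-1 swollen joints), the Python fragment below is illustrative only; the validated MDHAQ/RAPID3 scoring sheets remain the authoritative reference.

        # Illustrative RAPID3 arithmetic (not a validated clinical tool).
        def rapid3(pain: float, function: float, patient_global: float) -> float:
            """Sum of three 0-10 patient self-report scores (total 0-30)."""
            for score in (pain, function, patient_global):
                if not 0 <= score <= 10:
                    raise ValueError("each component must be on a 0-10 scale")
            return pain + function + patient_global

        def near_remission(score: float, swollen_joints: int) -> bool:
            """Criterion quoted in the abstract: RAPID3 <= 3/30 with 0 or 1 swollen joints."""
            return score <= 3 and swollen_joints <= 1

        print(rapid3(1.0, 0.5, 1.0))    # 2.5
        print(near_remission(2.5, 1))   # True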

  6. The Ionospheric Connection Explorer Mission: Mission Goals and Design

    Science.gov (United States)

    Immel, T. J.; England, S. L.; Mende, S. B.; Heelis, R. A.; Englert, C. R.; Edelstein, J.; Frey, H. U.; Korpela, E. J.; Taylor, E. R.; Craig, W. W.; Harris, S. E.; Bester, M.; Bust, G. S.; Crowley, G.; Forbes, J. M.; Gérard, J.-C.; Harlander, J. M.; Huba, J. D.; Hubert, B.; Kamalabadi, F.; Makela, J. J.; Maute, A. I.; Meier, R. R.; Raftery, C.; Rochus, P.; Siegmund, O. H. W.; Stephan, A. W.; Swenson, G. R.; Frey, S.; Hysell, D. L.; Saito, A.; Rider, K. A.; Sirk, M. M.

    2018-02-01

    The Ionospheric Connection Explorer, or ICON, is a new NASA Explorer mission that will explore the boundary between Earth and space to understand the physical connection between our world and our space environment. This connection is made in the ionosphere, which has long been known to exhibit variability associated with the sun and solar wind. However, it has been recognized in the 21st century that equally significant changes in ionospheric conditions are apparently associated with energy and momentum propagating upward from our own atmosphere. ICON's goal is to weigh the competing impacts of these two drivers as they influence our space environment. Here we describe the specific science objectives that address this goal, as well as the means by which they will be achieved. The instruments selected, the overall performance requirements of the science payload and the operational requirements are also described. ICON's development began in 2013 and the mission is on track for launch in 2018. ICON is developed and managed by the Space Sciences Laboratory at the University of California, Berkeley, with key contributions from several partner institutions.

  7. Additional Mission Applications for NASA's 13.3-kW Ion Propulsion System

    Science.gov (United States)

    Snyder, John Steven; Manzella, David; Lisman, Doug; Lock, Robert E.; Nicholas, Austin; Woolley, Ryan

    2016-01-01

    NASA's Space Technology Mission Directorate has recently been developing critical technologies for high-power solar electric propulsion (SEP), including large deployable solar array structures and high-power electric propulsion components. An ion propulsion system based on these developments has been considered for many SEP technology demonstration missions, including the Asteroid Redirect Robotic Mission (ARRM) concept. These studies and the high-power SEP technology developments have generated excitement within NASA about the use of the ARRM ion propulsion system design for other types of potential missions. One application of interest is for Mars missions, especially with the types of orbiters now under consideration for flights in the early 2020s to replace the aging Mars Reconnaissance Orbiter. High-power SEP can deliver large payloads to Mars with many additional capabilities, including large orbital plane changes and round-trip missions, compared to chemically propelled spacecraft. Another application for high-power SEP is for exoplanet observation missions, where a large starshade spacecraft would need to be repositioned with respect to its companion telescope relatively frequently and rapidly. SEP is an enabling technology for the ambitious science goals of these types of missions. This paper will discuss the benefits of high-power SEP for these concepts based on the STMD technologies now under development.

  8. Magnetospheric Multiscale (MMS) Mission Status

    Science.gov (United States)

    Moore, T. E.; Black, R.; Burch, J. L.; Hesse, M.; Robertson, B. P.; Spidaliere, P. D.; Pope, S.; Tooley, C. R.; Torbert, R. B.

    2014-12-01

    The MMS mission, with its four fully instrumented reconnection probes, is manifested for launch in March 2015 from Kennedy Space Center (KSC). The initial orbits will be 12 RE geocentric radius by 1200 km altitude at 28° inclination, maneuvered into a resizable tetrahedral formation that will pass through the persistent sites of magnetic reconnection nearest to Earth. The Observatories, each with a suite of instruments, underwent thermal vacuum testing serially beginning in late November 2013, with the final testing completed in July 2014. The Pre-Shipment Review was held in late October 2014 prior to shipment of stacked pairs of Observatories to the launch processing site at KSC (Astrotech). They are now being processed in stacked pairs, pending full stacking as a constellation and installation on the Atlas V Series 421 launch vehicle that will carry them into orbit. Final propulsion functional testing and launch rehearsal operations will be conducted this month. The Science and Engineering Team is preparing for commissioning and early operations immediately after launch by executing Mission Readiness Tests (MRTs) to exercise all systems, including the "Scientist In The Loop" (SITL) system that will provide human oversight of the prioritization of high-resolution data segments for downloading to the ground. The Theory and Modeling team and three Interdisciplinary Science teams continue to develop virtual spacecraft data sets and displays as an aid to identification of features of interest during operations. Phase 1 operations will probe the dayside low-latitude reconnection features, beginning in August 2015, as the constellation moves into the afternoon local time sector. More information is available at http://science.nasa.gov/missions/mms/, http://mms.gsfc.nasa.gov, and other linked sites.

  9. Mission Design of the Dutch-Chinese FAST Micro-Satellite Mission

    NARCIS (Netherlands)

    Maessen, D.C.; Guo, J.; Gill, E.; Laan, E.; Moon, S.; Zheng, G.T.

    2009-01-01

    The paper treats the mission design for the Dutch-Chinese FAST (Formation for Atmospheric Science and Technology demonstration) mission. The space segment of the 2.5-year mission consists of two formation-flying micro-satellites. During the mission, new technologies will be demonstrated and,

  10. Kepler planet-detection mission

    DEFF Research Database (Denmark)

    Borucki, William J.; Koch, David; Buchhave, Lars C. Astrup

    2010-01-01

    The Kepler mission was designed to determine the frequency of Earth-sized planets in and near the habitable zone of Sun-like stars. The habitable zone is the region where planetary temperatures are suitable for water to exist on a planet’s surface. During the first 6 weeks of observations, Kepler...... is one of the lowest-density planets (~0.17 gram per cubic centimeter) yet detected. Kepler-5b, -6b, and -8b confirm the existence of planets with densities lower than those predicted for gas giant planets....

  11. The GeoCarb Mission

    Science.gov (United States)

    Moore, B., III; Crowell, S.

    2016-12-01

    This paper presents a space mission (geoCARB) that would provide measurements of atmospheric carbon dioxide (CO2), methane (CH4), and carbon monoxide (CO) from geostationary orbit. The geoCARB mission would deliver multiple daily maps of column-integrated mixing ratios of CO2, CH4, and CO over the observed landmasses at a spatial resolution of roughly 5 x 8 km, which would establish the scientific basis for CO2 and CH4 flux determination at unprecedented time and space scales. This determination would produce a fundamental change in our scientific understanding of the terrestrial source/sink dynamics of the carbon cycle as well as produce the kind of flux information needed to support international agreements on greenhouse gas emission reductions. The instrument would exploit four spectral regions: the oxygen A-band for pressure and aerosols, the weak and strong bands of CO2 near 1.61 and 2.06 microns, and a region near 2.32 microns for CO and CH4. The O2 and CO2 band selections are very similar to those of the instrument aboard OCO-2, and so we envision OCO-2 in geostationary orbit with the addition of a fourth channel to measure CO and CH4, but without an oceanic capability. The O2 A-band also provides for retrieval of Solar Induced Fluorescence (SIF). The geoCARB mission's persistent fine-scale, mapping-like measurements, multiple times a day under changing conditions, enable significant advances on an important range of CO2 issues: CO2 fertilization, change in primary production because of nitrogen deposition, and the influence of climatic patterns on terrestrial sources and sinks. Similarly, the mission's high space- and time-resolution measurements of CH4 enable important analyses of human impacts via agriculture and industry vs. natural phenomena on methane sources. The geoCARB measurements of CO concentrations and SIF provide essential information for CO2 and CH4 source attribution. For example, CO helps distinguish between biotic fluxes of CO2 and CH4 from fluxes

  12. Asteroseismology with NASA's Kepler Mission

    DEFF Research Database (Denmark)

    Huber, Daniel; Chaplin, W. J.; Christensen-Dalsgaard, J.

    2013-01-01

    The measurement of stellar oscillations - also called asteroseismology - is among the most powerful observational tools to study the structure and evolution of stars. The high precision photometry collected by the Kepler space telescope has revolutionized asteroseismology over the past few years...... by boosting the number of stars with detected oscillations by nearly two orders of magnitude over ground-based efforts, and delivering data with unprecedented signal-to-noise. In this talk I will highlight some of the recent breakthrough discoveries by the Kepler Mission, focusing in particular...

  13. Bion 11 mission: primate experiments

    Science.gov (United States)

    Ilyin, E. A.; Korolkov, V. I.; Skidmore, M. G.; Viso, M.; Kozlovskaya, I. B.; Grindeland, R. E.; Lapin, B. A.; Gordeev, Y. V.; Krotov, V. P.; Fanton, J. W.; et al.

    2000-01-01

    A summary is provided of the major operations required to conduct the wide range of primate experiments on the Bion 11 mission, which flew for 14 days beginning December 24, 1996. Information is given on preflight preparations, including flight candidate selection and training; attachment and implantation of bioinstrumentation; flight and ground experiment designs; onboard life support and test systems; ground and flight health monitoring; flight monkey selection and transport to the launch site; inflight procedures and data collection; postflight examinations and experiments; and assessment of results.

  14. Magnetic Satellite Missions and Data

    DEFF Research Database (Denmark)

    Olsen, Nils; Kotsiaros, Stavros

    2011-01-01

    Although the first satellite observations of the Earth’s magnetic field were already taken more than 50 years ago, continuous geomagnetic measurements from space are only available since 1999. The unprecedented time-space coverage of this recent data set opened revolutionary new possibilities...... for exploring the Earth’s magnetic field from space. In this chapter we discuss characteristics of satellites measuring the geomagnetic field and report on past, present and upcoming magnetic satellite missions. We conclude with some basics about space magnetic gradiometry as a possible path for future...... exploration of Earth’s magnetic field with satellites....

  15. An Advanced In-Situ Resource Utilization (ISRU) Production Plant Design for Robotic and Human Mars Missions

    Science.gov (United States)

    Simon, T.; Baird, R. S.; Trevathan, J.; Clark, L.

    2002-01-01

    The ability to produce the necessary consumables, rather than relying solely on what is brought from Earth, decreases the launch mass, cost, and risk associated with a Mars mission while providing capabilities that enable the commercial development of space. The idea of using natural resources, or "living off the land," is termed In-Situ Resource Utilization (ISRU). Trade studies have shown that producing and utilizing consumables such as water, breathing oxygen, and propellant can reduce the launch mass for a human or robotic mission to Mars by 20-45%. The Johnson Space Center and Lockheed Martin Astronautics are currently designing and planning assembly of a complete collection-to-storage production plant for producing methane (fuel), oxygen, and water from carbon dioxide (Martian atmosphere) and hydrogen (electrolyzed Martian water or Earth-originated), based on lessons learned and design enhancements from a 1st-generation testbed. The design and testing of the major subsystems incorporated in the 2nd-generation system, including a carbon dioxide freezer, Sabatier reactor, water electrolysis unit, and vacuum-jacketed, cryogenic, common-bulkhead storage tank, will be presented in detail with the goal of increasing awareness of the readiness level of these technologies. These technologies are mass- and power-efficient as well as fundamentally simple and reliable. They also have potential uses in Environmental Control and Life Support System (ECLSS) applications for removing and recycling crew-exhaled carbon dioxide. Each subsystem is sized for an ISRU-assisted sample return mission, producing in an 8-hour period 0.56 kg of water and 0.26 kg of methane from the Sabatier reactor and 0.50 kg of oxygen from electrolyzed water. The testing of these technologies to date will be discussed, as well as plans for integrating the subsystems for a complete end-to-end demonstration at Mars conditions. This paper will also address the history of these subsystem
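
    The quoted 8-hour production figures are roughly consistent with Sabatier stoichiometry (CO2 + 4 H2 -> CH4 + 2 H2O). The Python sketch below is an illustrative back-of-the-envelope check, not part of the paper's design data; the feed quantities are derived purely from molar masses.

        # Back-of-the-envelope check of Sabatier stoichiometry: CO2 + 4 H2 -> CH4 + 2 H2O
        M_CH4, M_H2O, M_CO2, M_H2 = 16.04, 18.02, 44.01, 2.016   # g/mol

        ch4_kg = 0.26                              # methane per 8-hour run (from the abstract)
        mol_ch4 = ch4_kg * 1000.0 / M_CH4          # ~16.2 mol
        water_kg = 2 * mol_ch4 * M_H2O / 1000.0    # 2 mol H2O per mol CH4
        co2_kg = mol_ch4 * M_CO2 / 1000.0          # CO2 feed required
        h2_kg = 4 * mol_ch4 * M_H2 / 1000.0        # H2 feed required

        print(round(water_kg, 2))                  # ~0.58 kg, close to the 0.56 kg quoted
        print(round(co2_kg, 2), round(h2_kg, 3))   # ~0.71 kg CO2, ~0.131 kg H2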

  16. RAPID: Collaborative Commanding and Monitoring of Lunar Assets

    Science.gov (United States)

    Torres, Recaredo J.; Mittman, David S.; Powell, Mark W.; Norris, Jeffrey S.; Joswig, Joseph C.; Crockett, Thomas M.; Abramyan, Lucy; Shams, Khawaja S.; Wallick, Michael; Allan, Mark; et al.

    2011-01-01

    RAPID (Robot Application Programming Interface Delegate) software utilizes highly robust technology to facilitate commanding and monitoring of lunar assets. RAPID provides the ability for inter-center communication, since these assets are developed at multiple NASA centers. RAPID is targeted at the task of lunar operations; specifically, operations that deal with robotic assets, cranes, and astronaut spacesuits, often developed at different NASA centers. RAPID allows for a uniform way to command and monitor these assets. Commands can be issued to take images, and monitoring is done via telemetry data from the asset. There are two unique features of RAPID: first, it allows any operator from any NASA center to control any NASA lunar asset, regardless of location. Second, by abstracting the native language for specific assets to a common set of messages, an operator may control and monitor any NASA lunar asset after being trained only on the use of RAPID, rather than on each specific asset. RAPID is easier to use and more powerful than its predecessor, the Astronaut Interface Device (AID). Utilizing the new robust middleware, DDS (Data Distribution Service), development in RAPID has accelerated significantly relative to the old middleware. The API is built upon the Java Eclipse Platform, which, combined with DDS, provides a platform-independent software architecture, simplifying development of RAPID components. As RAPID continues to evolve and new messages are designed and implemented, operators for future lunar missions will have a rich environment for commanding and monitoring assets.
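
    To make the abstraction concrete, a rough sketch of the "common message" idea follows: a single command structure that per-asset adapters translate into asset-native syntax. All class, field, and command names below are invented for illustration and are not the actual RAPID or DDS API.

        # Hypothetical sketch of a common command message with per-asset translation.
        # Names are illustrative only; they are not the real RAPID/DDS interfaces.
        from dataclasses import dataclass, field

        @dataclass
        class CommonCommand:
            asset_id: str                       # e.g. "lunar-rover-1"
            action: str                         # e.g. "TAKE_IMAGE"
            params: dict = field(default_factory=dict)

        class AssetAdapter:
            """Translates a common command into one asset's native command string."""
            def translate(self, cmd: CommonCommand) -> str:
                raise NotImplementedError

        class RoverAdapter(AssetAdapter):
            def translate(self, cmd: CommonCommand) -> str:
                if cmd.action == "TAKE_IMAGE":
                    cam = cmd.params.get("camera", "nav")
                    return f"IMG_ACQ cam={cam}"   # made-up native syntax
                raise ValueError(f"unsupported action: {cmd.action}")

        print(RoverAdapter().translate(CommonCommand("lunar-rover-1", "TAKE_IMAGE")))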

  17. Mission design of a Pioneer Jupiter Orbiter

    Science.gov (United States)

    Friedman, L. D.; Nunamaker, R. R.

    1975-01-01

    The mission analysis and design work performed to define a Pioneer mission to orbit Jupiter is described. This work arose from interaction with a science advisory 'Mission Definition' team and led to the present mission concept. Building on the previous Jupiter Orbiter-Satellite Tour development at JPL, a magnetospheric survey mission concept is developed. The geometric control of orbits that provide extensive local-time coverage of the Jovian system is analyzed and merged with the various science and program objectives. The result is a 'flower-orbit' mission design, yielding three large apoapse excursions at various local times and many interior orbits whose shape and orientation are under continual modification. This orbit design, together with a first orbit defined by delivery of an atmospheric probe, yields a mission of high scientific interest.

  18. Sleep, Circadian Rhythms, and Performance During Space Shuttle Missions

    Science.gov (United States)

    Neri, David F.; Czeisler, Charles A.; Dijk, Derk-Jan; Wyatt, James K.; Ronda, Joseph M.; Hughes, Rod J.

    2003-01-01

    Sleep and circadian rhythms may be disturbed during spaceflight, and these disturbances can affect crewmembers' performance during waking hours. The mechanisms underlying sleep and circadian rhythm disturbances in space are not well understood, and effective countermeasures are not yet available. We investigated sleep, circadian rhythms, cognitive performance, and light-dark cycles in five astronauts prior to, during, and after the 16-day STS-90 mission and the 10-day STS-95 mission. The efficacy of low-dose, alternate-night, oral melatonin administration as a countermeasure for sleep disturbances was evaluated. During these missions, scheduled rest-activity cycles were 20-35 minutes shorter than 24 hours. Light levels on the middeck and in the Spacelab were very low, whereas on the flight deck (which has several windows) they were highly variable. Circadian rhythm abnormalities were observed. During the second half of the missions, the rhythm of urinary cortisol appeared to be delayed relative to the sleep-wake schedule. Performance during wakefulness was impaired. Astronauts slept only about 6.5 hours per day, and subjective sleep quality was lower in space. No beneficial effects of melatonin (0.3 mg administered prior to sleep episodes on alternate nights) were observed. A surprising finding was a marked increase in rapid eye movement (REM) sleep upon return to Earth. We conclude that these Space Shuttle missions were associated with circadian rhythm disturbances, sleep loss, decrements in neurobehavioral performance, and alterations in REM sleep homeostasis. Shorter-than-24-hour rest-activity schedules and exposure to light-dark cycles inadequate for optimal circadian synchronization may have contributed to these disturbances.

  19. Astronaut Brian Duffy, mission commander for the STS-72 mission, prepares to ascend stairs to the

    Science.gov (United States)

    1996-01-01

    STS-72 TRAINING VIEW --- Astronaut Brian Duffy, mission commander for the STS-72 mission, prepares to ascend stairs to the flight deck of the fixed base Shuttle Mission Simulator (SMS) at the Johnson Space Center (JSC). Duffy will be joined by four other NASA astronauts and an international mission specialist aboard the Space Shuttle Endeavour for a scheduled nine-day mission, now set for the winter of this year.

  20. Astronaut Leroy Chiao, assigned as mission specialist for the mission, prepares to ascend stairs to

    Science.gov (United States)

    1996-01-01

    STS-72 TRAINING VIEW --- Astronaut Leroy Chiao, assigned as mission specialist for the mission, prepares to ascend stairs to the flight deck of the fixed base Shuttle Mission Simulator (SMS) at the Johnson Space Center (JSC). Chiao will join an international mission specialist and four other NASA astronauts aboard the Space Shuttle Endeavour for a scheduled nine-day mission, now set for the winter of this year.