WorldWideScience

Sample records for rapid end-to-end mission

  1. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF) a model-based software framework that shall enable seamless continuity of mission design and...

  2. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    Science.gov (United States)

    Lehner, B. A. E.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-11-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an exploration mission in cis-lunar space with a focus on human-robot interactions, called Human Assisted Robotic Vehicle Studies (HARVeSt). HARVeSt will build on knowledge of plant growth in space gained from experiments on board the ISS and test the first growth of plants on the Moon. A planned deep space habitat will be utilised as the base of operations for human-robotic elements of the mission. The mission will serve as a technology demonstrator not only for autonomous tele-operations in cis-lunar space but also for key enabling technologies for future human surface missions. This mission will build on the successful international cooperation model of the ISS. Mission assets such as a modular rover will make the mission extendable and will scout and prepare the area for the start of an international Moon Village.

  3. End-to-End Trade-space Analysis for Designing Constellation Missions

    Science.gov (United States)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

    cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.

  4. End-to-End Performance of the Future MOMA Instrument Aboard the ExoMars Mission

    Science.gov (United States)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Danell, R.; van Amerom, F. H. W.; Freissinet, C.; Glavin, D. P.; Stalport, F.; Arevalo, R. D., Jr.; Coll, P. J.; Steininger, H.; Raulin, F.; Goesmann, F.; Mahaffy, P. R.; Brinckerhoff, W. B.

    2016-12-01

    After the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the future ExoMars mission will continue the search for the organic composition of the Mars surface, with the advantage that the sample will be extracted from as deep as 2 meters below the martian surface to minimize the effects of radiation and oxidation on organic materials. To analyse the wide range of organic compounds (volatile and non-volatile) in the martian soil, MOMA combines UV laser desorption/ionization (LDI) with pyrolysis gas chromatography ion trap mass spectrometry (pyr-GC-ITMS). In order to analyse refractory organic compounds and chirality, samples undergoing GC-ITMS analysis may be subjected to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). To optimize and test the performance of the GC-ITMS instrument, we have performed several coupling test campaigns between the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA, GSFC). The last campaign was performed with the ITU model, which is similar to the flight model and includes the oven and tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J Chrom A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459. Acknowledgements: Funding provided by the Mars Exploration Program (point of contact, George Tahu, NASA/HQ). MOMA is a collaboration between NASA and ESA (PI Goesmann, MPS). The MOMA-GC team acknowledges support from the French Space Agency (CNES), the French National Programme of Planetology (PNP), the National French Council (CNRS), and the Pierre Simon Laplace Institute.

  5. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan

    Data.gov (United States)

    National Aeronautics and Space Administration — The original proposal was to develop an ADR mission strategic plan. However, the task was picked up by the OCT. Subsequently the award was de-scoped to $30K to...

  6. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  7. End to End Travel

    Data.gov (United States)

    US Agency for International Development — E2 Solutions is a web based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  8. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    Science.gov (United States)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel efficient than an all chemical propulsion architecture without significant increases to trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture as well as the result of the higher fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  9. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, being addressed in part by focusing on the specific subsystems' handling of off-nominal mission and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. 
Risk reduction is addressed by working with other organizations such as S

  10. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. 
VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  11. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  12. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    Swarm, a satellite constellation to measure Earth's magnetic field with unprecedented accuracy, has been selected by ESA for launch in 2009. The mission will provide the best ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system...... to the science objectives of Swarm. In order to be able to use realistic parameters of the Earth's environment, the mission simulation starts at January 1, 1997 and lasts until re-entry of the lower satellites five years later. Synthetic magnetic field values were generated for all relevant contributions...

  13. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    Energy Technology Data Exchange (ETDEWEB)

    Ferreyra, M; Salinas Aranda, F; Dodat, D; Sansogne, R; Arbiser, S [Vidt Centro Medico, Ciudad Autonoma De Buenos Aires, Ciudad Autonoma de Buenos Aire (Argentina)

    2016-06-15

    Purpose: To use end-to-end testing to validate a 6 MV high dose rate photon beam, configured for the Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data was configured for the Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm² were measured on a Varian Trilogy linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance against VMAT accuracy requirements. An end-to-end test was implemented by planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung-equivalent inserts. TPS-calculated planar dose distributions were compared to those measured using EPID and MapCheck, as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests show results are independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst case scenario in lung. Over 97% of the points evaluated in dose distributions passed gamma index analysis. Conclusion: Eclipse AAA algorithm configuration of the 6 MV high dose rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.
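The 2%/2 mm gamma criterion used in this record can be illustrated with a minimal sketch of a global 1D gamma index. This is not the authors' code; the profile shapes and all names are assumptions for illustration only.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.02, dta=2.0):
    """Global 1D gamma index: dd is the fractional dose-difference
    criterion, dta the distance-to-agreement criterion in mm."""
    dmax = ref_dose.max()  # global normalization dose
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        # Combined dose-difference / DTA metric against every evaluated point
        term = np.sqrt(((eval_pos - rp) / dta) ** 2 +
                       ((eval_dose - rd) / (dd * dmax)) ** 2)
        gammas.append(term.min())
    return np.array(gammas)

# Synthetic example: a Gaussian profile measured with a 0.5 mm shift
x = np.linspace(-20, 20, 401)           # positions in mm, 0.1 mm spacing
ref = np.exp(-x**2 / 50)                # reference (calculated) profile
ev = np.exp(-(x - 0.5)**2 / 50)         # shifted (measured) profile
g = gamma_1d(x, ref, x, ev)
pass_rate = 100.0 * (g <= 1).mean()     # percentage of points with gamma <= 1
```

A 0.5 mm shift sits well inside the 2 mm DTA tolerance, so the pass rate is effectively 100%; the "95% of points" acceptance threshold from the abstract would then be met.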

  14. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard exists neither for spacecraft operations nor for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data - specifically the mission data products - created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and the functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.

  15. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

    Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth people participate in their own care supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges. In this

  16. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as optical stellar interferometers. 'End-to-end modeling' denotes that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that system-level instrument performance under disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  17. Utilizing Domain Knowledge in End-to-End Audio Processing

    DEFF Research Database (Denmark)

    Tax, Tycho; Antich, Jose Luis Diez; Purwins, Hendrik

    2017-01-01

    to learn the commonly-used log-scaled mel-spectrogram transformation. Secondly, we demonstrate that upon initializing the first layers of an end-to-end CNN classifier with the learned transformation, convergence and performance on the ESC-50 environmental sound classification dataset are similar to a CNN......-based model trained on the highly pre-processed log-scaled mel-spectrogram features....

  18. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
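The superlinear O(N²) scaling described above follows from counting ordered site pairs: N cooperating sites yield N(N-1) directed end-to-end paths. A minimal sketch (site names are placeholders, not the study's hosts):

```python
from itertools import permutations

# 37 measurement sites, as in the study described above
sites = [f"site{i}" for i in range(37)]

# Each ordered (source, destination) pair defines one measurable
# end-to-end path; routing may differ in each direction.
paths = list(permutations(sites, 2))

print(len(paths))  # 37 * 36 = 1332 directed paths, i.e. "more than 1,000"
```

This is why a modest number of observation points suffices: doubling the sites roughly quadruples the measured paths.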

  19. End-to-End Operations in the ELT Era

    Science.gov (United States)

    Hainaut, O. R.; Bierwirth, T.; Brillant, S.; Mieske, S.; Patat, F.; Rejkuba, M.; Romaniello, M.; Sterzik, M.

    2018-03-01

    The Data Flow System is the infrastructure on which Very Large Telescope (VLT) observations are performed at the Observatory, before and after the observations themselves take place. Since its original conception in the late 1990s, it has evolved to accommodate new observing modes and new instruments on La Silla and Paranal. Several updates and upgrades are needed to overcome its obsolescence and to integrate requirements from the new instruments from the community and, of course, from ESO's Extremely Large Telescope (ELT), which will be integrated into Paranal's operations. We describe the end-to-end operations and the resulting roadmap guiding their further development.

  20. End-to-end plasma bubble PIC simulations on GPUs

    Science.gov (United States)

    Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava

    2017-10-01

    Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.

  1. Cyberinfrastructure for End-to-End Environmental Explorations

    Science.gov (United States)

    Merwade, V.; Kumar, S.; Song, C.; Zhao, L.; Govindaraju, R.; Niyogi, D.

    2007-12-01

    The design and implementation of a cyberinfrastructure for End-to-End Environmental Exploration (C4E4) is presented. The C4E4 framework addresses the need for an integrated data/computation platform for studying broad environmental impacts by combining heterogeneous data resources with state-of-the-art modeling and visualization tools. With Purdue being a TeraGrid Resource Provider, C4E4 builds on top of the Purdue TeraGrid data management system and Grid resources, and integrates them through a service-oriented workflow system. It allows researchers to construct environmental workflows for data discovery, access, transformation, modeling, and visualization. Using the C4E4 framework, we have implemented an end-to-end SWAT simulation and analysis workflow that connects our TeraGrid data and computation resources. It enables researchers to conduct comprehensive studies on the impact of land management practices in the St. Joseph watershed using data from various sources in hydrologic, atmospheric, agricultural, and other related disciplines.

  2. An end to end secure CBIR over encrypted medical database.

    Science.gov (United States)

    Bellafqira, Reda; Coatrieux, Gouenou; Bouslimi, Dalel; Quellec, Gwenole

    2016-08-01

    In this paper, we propose a new secure content-based image retrieval (SCBIR) system adapted to the cloud framework. This solution allows a physician to retrieve images of similar content within an outsourced and encrypted image database, without decrypting them. In contrast to existing CBIR approaches in the encrypted domain, the originality of the proposed scheme lies in the fact that the features extracted from the encrypted images are themselves encrypted. This is achieved by means of homomorphic encryption and two non-colluding servers, both of which are considered honest but curious. In that way an end-to-end secure CBIR process is ensured. Experimental results carried out on a diabetic retinopathy database encrypted with the Paillier cryptosystem indicate that our SCBIR achieves retrieval performance as good as if images were processed in their non-encrypted form.
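The Paillier cryptosystem mentioned in this abstract is additively homomorphic, which is what lets a server combine encrypted feature values without decrypting them. A toy sketch of that property (tiny primes, illustration only, not secure and not the paper's implementation):

```python
import math
import random

# Toy Paillier key generation with small primes (insecure, for illustration)
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                  # standard choice g = n + 1
lam = math.lcm(p - 1, q - 1)               # Carmichael-style exponent

def L(u):
    """The Paillier L function: L(u) = (u - 1) / n."""
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)        # precomputed decryption factor

def encrypt(m):
    """Encrypt m < n with fresh randomness r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts
c = (encrypt(20) * encrypt(22)) % n2
print(decrypt(c))  # 42
```

In a CBIR setting, this property allows one server to aggregate encrypted distance contributions while a second, non-colluding server holds the decryption key, matching the two-server model the abstract describes.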

  3. System of end-to-end symmetric database encryption

    Science.gov (United States)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article is devoted to the pressing problem of protecting databases from information leakage performed while bypassing access control mechanisms. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of an interaction between the information system components using one of the symmetric cryptographic algorithms. For this purpose, a key management method designed for use in a multi-user system has been developed and described, based on a distributed key representation model in which part of the key is stored in the database and the other part is obtained by transforming the user's password. In this case, the key is computed immediately before the cryptographic transformations and is not retained in memory after these transformations are complete. Algorithms for registering and authorizing a user, as well as changing a password, are described, along with the methods for computing the parts of the key when performing these operations.
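The split-key idea described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' scheme: the XOR combination, PBKDF2 derivation, and all names are choices made here for the example.

```python
import hashlib
import os
import secrets

def derive_password_share(password: str, salt: bytes) -> bytes:
    # Derive a 32-byte key share from the user's password (PBKDF2-HMAC-SHA256)
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # XOR the two shares to reconstruct the symmetric working key
    return bytes(a ^ b for a, b in zip(share_a, share_b))

salt = os.urandom(16)
db_share = secrets.token_bytes(32)            # the part stored in the database
pw_share = derive_password_share("hunter2", salt)
key = combine(db_share, pw_share)             # computed only when needed,
                                              # then discarded after use

# Password change: derive a new password share and re-store a database
# share that reconstructs the SAME key, so data need not be re-encrypted.
new_pw_share = derive_password_share("correct horse", salt)
new_db_share = combine(key, new_pw_share)
assert combine(new_db_share, new_pw_share) == key
```

The design choice illustrated is that neither stored artifact alone (database share or password) reveals the key, and a password change only rewrites the stored share.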

  4. End-to-end learning for digital hologram reconstruction

    Science.gov (United States)

    Xu, Zhimin; Zuo, Si; Lam, Edmund Y.

    2018-02-01

    Digital holography is a well-known method to perform three-dimensional imaging by recording the light wavefront information originating from the object. Not only the intensity, but also the phase distribution of the wavefront can then be computed from the recorded hologram in the numerical reconstruction process. However, reconstructions via traditional methods suffer from various artifacts caused by the twin image, the zero-order term, and noise from image sensors. Here we demonstrate that an end-to-end deep neural network (DNN) can learn to perform both intensity and phase recovery directly from an intensity-only hologram. We experimentally show that the artifacts can be effectively suppressed. Meanwhile, our network doesn't need any preprocessing for initialization, and is comparably fast to train and test relative to a recently published learning-based method. In addition, we validate that a performance improvement can be achieved by introducing a sparsity prior.

  5. End-to-end System Performance Simulation: A Data-Centric Approach

    Science.gov (United States)

    Guillaume, Arnaud; Laffitte de Petit, Jean-Luc; Auberger, Xavier

    2013-08-01

    In the early times of the space industry, the feasibility of Earth observation missions was directly driven by what could be achieved by the satellite. It was clear to everyone that the ground segment would be able to deal with the small amount of data sent by the payload. Over the years, the amounts of data processed by spacecraft have increased drastically, putting more and more constraints on ground segment performance - and in particular on timeliness. Nowadays, many space systems require high data throughputs and short response times, with information coming from multiple sources and involving complex algorithms. It has become necessary to perform thorough end-to-end analyses of the full system in order to optimise its cost and efficiency, and sometimes even to assess the feasibility of the mission. This paper presents a novel framework developed by Astrium Satellites in order to meet these needs of timeliness evaluation and optimisation. This framework, named ETOS (for “End-to-end Timeliness Optimisation of Space systems”), provides a modelling process with associated tools, models and GUIs. These are integrated thanks to a common data model and suitable adapters, with the aim of building space system simulators of the full end-to-end chain. A big challenge of such an environment is to integrate heterogeneous tools (each one being well-adapted to part of the chain) into a relevant timeliness simulation.

  6. End-to-end network models encompassing terrestrial, wireless, and satellite components

    Science.gov (United States)

    Boyarko, Chandler L.; Britton, John S.; Flores, Phil E.; Lambert, Charles B.; Pendzick, John M.; Ryan, Christopher M.; Shankman, Gordon L.; Williams, Ramon P.

    2004-08-01

    Development of network models that reflect true end-to-end architectures, such as the Transformational Communications Architecture, needs to encompass terrestrial, wireless, and satellite components to truly represent all of the complexities in a worldwide communications network. Use of best-in-class tools, including OPNET, Satellite Tool Kit (STK), and Popkin System Architect, with their well-known XML-friendly definitions, such as OPNET Modeler's Data Type Description (DTD), or socket-based data transfer modules, such as STK/Connect, enables the sharing of data between applications for more rapid development of end-to-end system architectures and a more complete system design. By sharing the results of and integrating best-in-class tools we are able to (1) promote sharing of data, (2) enhance the fidelity of our results and (3) allow network and application performance to be viewed in the context of the entire enterprise and its processes.

  7. End-to-End Adversarial Retinal Image Synthesis.

    Science.gov (United States)

    Costa, Pedro; Galdran, Adrian; Meyer, Maria Ines; Niemeijer, Meindert; Abramoff, Michael; Mendonca, Ana Maria; Campilho, Aurelio

    2018-03-01

    In medical image analysis applications, the availability of large amounts of annotated data is becoming increasingly critical. However, annotated medical data are often scarce and costly to obtain. In this paper, we address the problem of synthesizing retinal color images by applying recent techniques based on adversarial learning. In this setting, a generative model is trained to maximize a loss function provided by a second model attempting to classify its output as real or synthetic. In particular, we propose to implement an adversarial autoencoder for the task of retinal vessel network synthesis. We use the generated vessel trees as an intermediate stage for the generation of color retinal images, which is accomplished with a generative adversarial network. Both models require the optimization of almost-everywhere differentiable loss functions, which allows us to train them jointly. The resulting model offers an end-to-end retinal image synthesis system capable of generating as many retinal images as the user requires, with their corresponding vessel networks, by sampling from a simple probability distribution that we impose on the associated latent space. We show that the learned latent space contains a well-defined semantic structure, implying that we can perform calculations in the space of retinal images, e.g., smoothly interpolating new data points between two retinal images. Visual and quantitative results demonstrate that the synthesized images are substantially different from those in the training set, while also being anatomically consistent and displaying reasonable visual quality.

  8. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    Science.gov (United States)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models and a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics- and engineering-level modeling to mission- and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  9. An overview of recent end-to-end wireless medical video telemedicine systems using 3G.

    Science.gov (United States)

    Panayides, A; Pattichis, M S; Pattichis, C S; Schizas, C N; Spanias, A; Kyriacou, E

    2010-01-01

    Advances in video compression, network technologies, and computer technologies have contributed to the rapid growth of mobile health (m-health) systems and services. Wide deployment of such systems and services is expected in the near future, and it is foreseen that they will soon be incorporated into daily clinical practice. This study focuses on describing the basic components of an end-to-end wireless medical video telemedicine system, provides a brief overview of recent advances in the field, and highlights future trends in the design of diagnostically driven telemedicine systems.

  10. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    OpenAIRE

    Zhao Hong-hao; Meng Fan-bo; Zhao Si-wen; Zhao Si-hang; Lu Yi

    2016-01-01

    Recently, network traffic has been increasing exponentially due to all kinds of applications, such as mobile Internet, smart cities, smart transportation, the Internet of Things, and so on. The end-to-end network traffic has therefore become more important for traffic engineering, yet end-to-end traffic estimation is usually highly difficult. This paper proposes a Bayes theory-based method to model the end-to-end network traffic. Firstly, the end-to-end network traffic is described as an independent identically distrib...

  11. Reversible end-to-end assembly of gold nanorods using a disulfide-modified polypeptide

    International Nuclear Information System (INIS)

    Walker, David A; Gupta, Vinay K

    2008-01-01

    Directing the self-assembly of colloidal particles into nanostructures is of great interest in nanotechnology. Here, reversible end-to-end assembly of gold nanorods (GNR) is induced by pH-dependent changes in the secondary conformation of a disulfide-modified poly(L-glutamic acid) (SSPLGA). The disulfide anchoring group drives chemisorption of the polyacid onto the ends of the gold nanorods in an ethanolic solution. A layer of poly(vinyl pyrrolidone) is adsorbed on the positively charged, surfactant-stabilized GNR to screen the surfactant bilayer charge and provide stability for dispersion of the GNR in ethanol. For comparison, irreversible end-to-end assembly using a bidentate ligand, namely 1,6-hexanedithiol, is also performed. Characterization of the modified GNR and their end-to-end linking behavior using SSPLGA and hexanedithiol is performed using dynamic light scattering (DLS), UV-vis absorption spectroscopy and transmission electron microscopy (TEM). Experimental results show that, in a colloidal solution of GNR-SSPLGA at a pH∼3.5, where the PLGA is in an α-helical conformation, the modified GNR self-assemble into one-dimensional nanostructures. The linking behavior can be reversed by increasing the pH (>8.5) to drive the conformation of the polypeptide to a random coil; this reversal occurs rapidly, within minutes. Cycling between low and high pH values multiple times can be used to alternately form the GNR nanostructures and redisperse them in solution.

  12. End-to-End Traffic Flow Modeling of the Integrated SCaN Network

    Science.gov (United States)

    Cheung, K.-M.; Abraham, D. S.

    2012-05-01

    In this article, we describe the analysis and simulation effort of the end-to-end traffic flow for the Integrated Space Communications and Navigation (SCaN) Network. Using the network traffic derived for the 30-day period of July 2018 from the Space Communications Mission Model (SCMM), we generate the wide-area network (WAN) bandwidths of the ground links for different architecture options of the Integrated SCaN Network. We also develop a new analytical scheme to model the traffic flow and buffering mechanism of a store-and-forward network. It is found that the WAN bandwidth of the Integrated SCaN Network is an important differentiator of different architecture options, as the recurring circuit costs of certain architecture options can be prohibitively high.
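    The store-and-forward buffering described above can be illustrated with a toy discrete-time simulation (an illustrative sketch only; the article's analytical scheme and the SCMM-derived traffic are not reproduced here, and all rates below are made up):

```python
# Illustrative sketch (not the authors' scheme): a FIFO store-and-forward node
# fed by a bursty source and drained by a fixed-bandwidth WAN link.
def simulate_buffer(arrivals_mbps, wan_mbps, step_s=1.0):
    """Return the buffer occupancy (Mb) after each time step."""
    backlog = 0.0
    occupancy = []
    for rate in arrivals_mbps:
        backlog += rate * step_s                          # data arriving this step
        backlog = max(0.0, backlog - wan_mbps * step_s)   # drained by the WAN link
        occupancy.append(backlog)
    return occupancy

# A 100 Mb/s burst for 5 s into a 60 Mb/s link builds up 200 Mb of backlog,
# which then drains at 60 Mb/s once the burst ends.
profile = simulate_buffer([100.0] * 5 + [0.0] * 5, wan_mbps=60.0)
```

    Sizing the ground link against the peak backlog is one way to see why WAN bandwidth becomes a cost differentiator between architecture options.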

  13. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP), which has the aim of measuring the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend, which is a phasemeter, and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and of some parts of the drag-free and attitude control system (DFACS), which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on Earth without free-falling test masses. At present, it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted into this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  14. AN ANALYSIS OF THE APPLICATION END TO END QUALITY OF SERVICE ON 3G TELECOMMUNICATION NETWORK

    Directory of Open Access Journals (Sweden)

    Cahya Lukito

    2012-05-01

    End to End Quality of Service is a way to provide packet data service in a telecommunication network based on Right Price, Right Service Level, and Right Quality. The goal of this research is to analyze the impact of using End to End QoS on a 3G telecommunication network for voice and data services. This research uses an analysis method applied in the lab. The results achieved in this research show that End to End QoS strongly influences the Service Level Agreement offered to users of the telecommunication service. Keywords: End to End QoS, SLA, Diffserv

  15. An end-to-end secure patient information access card system.

    Science.gov (United States)

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

    The rapid development of the Internet and the increasing interest in Internet-based solutions have promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems from a data carrier to an access key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project where patient databases are accessed via the Internet. Strong end-to-end data encryption is performed via Secure Sockets Layer, transparently to the transmission of patient information. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.

  16. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    Science.gov (United States)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDaylight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  17. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    Directory of Open Access Journals (Sweden)

    Zhao Hong-hao

    2016-01-01

    Recently, network traffic has been increasing exponentially due to all kinds of applications, such as mobile Internet, smart cities, smart transportation, the Internet of Things, and so on. The end-to-end network traffic has therefore become more important for traffic engineering, yet end-to-end traffic estimation is usually highly difficult. This paper proposes a Bayes theory-based method to model the end-to-end network traffic. Firstly, the end-to-end network traffic is described as an independent identically distributed normal process. Then Bayes theory is used to characterize the end-to-end network traffic. By calculating the parameters, the model is determined correctly. Simulation results show that our approach is feasible and effective.
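    As a minimal sketch of the modelling idea (an assumption-laden illustration, not the authors' algorithm): if traffic volumes are i.i.d. normal with known noise variance, a conjugate normal prior on the mean traffic can be updated in closed form:

```python
# Hedged sketch: conjugate Bayesian update for the mean of an i.i.d. normal
# process with known variance. Names and parameter values are illustrative only.
def posterior_mean_var(samples, prior_mu, prior_var, noise_var):
    """Return the posterior mean and variance of the traffic mean."""
    n = len(samples)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + sum(samples) / noise_var)
    return post_mu, post_var

# With a vague prior, the posterior mean approaches the sample mean (11.0 here)
# and the posterior variance shrinks as more traffic samples arrive.
mu, var = posterior_mean_var([10.0, 12.0, 11.0],
                             prior_mu=0.0, prior_var=1e6, noise_var=4.0)
```
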

  18. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include the need to predict the effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promise, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  19. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  20. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  1. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  2. Rapid Automated Mission Planning System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is an automated UAS mission planning system that will rapidly identify emergency (contingency) landing sites, manage contingency routing, and...

  3. Urban Biomining Meets Printable Electronics: End-To-End at Destination Biological Recycling and Reprinting

    Science.gov (United States)

    Rothschild, Lynn J. (Principal Investigator); Koehne, Jessica; Gandhiraman, Ram; Navarrete, Jesica; Spangle, Dylan

    2017-01-01

    Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet metals add mass, and electronics have the additional problem of a limited lifespan. Thus, current mission architectures must compensate for replacement. In space, spent electronics are discarded; on Earth, there is some recycling, but current processes are toxic and environmentally hazardous. Imagine instead an end-to-end recycling of spent electronics at low mass, low cost, room temperature, and in a non-toxic manner. Here, we propose a solution that will not only enhance mission success by decreasing upmass and providing a fresh supply of electronics, but also has immediate applications to a serious environmental issue on Earth. Spent electronics will be used as feedstock to make fresh electronic components, a process we will accomplish with so-called 'urban biomining' using synthetically enhanced microbes to bind metals with elemental specificity. To create new electronics, the microbes will be used as 'bioink' to print a new IC chip, using plasma jet electronics printing. The plasma jet electronics printing technology will have the potential to use Martian atmospheric gas to print and to tailor the electronic and chemical properties of the materials. Our preliminary results have suggested that this process also serves as a purification step to enhance the proportion of metals in the 'bioink'. The presence of an electric field and plasma can ensure printing in a microgravity environment while also providing material morphology and electronic structure tunability, and thus optimization. Here we propose to increase the TRL level of the concept by engineering microbes to dissolve the siliceous matrix in the IC, extract copper from a mixture of metals, and use the microbes as feedstock to print interconnects using Mars gas simulant. To assess the ability of this concept to influence mission architecture, we will do an analysis of the infrastructure required to execute

  4. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States); Wu, Chase Q. [Univ. of Memphis, TN (United States)

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and, in many cases, application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  5. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    Science.gov (United States)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other minerals for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  6. End-to-side and end-to-end anastomoses give similar results in cervical oesophagogastrostomy.

    Science.gov (United States)

    Pierie, J P; De Graaf, P W; Poen, H; Van Der Tweel, I; Obertop, H

    1995-12-01

    To find out if there were any differences in healing between end-to-end and end-to-side anastomoses for oesophagogastrostomy. Open study with historical controls. University hospital, The Netherlands. 28 patients with end-to-end and 90 patients with end-to-side anastomoses after transhiatal oesophagectomy and partial gastrectomy for cancer of the oesophagus or oesophagogastric junction, with gastric tube reconstruction and cervical anastomosis. Leak and stricture rates, and the number of dilatations needed to relieve dysphagia. There were no significant differences in leak rates (end-to-end 4/28, 14%, and end-to-side 13/90, 14%) or anastomotic strictures (end-to-end 9/28, 32%, and end-to-side 26/90, 29%). The median number of dilatations needed to relieve dysphagia was 7 (1-33) after end-to-end and 9 (1-113) after end-to-side oesophagogastrostomy. There were no differences between the two methods of suture of cervical oesophagogastrostomy when leakage, stricture, and number of dilatations were used as criteria of good healing.

  7. Automatic provisioning of end-to-end QoS into the home

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Skoldström, Pontus; Nelis, Jelle

    2011-01-01

    Due to a growing number of high bandwidth applications today (such as HDTV), and an increasing amount of network and cloud based applications, service providers need to pay attention to QoS in their networks. We believe there is a need for an end-to-end approach reaching into the home as well. The Home Gateway (HG), as a key component of the home network, is crucial for enabling end-to-end solutions. UPnP-QoS has been proposed as an in-home solution for resource reservations. In this paper we assess a solution for automatic QoS reservations on behalf of non-UPnP-QoS aware applications. Additionally we focus on an integrated end-to-end solution, combining GMPLS-based reservations in e.g. access/metro networks and UPnP-QoS based reservations in the home network.

  8. Security Considerations around End-to-End Security in the IP-based Internet of Things

    NARCIS (Netherlands)

    Brachmann, M.; Garcia-Mochon, O.; Keoh, S.L.; Kumar, S.S.

    2012-01-01

    The IP-based Internet of Things refers to the interconnection of smart objects in a Low-power and Lossy Network (LLN) with the Internet by means of protocols such as 6LoWPAN or CoAP. The provisioning of an end-to-end security connection is the key to ensure basic functionalities such as software

  9. QoC-based Optimization of End-to-End M-Health Data Delivery Services

    NARCIS (Netherlands)

    Widya, I.A.; van Beijnum, Bernhard J.F.; Salden, Alfons

    2006-01-01

    This paper addresses how Quality of Context (QoC) can be used to optimize end-to-end mobile healthcare (m-health) data delivery services in the presence of alternative delivery paths, which is quite common in a pervasive computing and communication environment. We propose min-max-plus based

  10. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Generation Networks (NGNs). In this paper, an end-to-end availability model is proposed and evaluated using a combination of Reliability Block Diagrams (RBD) and a proposed five-state Markov model. The overall availability for intra- and inter domain communication in IMS is analyzed, and the state...

  11. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    implementation on Android based tablets. The simulations cover a wide range of scenarios for two mobile users in an urban area with ubiquitous cellular coverage, and shows our algorithm leads to increased throughput, with fewer handovers, when considering the end-to-end connection than to other handover schemes...

  12. End-to-End Delay Model for Train Messaging over Public Land Mobile Networks

    Directory of Open Access Journals (Sweden)

    Franco Mazzenga

    2017-11-01

    Modern train control systems rely on a dedicated radio network for train-to-ground communications. A number of possible alternatives have been analysed to adopt the European Rail Traffic Management System/European Train Control System (ERTMS/ETCS) on local/regional lines to improve transport capacity. Among them, a communication system based on public networks (cellular and satellite) provides an interesting and effective alternative to proprietary and expensive radio networks. To analyse the performance of this solution, it is necessary to model the end-to-end delay and message loss to fully characterize the message transfer process from train to ground and vice versa. Starting from the results of a railway test campaign over a 300 km railway line, for a cumulative 12,000 km traveled in 21 days, in this paper we derive a statistical model for the end-to-end delay required for delivering messages. In particular, we propose a two-state model that reproduces the main behavioral characteristics of the end-to-end delay as observed experimentally. The model formulation has been derived after deep analysis of the recorded experimental data. When applied to a realistic scenario, it explicitly accounts for radio coverage characteristics, the received power level, the handover points along the line and the serving radio technology. As an example, the proposed model is used to generate the end-to-end delay profile in a realistic scenario.
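    A generic two-state delay model of this kind can be sketched as follows (state semantics and all parameter values below are assumptions for illustration, not the model calibrated from the test campaign):

```python
import random

# Hedged illustration: a two-state Markov chain alternating between a "good
# coverage" state with short delays and a "degraded" state with long ones.
def delay_profile(n, p_stay_good=0.95, p_stay_bad=0.8, seed=42):
    """Return n end-to-end delay samples (seconds) from the two-state model."""
    rng = random.Random(seed)
    state = "good"
    delays = []
    for _ in range(n):
        if state == "good":
            delays.append(rng.expovariate(1 / 0.2))   # mean 0.2 s delay
            state = "good" if rng.random() < p_stay_good else "bad"
        else:
            delays.append(rng.expovariate(1 / 2.0))   # mean 2.0 s delay
            state = "bad" if rng.random() < p_stay_bad else "good"
    return delays

profile = delay_profile(1000)
```

    Replacing the transition probabilities with position-dependent values is one way such a model could account for coverage and handover points along the line.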

  13. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    This paper describes a wireless real-time communication system design using two Time Division Multiple Access (TDMA) protocols. Messages are subject to prioritization and queuing. For this interoperation scenario, we show a method for end-to-end configuration of protocols and queue sizes. Such co...

  14. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian

    2013-01-01

    We report on the observation of coupling a single nitrogen vacancy (NV) center in a nanodiamond crystal to a propagating plasmonic mode of silver nanowires. The nanocrystal is placed either near the apex of a single silver nanowire or in the gap between two end-to-end aligned silver nanowires. We...

  15. End-to-end tests using alanine dosimetry in scanned proton beams

    Science.gov (United States)

    Carlino, A.; Gouldstone, C.; Kragl, G.; Traneus, E.; Marrale, M.; Vatnitsky, S.; Stock, M.; Palmans, H.

    2018-03-01

    This paper describes end-to-end test procedures as the last fundamental step of medical commissioning before starting clinical operation of the MedAustron synchrotron-based pencil beam scanning (PBS) therapy facility with protons. One in-house homogeneous phantom and two anthropomorphic heterogeneous (head and pelvis) phantoms were used for end-to-end tests at MedAustron. The phantoms were equipped with alanine detectors, radiochromic films and ionization chambers. The correction for the ‘quenching’ effect of alanine pellets was implemented in the Monte Carlo platform of the evaluation version of RayStation TPS. During the end-to-end tests, the phantoms were transferred through the workflow like real patients to simulate the entire clinical workflow: immobilization, imaging, treatment planning and dose delivery. Different clinical scenarios of increasing complexity were simulated: delivery of a single beam, two oblique beams without and with range shifter. In addition to the dose comparison in the plastic phantoms the dose obtained from alanine pellet readings was compared with the dose determined with the Farmer ionization chamber in water. A consistent systematic deviation of about 2% was found between alanine dosimetry and the ionization chamber dosimetry in water and plastic materials. Acceptable agreement of planned and delivered doses was observed together with consistent and reproducible results of the end-to-end testing performed with different dosimetric techniques (alanine detectors, ionization chambers and EBT3 radiochromic films). The results confirmed the adequate implementation and integration of the new PBS technology at MedAustron. This work demonstrates that alanine pellets are suitable detectors for end-to-end tests in proton beam therapy and the developed procedures with customized anthropomorphic phantoms can be used to support implementation of PBS technology in clinical practice.
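    The reported ~2% offset is a constant relative deviation that can be estimated from paired readings; the helper below is a hypothetical illustration of that comparison, not part of the published commissioning procedure:

```python
# Hypothetical helper: mean relative deviation between alanine-derived doses
# and reference ionization chamber doses from paired measurements.
def mean_relative_deviation(alanine_gy, chamber_gy):
    ratios = [a / c for a, c in zip(alanine_gy, chamber_gy)]
    return sum(ratios) / len(ratios) - 1.0

# Two paired readings, each 2% high, yield a +2% systematic deviation.
dev = mean_relative_deviation([2.04, 1.02], [2.00, 1.00])
```
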

  16. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    Science.gov (United States)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
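    The rate-adaptation idea can be sketched as follows (a simplified illustration under assumed numbers, not the controller proposed in the paper):

```python
# Hedged sketch: throttle the non-real-time visual stream to whatever headroom
# the estimated bandwidth leaves after the real-time haptic stream.
def visual_rate(bandwidth_est_kbps, haptic_kbps=64.0, margin=0.9, floor_kbps=50.0):
    """Return the visual-data transmission rate (kbps) to avoid buffer overflow."""
    headroom = bandwidth_est_kbps * margin - haptic_kbps
    return max(floor_kbps, headroom)

# As the estimated bandwidth drops, the visual rate is cut first, protecting
# the small haptic packets from queueing behind large visual packets.
rates = [visual_rate(b) for b in (2000.0, 500.0, 100.0)]
```
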

  17. Providing end-to-end QoS for multimedia applications in 3G wireless networks

    Science.gov (United States)

    Guo, Katherine; Rangarajan, Samapth; Siddiqui, M. A.; Paul, Sanjoy

    2003-11-01

    As the usage of wireless packet data services increases, wireless carriers today are faced with the challenge of offering multimedia applications with QoS requirements within current 3G data networks. End-to-end QoS requires support at the application, network, link and medium access control (MAC) layers. We discuss existing CDMA2000 network architecture and show its shortcomings that prevent supporting multiple classes of traffic at the Radio Access Network (RAN). We then propose changes in RAN within the standards framework that enable support for multiple traffic classes. In addition, we discuss how Session Initiation Protocol (SIP) can be augmented with QoS signaling for supporting end-to-end QoS. We also review state of the art scheduling algorithms at the base station and provide possible extensions to these algorithms to support different classes of traffic as well as different classes of users.

  18. Rectovaginal fistula following colectomy with an end-to-end anastomosis stapler for a colorectal adenocarcinoma.

    Science.gov (United States)

    Klein, A; Scotti, S; Hidalgo, A; Viateau, V; Fayolle, P; Moissonnier, P

    2006-12-01

    An 11-year-old, female neutered Labrador retriever was presented with a micro-invasive differentiated papillary adenocarcinoma at the colorectal junction. A colorectal end-to-end anastomosis stapler device was used to perform resection and anastomosis using a transanal technique. A rectovaginal fistula was diagnosed two days later. An exploratory laparotomy was conducted and the fistula was identified and closed. Early dehiscence of the colon was also suspected and another colorectal anastomosis was performed using a manual technique. Comparison with a conventional manual technique of intestinal surgery showed that the use of an automatic stapling device was quicker and easier. To the authors' knowledge, this is the first report of a rectovaginal fistula occurring after end-to-end anastomosis stapler colorectal resection-anastomosis in the dog. To minimise the risk of this potential complication associated with the limited surgical visibility, adequate tissue retraction and inspection of the anastomosis site are essential.

  19. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim; Hyadi, Amal; Afify, Laila H.; Shihada, Basem

    2014-01-01

    In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1 with vacations framework. Sensors transmit measurements to a predefined data sink subject to a maximum end-to-end delay constraint. In order to prolong the battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for the maximum hop-count as well as an approximate expression for the probability of blocking at the sink node upon violation of a certain end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
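
    The per-hop delay underlying such an analysis can be illustrated with the standard decomposition result for an M/G/1 queue with multiple vacations: the mean wait is the Pollaczek-Khinchine term plus the residual vacation time. The functions and numbers below are a sketch under that textbook model, not necessarily the paper's exact expressions.

```python
# Sketch: mean waiting time in an M/G/1 queue with multiple vacations,
# and the largest hop count that fits an end-to-end delay budget.

def mg1_vacation_wait(lam, es, es2, ev, ev2):
    """lam = arrival rate; es, es2 = E[S], E[S^2] of the service time;
    ev, ev2 = E[V], E[V^2] of the vacation (sleep) period."""
    rho = lam * es
    assert rho < 1, "queue must be stable"
    return lam * es2 / (2 * (1 - rho)) + ev2 / (2 * ev)

def max_hops(delay_budget, lam, es, es2, ev, ev2):
    """Largest hop count whose summed per-hop sojourn fits the budget."""
    per_hop = mg1_vacation_wait(lam, es, es2, ev, ev2) + es
    return int(delay_budget // per_hop)

# Example: exponential service (E[S^2] = 2 E[S]^2), deterministic vacations.
hops = max_hops(delay_budget=1.0, lam=5.0, es=0.05, es2=2 * 0.05 ** 2,
                ev=0.1, ev2=0.1 ** 2)
```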

  1. Development of a Dynamic, End-to-End Free Piston Stirling Convertor Model

    Science.gov (United States)

    Regan, Timothy F.; Gerber, Scott S.; Roth, Mary Ellen

    2003-01-01

    A dynamic model for a free-piston Stirling convertor is being developed at the NASA Glenn Research Center. The model is an end-to-end system model that includes the cycle thermodynamics, the dynamics, and the electrical aspects of the system. The subsystems of interest are the heat source, the springs, the moving masses, the linear alternator, the controller and the end-user load. The envisioned use of the model will be in evaluating how changes in a subsystem could affect the operation of the convertor. The model under development will speed the evaluation of improvements to a subsystem and aid in determining areas in which the most significant improvements may be found. One of the first uses of the end-to-end model will be in the development of controller architectures. Another related area is in evaluating changes to details in the linear alternator.

  2. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

    This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for dialogue POMDP model components. Starting from scratch, they present the state, the transition model, the observation model and then finally the reward model from unannotated and noisy dialogues. These altogether form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in simple language, full of illustrative examples, figures, and tables. Provides insights on building dialogue systems to be applied in real domains. Illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format. Introduces an end-to-end approach that makes use of unannotated and noisy dialogue for learning each component of dialogue POM...

  3. Circular myotomy as an aid to resection and end-to-end anastomosis of the esophagus.

    Science.gov (United States)

    Attum, A A; Hankins, J R; Ngangana, J; McLaughlin, J S

    1979-08-01

    Segments ranging from 40 to 70% of the thoracic esophagus were resected in 80 mongrel dogs. End-to-end anastomosis was effected after circular myotomy either proximal or distal, or both proximal and distal, to the anastomosis. Among dogs undergoing resection of 60% of the esophagus, distal myotomy enabled 6 of 8 animals to survive, and combined proximal and distal myotomy permitted 8 of 10 to survive. Cineesophagography was performed in a majority of the 50 surviving animals and showed no appreciable delay of peristalsis at the myotomy sites. When these sites were examined at postmortem examination up to 13 months after operation, 1 dog showed a small diverticulum but none showed dilatation or stricture. It is concluded that circular myotomy holds real promise as a means of extending the clinical application of esophageal resection with end-to-end anastomosis.

  4. Financing the End-to-end Supply Chain: A Reference Guide to Supply Chain Finance

    OpenAIRE

    Templar, Simon; Hofmann, Erik; Findlay, Charles

    2016-01-01

    Financing the End to End Supply Chain provides readers with a real insight into the increasingly important area of supply chain finance. It demonstrates the importance of the strategic relationship between the physical supply of goods and services and the associated financial flows. The book provides a clear introduction, demonstrating the importance of the strategic relationship between supply chain and financial communities within an organization. It contains vital information on how supply...

  5. STS/DBS power subsystem end-to-end stability margin

    Science.gov (United States)

    Devaux, R. N.; Vattimo, R. J.; Peck, S. R.; Baker, W. E.

    Attention is given to a full-up end-to-end subsystem stability test which was performed with a flight solar array providing power to a fully operational spacecraft. The solar array simulator is described, and a comparison is made between test results obtained with the simulator and those obtained with the actual array. It is concluded that stability testing with a fully integrated spacecraft is necessary to ensure that all elements have been adequately modeled.

  6. Testing Application (End-to-End) Performance of Networks With EFT Traffic

    Directory of Open Access Journals (Sweden)

    Vlatko Lipovac

    2009-01-01

    Full Text Available This paper studies how end-to-end application performance (of Electronic Financial Transaction traffic, in particular) depends on the actual protocol stacks, operating systems and network transmission rates. In this respect, simulation tests of the performance of the TCP and UDP protocols running on various operating systems, ranging from Windows and Sun Solaris to Linux, have been implemented, and the differences in performance addressed, focusing on throughput and response time.

  7. Experimental evaluation of end-to-end delay in switched Ethernet application in the automotive domain

    OpenAIRE

    Beretis , Kostas; Symeonidis , Ieroklis

    2013-01-01

    International audience; This article presents an approach for deriving an upper bound for end-to-end delay in a double star switched Ethernet network. Four traffic classes, following a strict priority queuing policy, were considered. The theoretical analysis was based on network calculus. An experimental setup, which accurately reflects an automotive communication network, was implemented in order to evaluate the theoretical model. The results obtained by the experiments provided valuable feed...

  8. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    Science.gov (United States)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
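
    The core trade-off, attempt a circuit and fall back to TCP/IP on blocking, can be captured in a one-line expectation. This is an illustrative simplification of the paper's delay analysis; the parameter values are invented for the example.

```python
# Expected file-transfer delay when circuit setup succeeds with
# probability (1 - p_block) and the TCP/IP path is the fallback.

def expected_transfer_delay(file_bytes, p_block, setup_s, circuit_bps, tcp_delay_s):
    """E[delay] = (1 - Pb) * (setup + file / circuit rate) + Pb * tcp_delay."""
    circuit_s = setup_s + 8 * file_bytes / circuit_bps
    return (1 - p_block) * circuit_s + p_block * tcp_delay_s

# 1 GB file over a 1 Gb/s EoS circuit (100 ms setup) versus a TCP path
# that would take 80 s, with the circuit blocked 10% of the time:
d = expected_transfer_delay(1e9, p_block=0.1, setup_s=0.1,
                            circuit_bps=1e9, tcp_delay_s=80.0)
```

    Attempting setup pays off whenever this expectation beats the plain TCP/IP delay, which is the kind of decision rule the paper derives conditions for.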

  9. QoS Modeling for End-to-End Performance Evaluation over Networks with Wireless Access

    Directory of Open Access Journals (Sweden)

    Gómez Gerardo

    2010-01-01

    Full Text Available This paper presents an end-to-end Quality of Service (QoS) model for assessing the performance of data services over networks with wireless access. The proposed model deals with performance degradation across protocol layers using a bottom-up strategy, starting with the physical layer and moving on up to the application layer. This approach makes it possible to analytically assess performance at different layers, thereby facilitating a possible end-to-end optimization process. As a representative case, a scenario where a set of mobile terminals connected to a streaming server through an IP access node has been studied. UDP, TCP, and the new TCP-Friendly Rate Control (TFRC) protocols were analyzed at the transport layer. The radio interface consisted of a variable-rate multiuser and multichannel subsystem, including retransmissions and adaptive modulation and coding. The proposed analytical QoS model was validated on a real-time emulator of an end-to-end network with wireless access and proved to be very useful for the purposes of service performance estimation and optimization.
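
    The bottom-up strategy can be illustrated by folding each layer's loss and overhead into the rate handed up from the layer below. The layer breakdown and the numbers here are assumptions for illustration, not the paper's model.

```python
# Toy bottom-up composition: start from the physical-layer rate and apply
# each layer's residual loss or header overhead to estimate app goodput.

def app_goodput(phy_rate_bps, bler, link_overhead, transport_overhead):
    after_phy = phy_rate_bps * (1.0 - bler)          # residual block errors
    after_link = after_phy * (1.0 - link_overhead)   # MAC/RLC framing
    return after_link * (1.0 - transport_overhead)   # IP/UDP or TCP headers

g = app_goodput(2_000_000, bler=0.01, link_overhead=0.05, transport_overhead=0.03)
```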

  10. End-to-end simulations and planning of a small space telescope: Galaxy Evolution Spectroscopic Explorer: a case study

    Science.gov (United States)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd

    2016-08-01

    Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than operations of large telescopes driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target, while minimizing operation costs including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  11. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra.

    Science.gov (United States)

    Hussain, Akbar; Pansota, Mudassar Saeed; Rasool, Mumtaz; Tabassum, Shafqat Ali; Ahmad, Iftikhar; Saleem, Muhammad Shahzad

    2013-04-01

    To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of posterior urethra at our setup. Case series. Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Adult patients with completely obliterated post-traumatic stricture of posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in size or patients of stricture with neurogenic bladder and patients with any perineal disease were excluded from the study. Retrograde urethrogram and voiding cysto-urethrogram was done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of urethra with spatulation was performed in every patient. Minimum followup period was 6 months and maximum 18 months. There were 26 cases with road traffic accident (indirect) and 14 had history of fall/direct trauma to perineum or urethra. Majority of the patients (57.5%) were between 16 to 30 years of age. Twelve (30.0%) patients developed complications postoperatively. Early complications of wound infection occurred in 01 (2.5%) patient. Late complications were seen in 11 (27.5%) patients i.e. stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), urethrocutaneous fistula and urinary incontinence in one patient (2.5%) each. Success rate was 70.0% initially and 87.5% overall. Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures with success rate of about 87.5%.

  12. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra

    International Nuclear Information System (INIS)

    Hussain, A.; Pansota, M. S.; Rasool, M.; Tabassum, S. A.; Ahmad, I.; Saleem, M. S.

    2013-01-01

    Objective: To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of posterior urethra at our setup. Study Design: Case series. Place and Duration of Study: Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/ Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Methodology: Adult patients with completely obliterated post-traumatic stricture of posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in size or patients of stricture with neurogenic bladder and patients with any perineal disease were excluded from the study. Retrograde urethrogram and voiding cysto-urethrogram was done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of urethra with spatulation was performed in every patient. Minimum followup period was 6 months and maximum 18 months. Results: There were 26 cases with road traffic accident (indirect) and 14 had history of fall/direct trauma to perineum or urethra. Majority of the patients (57.5%) were between 16 to 30 years of age. Twelve (30.0%) patients developed complications postoperatively. Early complications of wound infection occurred in 01 (2.5%) patient. Late complications were seen in 11 (27.5%) patients i.e. stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), urethrocutaneous fistula and urinary incontinence in one patient (2.5%) each. Success rate was 70.0% initially and 87.5% overall. Conclusion: Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures with success rate of about 87.5%. (author)

  14. Optimizing End-to-End Big Data Transfers over Terabits Network Infrastructure

    International Nuclear Information System (INIS)

    Kim, Youngjae; Vallee, Geoffroy R.; Lee, Sangkeun; Shipman, Galen M.

    2016-01-01

    While future terabit networks hold the promise of significantly improving big-data motion among geographically distributed data centers, significant challenges must be overcome even on today's 100 gigabit networks to realize end-to-end performance. Multiple bottlenecks exist along the end-to-end path from source to sink, for instance, the data storage infrastructure at both the source and sink and its interplay with the wide-area network are increasingly the bottleneck to achieving high performance. In this study, we identify the issues that lead to congestion on the path of an end-to-end data transfer in the terabit network environment, and we present a new bulk data movement framework for terabit networks, called LADS. LADS exploits the underlying storage layout at each endpoint to maximize throughput without negatively impacting the performance of shared storage resources for other users. LADS also uses the Common Communication Interface (CCI) in lieu of the sockets interface to benefit from hardware-level zero-copy, and operating system bypass capabilities when available. It can further improve data transfer performance under congestion on the end systems using buffering at the source using flash storage. With our evaluations, we show that LADS can avoid congested storage elements within the shared storage resource, improving input/output bandwidth, and data transfer rates across the high speed networks. We also investigate the performance degradation problems of LADS due to I/O contention on the parallel file system (PFS), when multiple LADS tools share the PFS. We design and evaluate a meta-scheduler to coordinate multiple I/O streams while sharing the PFS, to minimize the I/O contention on the PFS. Finally, with our evaluations, we observe that LADS with meta-scheduling can further improve the performance by up to 14 percent relative to LADS without meta-scheduling.
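
    The meta-scheduling idea, coordinating multiple movers that share the parallel file system, can be sketched as a per-target cap on concurrent streams. The class below is invented for illustration and is not the LADS implementation.

```python
# Toy meta-scheduler: cap concurrent transfer streams per storage target
# so several data movers sharing the file system don't pile onto one server.

from collections import defaultdict

class MetaScheduler:
    def __init__(self, max_streams_per_target=2):
        self.max = max_streams_per_target
        self.active = defaultdict(int)   # storage target -> running streams

    def try_acquire(self, target):
        """Admit a stream only if the target is below its cap."""
        if self.active[target] < self.max:
            self.active[target] += 1
            return True
        return False  # caller backs off or picks a less-loaded target

    def release(self, target):
        self.active[target] -= 1

sched = MetaScheduler(max_streams_per_target=2)
sched.try_acquire("ost-7")   # admitted
sched.try_acquire("ost-7")   # admitted
sched.try_acquire("ost-7")   # refused: target saturated
```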

  15. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer

    2005-01-01

    This paper investigates the end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with a DiffServ IP network. The effort was put on QoS class mapping from DiffServ to UMTS, Access Control (AC), buffering and scheduling optimization. The DiffServ Code Point (DSCP) was utilized in the whole UMTS QoS provisioning to differentiate different types of traffic. The overall algorithm was optimized to guarantee the E2E QoS parameters of each service class, especially for real-time applications, as well as to improve the bandwidth utilization. Simulation shows that the enhanced...
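
    The class-mapping step can be illustrated with one plausible assignment of the four UMTS traffic classes to standard DiffServ code points. This particular table is an assumption for the example, not necessarily the mapping the paper proposes.

```python
# Illustrative UMTS traffic class -> DSCP mapping (DSCP values are the
# standard EF/AF/BE per-hop behaviors; the pairing itself is assumed).

DSCP = {
    "conversational": 46,  # EF:   real-time voice
    "streaming":      34,  # AF41: one-way streaming media
    "interactive":    18,  # AF21: request/response traffic
    "background":      0,  # BE:   bulk, delay-tolerant
}

def mark_packet(umts_class):
    """DSCP value to stamp into the IP header for a UMTS bearer class."""
    return DSCP[umts_class.lower()]
```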

  16. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

    WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service focuses on the interdisciplinary subject of advanced Security and Quality of Service (QoS) in WiMAX wireless telecommunication systems, including its models, standards, implementations, and applications. The book is split into four parts; Part A is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  17. Wiretapping End-to-End Encrypted VoIP Calls: Real-World Attacks on ZRTP

    Directory of Open Access Journals (Sweden)

    Schürmann Dominik

    2017-07-01

    Full Text Available Voice calls are still one of the most common use cases for smartphones. Often, not only sensitive personal information but also confidential business information is shared. End-to-end security is required to protect against wiretapping of voice calls. For such real-time communication, the ZRTP key-agreement protocol has been proposed. By verbally comparing a small number of on-screen characters or words, called Short Authentication Strings, the participants can be sure that no one is wiretapping the call. Since 2011, ZRTP has been an IETF standard implemented in several VoIP clients.

  18. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    Science.gov (United States)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    End-to-end marine ecosystem models link climate and oceanography to the food web and human activities. These models can be used as forecasting tools, to strategically evaluate management options and to support ecosystem-based management. Here we report the results of such forecasts in the California Current, using an Atlantis end-to-end model. We worked collaboratively with fishery managers at NOAA’s regional offices and staff at the National Marine Sanctuaries (NMS) to explore the impact of fishery policies on management objectives at different spatial scales, from single Marine Sanctuaries to the entire Northern California Current. In addition to examining Status Quo management, we explored the consequences of several gear switching and spatial management scenarios. Of the scenarios that involved large scale management changes, no single scenario maximized all performance metrics. Any policy choice would involve trade-offs between stakeholder groups and policy goals. For example, a coast-wide 25% gear shift from trawl to pot or longline appeared to be one possible compromise between an increase in spatial management (which sacrificed revenue) and scenarios such as the one consolidating bottom impacts to deeper areas (which did not perform substantially differently from Status Quo). Judged on a coast-wide scale, most of the scenarios that involved minor or local management changes (e.g. within Monterey Bay NMS only) yielded results similar to Status Quo. When impacts did occur in these cases, they often involved local interactions that were difficult to predict a priori based solely on fishing patterns. However, judged on the local scale, deviation from Status Quo did emerge, particularly for metrics related to stationary species or variables (i.e. habitat and local metrics of landed value or bycatch). We also found that isolated management actions within Monterey Bay NMS would cause local fishers to pay a cost for conservation, in terms of reductions in landed

  19. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

    This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source oriented, or end to end, congestion control algorithms. The book empowers readers with clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and fairness and presents real-world app

  20. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  1. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

    In this paper, a design for the multiple-input-multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. Firstly, the overall average packet error rate is analyzed by considering the channel state information feedback delay and the multiuser scheduling. Then, we provide corresponding numerical results to measure the performance evaluation for several separate scenarios, which presents a convenient tool for the cognitive radio network design with multiple secondary MIMO users. © 2011 IEEE.

  3. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

    Full Text Available In a sparse vehicular ad hoc network, a vehicle normally employs a carry and forward approach, where it holds the message it wants to transmit until the vehicle meets other vehicles or roadside units. A number of analyses in the literature have investigated the time delay when packets are being carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters. Furthermore, the majority of the analyses concentrate only on the expected value of the end-to-end delay when the carry and forward approach is used. Using regression analysis, we establish the distribution model for the time delay between two disconnected vehicle clusters as an exponential distribution. Consequently, a distribution is newly derived to represent the number of clusters on a highway using a vehicular traffic model. From there, we are able to formulate an end-to-end delay model which extends the time delay model for two disconnected vehicle clusters to multiple disconnected clusters on a unidirectional highway. The analytical results obtained from the analytical model are then validated through simulation results.
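
    Given the fitted exponential distribution for the carry delay across one inter-cluster gap, the end-to-end carry delay over several gaps is a sum of i.i.d. exponentials (an Erlang distribution), with mean equal to the number of gaps times the per-gap mean. The sketch below, with assumed parameter values, checks that closed form against a quick Monte Carlo.

```python
# Sum of exponential inter-cluster carry delays: closed-form mean vs. simulation.

import random

def mean_e2e_delay(k_gaps, mu):
    """Mean of the Erlang-k sum of Exp(mu) gap delays: k / mu."""
    return k_gaps / mu

def simulate_e2e_delay(k_gaps, mu, runs=100_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        total += sum(rng.expovariate(mu) for _ in range(k_gaps))
    return total / runs

analytic = mean_e2e_delay(k_gaps=4, mu=0.5)       # 8.0 seconds
simulated = simulate_e2e_delay(k_gaps=4, mu=0.5)  # close to 8.0
```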

  4. Increasing operations profitability using an end-to-end, wireless internet, gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M. [Northrock Resources Ltd., AB (Canada); Benterud, K. [zed.i solutions, inc., Calgary, AB (Canada)

    2004-10-01

    Implementation by Northrock Resources Ltd., a wholly-owned subsidiary of Unocal Corporation, of a fully integrated end-to-end gas measurement and production analysis system is discussed. The system, dubbed Smart-Alek(TM), utilizes public wireless communications and a web-browser-only delivery system to provide seamless well visibility on a desktop computer. Smart-Alek(TM) is an example of a new type of end-to-end electronic gas flow measurement system, known as FINE(TM), an acronym for Field Intelligence Network and End-User Interface. The system delivers easy-to-use, complete, reliable and cost-effective production information far more effectively than is possible with conventional SCADA technology. By installing the system, Northrock was able to increase measured gas volumes through more accurate electronic flow measurement in place of mechanical charts, with very low technical maintenance and at a reduced operating cost. It is emphasized that deploying the technology alone will produce only partial benefits; to realize full benefits it is also essential to change grass-roots operating practices, aiming at timely decision-making at the field level. 5 refs., 5 figs.

  5. An End-to-End Model of Plant Pheromone Channel for Long Range Molecular Communication.

    Science.gov (United States)

    Unluturk, Bige D; Akyildiz, Ian F

    2017-01-01

    A new track in molecular communication is the use of pheromones, which can scale up the range of diffusion-based communication from micrometers to meters and enable new applications requiring long range. Pheromone communication is the emission of molecules into the air which trigger behavioral or physiological responses in receiving organisms. The objective of this paper is to introduce a new end-to-end model which incorporates pheromone behavior with communication theory for plants. The proposed model includes both the transmission and reception processes as well as the propagation channel. The transmission process is the emission of pheromones from the leaves of plants. The dispersion of pheromones by the flow of wind constitutes the propagation process. The reception process is the sensing of pheromones by the pheromone receptors of plants. The major difference of pheromone communication from other molecular communication techniques is the dispersion channel acting under the laws of turbulent diffusion. In this paper, the pheromone channel is modeled as a Gaussian puff, i.e., a cloud of pheromone released instantaneously from the source whose dispersion follows a Gaussian distribution. Numerical results on the performance of the overall end-to-end pheromone channel in terms of normalized gain and delay are provided.
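
As a concrete illustration of the channel model, here is a minimal sketch of an instantaneous Gaussian puff with an isotropic dispersion parameter. Real turbulent-dispersion models use direction- and stability-dependent sigmas that grow with travel time, so the constant `sigma` and the function signature here are simplifying assumptions, not the paper's exact formulation.

```python
import math

def gaussian_puff(q, x, y, z, t, wind_u, sigma):
    """Concentration at (x, y, z) and time t of q units released
    instantaneously at the origin, advected downwind (+x) at speed
    wind_u and dispersed with equal std dev sigma on each axis."""
    if t <= 0:
        raise ValueError("the puff exists only after release (t > 0)")
    norm = q / ((2.0 * math.pi) ** 1.5 * sigma ** 3)
    r2 = (x - wind_u * t) ** 2 + y ** 2 + z ** 2
    return norm * math.exp(-r2 / (2.0 * sigma ** 2))
```

The peak travels with the wind: at time t the maximum sits at x = wind_u * t, which is what produces the propagation delay the authors evaluate.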

  6. End-to-End Airplane Detection Using Transfer Learning in Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhong Chen

    2018-01-01

    Full Text Available Airplane detection in remote sensing images remains a challenging problem due to the complexity of backgrounds. In recent years, with the development of deep learning, object detection has also seen great breakthroughs. For object detection tasks in natural images, such as the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) VOC (Visual Object Classes) Challenge, the major trend is to pre-train a deep neural network as a base network on a large amount of labeled classification data, and then fine-tune it for detection using a small amount of annotated detection data. In this paper, we apply deep-learning-based object detection to airplane detection in remote sensing images. In addition to exploiting some characteristics of remote sensing images, we propose several new data augmentation techniques. We also use transfer learning and adopt a single deep convolutional neural network with limited training samples to implement end-to-end trainable airplane detection. Classification and positioning are no longer divided into multistage tasks; end-to-end detection attempts to optimize them jointly, which ensures an optimal solution for the final stage. In our experiments, we use remote sensing images of airports collected from Google Earth. The experimental results show that the proposed algorithm is highly accurate and meaningful for remote sensing object detection.

  7. End to end distribution functions for a class of polymer models

    International Nuclear Information System (INIS)

    Khandekar, D.C.; Wiegel, F.W.

    1988-01-01

    The two-point end-to-end distribution functions for a class of polymer models have been obtained within the first cumulant approximation. The trial distribution function for this purpose is chosen to correspond to a general non-local quadratic functional. An exact expression for the trial distribution function is obtained. It is pointed out that these trial distribution functions can themselves be used to study certain aspects of the configurational behaviour of polymers. These distribution functions are also used to obtain the averaged mean square size ⟨R²⟩ of a polymer characterized by the non-local quadratic potential energy functional. Finally, we derive an analytic expression for ⟨R²⟩ of a polyelectrolyte model and show that for a long polymer a weak electrostatic interaction does not change the behaviour of ⟨R²⟩ from that of a free polymer. (author). 16 refs

  8. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  9. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.; Bailey, Ian [Lancaster U.; Herrod, Alexander [Liverpool U.; Morgan, James [Fermilab; Morse, William [RIKEN BNL; Stratakis, Diktys [RIKEN BNL; Tishchenko, Vladimir [RIKEN BNL; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.

  10. End-to-end operations at the National Radio Astronomy Observatory

    Science.gov (United States)

    Radziwill, Nicole M.

    2008-07-01

    In 2006 NRAO launched a formal organization, the Office of End to End Operations (OEO), to broaden access to its instruments (VLA/EVLA, VLBA, GBT and ALMA) in the most cost-effective ways possible. The VLA, VLBA and GBT are mature instruments, and the EVLA and ALMA are currently under construction, which presents unique challenges for integrating software across the Observatory. This article 1) provides a survey of the new developments over the past year, and those planned for the next year, 2) describes the business model used to deliver many of these services, and 3) discusses the management models being applied to ensure continuous innovation in operations, while preserving the flexibility and autonomy of telescope software development groups.

  11. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  12. The role of sea ports in end-to-end maritime transport chain emissions

    International Nuclear Information System (INIS)

    Gibbs, David; Rigot-Muller, Patrick; Mangan, John; Lalwani, Chandra

    2014-01-01

    This paper's purpose is to investigate the role of sea ports in helping to mitigate the GHG emissions associated with the end-to-end maritime transport chain. The analysis is primarily focused on the UK, but is international in application. The paper is based on both the analysis of secondary data and information on actions taken by ports to reduce their emissions, with the latter data collected for the main UK ports via their published reports and/or via interviews. Only a small number of ports (representing 32% of UK port activity) actually measure and report their carbon emissions in the UK context. The emissions generated by ships calling at these ports are analysed using a method based on Department for Transport Maritime Statistics Data. In addition, a case example (Felixstowe) of emissions associated with HGV movements to and from ports is presented, and data on vessel emissions at berth are also considered. Our analyses indicate that emissions generated by ships during their voyages between ports are of a far greater magnitude than those generated by the port activities. Thus while reducing the ports' own emissions is worthwhile, the results suggest that ports might have more impact through focusing their efforts on reducing shipping emissions. - Highlights: • Investigates role of ports in mitigating GHG emissions in the end-to-end maritime transport chain. • Emissions generated both by ports and by ships calling at ports are analysed. • Shipping's emissions are far greater than those generated by port activities. • Ports may have more impact through focusing efforts on reducing shipping's emissions. • Options for ports to support and drive change in the maritime sector also considered

  13. Kinetics of end-to-end collision in short single-stranded nucleic acids.

    Science.gov (United States)

    Wang, Xiaojuan; Nau, Werner M

    2004-01-28

    A novel fluorescence-based method, which entails contact quenching of the long-lived fluorescent state of 2,3-diazabicyclo[2.2.2]-oct-2-ene (DBO), was employed to measure the kinetics of end-to-end collision in short single-stranded oligodeoxyribonucleotides of the type 5'-DBO-(X)n-dG with X = dA, dC, dT, or dU and n = 2 or 4. The fluorophore was covalently attached to the 5' end and dG was introduced as an efficient intrinsic quencher at the 3' terminus. The end-to-end collision rates, which can be directly related to the efficiency of intramolecular fluorescence quenching, ranged from 0.1 to 9.0 × 10^6 s^-1. They were strongly dependent on the strand length, the base sequence, as well as the temperature. Oligonucleotides containing dA in the backbone displayed much slower collision rates and significantly higher positive activation energies than strands composed of pyrimidine bases, suggesting a higher intrinsic rigidity of oligoadenylate. Comparison of the measured collision rates in short single-stranded oligodeoxyribonucleotides with the previously reported kinetics of hairpin formation indicates that the intramolecular collision is significantly faster than the nucleation step of hairpin closing. This is consistent with the configurational diffusion model suggested by Ansari et al. (Ansari, A.; Kuznetsov, S. V.; Shen, Y. Proc. Natl. Acad. Sci. USA 2001, 98, 7771-7776), in which the formation of misfolded loops is thought to slow hairpin formation.

  14. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, W.

    2000-02-22

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at Laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15) or exabytes (10^18) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at Universities and Institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results

  15. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    International Nuclear Information System (INIS)

    Matthews, W.

    2000-01-01

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at Laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15) or exabytes (10^18) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at Universities and Institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results from each project

  16. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    International Nuclear Information System (INIS)

    Sun, Jidi; Menk, Fred; Lambert, Jonathan; Martin, Jarad; Denham, James W; Greer, Peter B; Dowling, Jason; Rivest-Henault, David; Pichler, Peter; Parker, Joel; Arm, Jameen; Best, Leah

    2015-01-01

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning and use it to test geometric and dosimetric aspects of MRI simulation including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid-filled, pelvic-shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold-standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor’s 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT based process. MRI simulation was found to be geometrically accurate with organ dimensions, dose distributions and DRR based setup within acceptable limits compared to CT. (paper)

  17. MO-B-BRB-04: 3D Dosimetry in End-To-End Dosimetry QA

    Energy Technology Data Exchange (ETDEWEB)

    Ibbott, G. [UT MD Anderson Cancer Center (United States)

    2016-06-15

    irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang: Special techniques in the clinic and research Understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and Observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators Geoffrey S. Ibbott: 3D Dosimetry in end-to-end dosimetry QA Understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  18. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    Science.gov (United States)

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.

  19. Availability and End-to-end Reliability in Low Duty Cycle MultihopWireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Timo D. Hämäläinen

    2009-03-01

    Full Text Available A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
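
The claim that per-hop retransmissions can guarantee end-to-end reliability with overhead relative to the PER follows from a standard independence model. The sketch below is that textbook model, not the TUTWSN forwarding algorithm itself; the assumption of independent losses per attempt is mine.

```python
def hop_success(per, retries):
    """Probability that one hop eventually delivers a packet, given
    packet error rate per and up to retries retransmissions
    (retries + 1 attempts total), assuming independent losses."""
    return 1.0 - per ** (retries + 1)

def end_to_end_success(per, retries, hops):
    """Delivery probability over hops independent hops, each using
    the same per-hop retransmission budget."""
    return hop_success(per, retries) ** hops
```

At 30% PER, two retransmissions already lift a single hop to 97.3% delivery, which illustrates why the extra traffic of such a scheme scales with the observed error rate rather than with a fixed retry budget.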

  20. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

    Automatic affect recognition is a challenging task due to the various modalities with which emotions can be expressed. Applications can be found in many domains including multimedia retrieval and human-computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this purpose, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a 50-layer deep residual network (ResNet). In addition to the importance of feature extraction, the machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where - by also taking advantage of the correlations of each of the streams - we manage to significantly outperform traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.

  1. End-to-End Neural Optical Music Recognition of Monophonic Scores

    Directory of Open Access Journals (Sweden)

    Jorge Calvo-Zaragoza

    2018-04-01

    Full Text Available Optical Music Recognition is a field of research that investigates how to computationally decode music notation from images. Despite the efforts made so far, there are hardly any complete solutions to the problem. In this work, we study the use of neural networks that work in an end-to-end manner. This is achieved by using a neural model that combines the capabilities of convolutional neural networks, which work on the input image, and recurrent neural networks, which deal with the sequential nature of the problem. Thanks to the use of the so-called Connectionist Temporal Classification loss function, these models can be directly trained from input images accompanied by their corresponding transcripts into music symbol sequences. We also present the Printed Music Scores dataset, containing more than 80,000 monodic single-staff real scores in common western notation, that is used to train and evaluate the neural approach. In our experiments, it is demonstrated that this formulation can be carried out successfully. Additionally, we study several considerations about the encoding of the output musical sequences, the convergence and scalability of the neural models, as well as the ability of this approach to locate symbols in the input score.
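
The decoding step that the Connectionist Temporal Classification loss enables can be illustrated in a few lines. This is the standard greedy (best-path) collapse rule — merge consecutive repeats, then drop blanks — applied to per-frame symbol decisions; the token names below are made up for the example and do not come from the dataset.

```python
BLANK = "-"

def ctc_greedy_decode(frame_symbols):
    """Collapse framewise outputs of a CTC-trained model into a
    symbol sequence: merge consecutive repeats, then remove blanks."""
    decoded = []
    prev = None
    for sym in frame_symbols:
        if sym != prev and sym != BLANK:
            decoded.append(sym)
        prev = sym
    return decoded
```

For instance, the frame sequence `["-", "clef.G", "clef.G", "-", "note.C4", "note.C4", "-", "note.C4"]` collapses to `["clef.G", "note.C4", "note.C4"]`: the blank between the repeated `note.C4` runs is what lets the model emit the same symbol twice in a row.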

  2. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Directory of Open Access Journals (Sweden)

    Luis Gutierrez-Heredia

    Full Text Available Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, while only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also the single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony and polyp scale analysis.

  3. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Science.gov (United States)

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, while only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also the single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony and polyp scale analysis.
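
The Surface Area and Volume measurements described above can be computed directly from an exported triangle mesh. Below is a minimal sketch under the assumption of a closed mesh with consistently outward-oriented faces (the divergence-theorem trick for volume); in practice one would run this on the cleaned-up meshes produced by tools such as 123D Catch or Meshmixer.

```python
import math

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def sub(u, v):
    return (u[0]-v[0], u[1]-v[1], u[2]-v[2])

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]

def mesh_surface_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh
    with outward-oriented faces, via per-triangle cross products and
    the divergence theorem for the volume."""
    area = 0.0
    volume = 0.0
    for i, j, k in faces:
        a, b, c = vertices[i], vertices[j], vertices[k]
        n = cross(sub(b, a), sub(c, a))       # face normal, length = 2 * area
        area += 0.5 * math.sqrt(dot(n, n))
        volume += dot(a, cross(b, c)) / 6.0   # signed tetrahedron volume
    return area, abs(volume)
```

On a unit right tetrahedron this returns the expected area of 1.5 + √3/2 and volume of 1/6, which is the kind of sanity check one would run before trusting the measurements on a scanned colony.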

  4. Mechanics of spatulated end-to-end artery-to-vein anastomoses.

    Science.gov (United States)

    Morasch, M D; Dobrin, P B; Dong, Q S; Mrkvicka, R

    1998-01-01

    It previously has been shown that in straight end-to-end artery-to-vein anastomoses, maximum dimensions are obtained with an interrupted suture line. Nearly equivalent dimensions are obtained with a continuous compliant polybutester suture (Novafil), and the smallest dimensions are obtained with a continuous noncompliant polypropylene suture (Surgilene). The present study was undertaken to examine these suture techniques in a spatulated or beveled anastomosis in living dogs. Anastomoses were constructed using continuous 6-0 polypropylene (Surgilene), continuous 6-0 polybutester (Novafil), or interrupted 6-0 polypropylene or polybutester. Thirty minutes after construction, the artery, vein, and beveled anastomoses were excised, restored to in situ length and pressurized with the lumen filled with a dilute suspension of barium sulfate. High resolution radiographs were obtained at 25 mmHg pressure increments up to 200 mmHg. Dimensions and compliance were determined from the radiographic images. Results showed that, unlike straight artery-to-vein anastomoses, there were no differences in the dimensions or compliance of spatulated anastomoses with continuous Surgilene, continuous Novafil, or interrupted suture techniques. Therefore a continuous suture technique is acceptable when constructing spatulated artery-to-vein anastomoses in patients.
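
For readers unfamiliar with the compliance metric used in such studies, the sketch below shows one common pressure-diameter definition (percent diameter change per mmHg over a pressure increment), computed from radiographic diameters. The abstract does not give the authors' exact formula, so treat this as an illustrative assumption.

```python
def compliance_pct_per_mmhg(d_low, d_high, p_low, p_high):
    """Pressure-diameter compliance over one pressure increment:
    percent diameter change per mmHg, using the mean diameter over
    the increment as the reference."""
    mean_d = (d_low + d_high) / 2.0
    return 100.0 * ((d_high - d_low) / mean_d) / (p_high - p_low)
```

For example, a vessel that dilates from 10.0 mm to 10.5 mm between 100 and 125 mmHg has a compliance of about 0.195 %/mmHg; a stiffer (noncompliant) suture line would show smaller values across the anastomosis.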

  5. Impacts of the Deepwater Horizon oil spill evaluated using an end-to-end ecosystem model.

    Science.gov (United States)

    Ainsworth, Cameron H; Paris, Claire B; Perlin, Natalie; Dornberger, Lindsey N; Patterson, William F; Chancellor, Emily; Murawski, Steve; Hollander, David; Daly, Kendra; Romero, Isabel C; Coleman, Felicia; Perryman, Holly

    2018-01-01

    We use a spatially explicit biogeochemical end-to-end ecosystem model, Atlantis, to simulate impacts from the Deepwater Horizon oil spill and subsequent recovery of fish guilds. Dose-response relationships with expected oil concentrations were utilized to estimate the impact on fish growth and mortality rates. We also examine the effects of fisheries closures and impacts on recruitment. We validate predictions of the model by comparing population trends and age structure before and after the oil spill with fishery-independent data. The model suggests that recruitment effects and fishery closures had little influence on biomass dynamics. However, at the assumed level of oil concentrations and toxicity, impacts on fish mortality and growth rates were large and commensurate with observations. Sensitivity analysis suggests the biomass of large reef fish decreased by 25% to 50% in areas most affected by the spill, and biomass of large demersal fish decreased even more, by 40% to 70%. Impacts on reef and demersal forage caused starvation mortality in predators and increased reliance on pelagic forage. Impacts on the food web translated effects of the spill far away from the oiled area. Effects on age structure suggest possible delayed impacts on fishery yields. Recovery of high-turnover populations generally is predicted to occur within 10 years, but some slower-growing populations may take 30+ years to fully recover.
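    The dose-response step described above can be sketched with a generic sigmoidal (Hill-type) curve scaling a guild's baseline mortality rate. The functional form and every parameter here are illustrative stand-ins, not the fitted relationships actually used in the Atlantis runs:

```python
def dose_response(conc, ec50, hill=1.0):
    """Fraction of the maximum toxic effect at a given oil concentration:
    a generic Hill curve, 0 at zero dose, 0.5 at the EC50, approaching 1
    at high dose (hypothetical stand-in for the study's fitted curves)."""
    if conc <= 0:
        return 0.0
    return conc ** hill / (ec50 ** hill + conc ** hill)

def adjusted_mortality(base_rate, conc, ec50, max_extra=1.0):
    """Scale a baseline mortality rate by the dose-response effect."""
    return base_rate * (1.0 + max_extra * dose_response(conc, ec50))

# At exactly the EC50, half the maximum extra mortality applies.
m = adjusted_mortality(0.2, conc=5.0, ec50=5.0)
```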

  6. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Full Text Available Software-defined networks (SDN) are a novel networking paradigm that has become an enabler technology for many modern applications, such as network virtualization and policy-based access control. Software provides flexibility and fast-paced innovation in networking; however, it is complex by nature, so there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant is leaked to other tenants despite their sharing the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which one can reason about confidentiality and check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.
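    At its core, the confidentiality property sketched above amounts to checking that information never flows downward in a security lattice. A toy illustration of that check; the paper's semantic model is far richer, and every name here is hypothetical:

```python
LEVELS = {"low": 0, "high": 1}  # two-point confidentiality lattice

def flow_allowed(src_level, dst_level):
    """Information may flow only upward in the lattice (low -> high)."""
    return LEVELS[src_level] <= LEVELS[dst_level]

def violations(flows):
    """flows: list of ((src_port, src_level), (dst_port, dst_level)) pairs.
    Returns the flows that would leak confidential data downward."""
    return [f for f in flows if not flow_allowed(f[0][1], f[1][1])]

# One tenant's confidential traffic must not reach another tenant's
# public-facing port over the shared physical infrastructure.
flows = [(("tenantA_db", "high"), ("tenantB_web", "low")),   # leak
         (("tenantB_web", "low"), ("tenantA_db", "high"))]   # permitted
bad = violations(flows)
```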

  7. Circumferential resection and "Z"-shape plastic end-to-end anastomosis of canine trachea.

    Science.gov (United States)

    Zhao, H; Li, Z; Fang, J; Fang, C

    1999-03-01

    To prevent anastomotic stricture of the trachea. Forty young mongrel dogs, weighing 5-7 kg, were randomly divided into two groups, experimental and control, with 20 dogs in each. Four tracheal rings were removed from each dog. In the experimental group, two "Z"-shape tracheoplastic anastomoses were performed on each dog, one on the anterior wall and the other on the membranous part of the trachea. In the control group, each dog received only simple end-to-end anastomosis. Vicryl 3-0 absorbable suture and OB fibrin glue were used for both groups. All dogs were killed when their body weight doubled. The average sagittal stenotic ratios were 1.20 +/- 0.12 for the experimental group and 0.83 +/- 0.05 for the control group. The average cross-sectional area stenotic ratios were 0.90 +/- 0.12 and 0.69 +/- 0.09, and T values were 8.71 and 4.57 for the two groups (differences statistically significant), indicating that "Z"-shape tracheoplastic anastomosis is superior to simple end-to-end anastomosis in preventing anastomotic stricture of the canine trachea.

  8. Mucociliary clearance following tracheal resection and end-to-end anastomosis.

    Science.gov (United States)

    Toomes, H; Linder, A

    1989-10-01

    Mucociliary clearance is an important cleaning system of the bronchial tree. This complex transport system reacts sensitively to medicinal stimuli and inhaled substances; a disturbance causes secretion retention, which encourages the development of acute and chronic pulmonary diseases. It is not yet known how sectional resection of the central airway affects mucociliary clearance, and a large number of surgical failures are attributable to septic complications in the area of the anastomosis. In order to study the transport process over the anastomosis, ten dogs underwent tracheal resection with end-to-end anastomosis, and the mucociliary activity was recorded using a bronchoscopic video-technical method. Recommencement of mucous transport was observed on the third postoperative day, and transport over the anastomosis from the sixth to the tenth. Mucociliary clearance had completely recovered by the twenty-first day in the majority of dogs. Histological examination of the anastomoses nine months postoperatively showed a flat substitute epithelium without cilia-bearing cells in all dogs, which contrasts with the quick restitution of the transport function. Provided the respiratory mucosa is undamaged, good adaptation of the resection margins suffices for the mucous film to slide over the anastomosis.

  9. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows seamless detection of such patterns over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing from event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundary that exists between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
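    The temporal-boundary issue the authors analyze is that a query spanning both the archive and the live stream must see each event exactly once at the hand-off point. A minimal sketch of that idea, not of SCEPter's actual query-rewriting machinery:

```python
def unified_events(archive, stream, boundary_ts):
    """Answer a query over past and future events exactly once: take
    archived events up to and including the boundary timestamp, and
    streamed events strictly after it, so overlap at the boundary is
    neither duplicated nor dropped."""
    past = [e for e in archive if e["ts"] <= boundary_ts]
    live = [e for e in stream if e["ts"] > boundary_ts]
    return sorted(past + live, key=lambda e: e["ts"])

archive = [{"ts": 1, "v": "a"}, {"ts": 2, "v": "b"}]
stream = [{"ts": 2, "v": "b"}, {"ts": 3, "v": "c"}]  # ts=2 is in both
events = unified_events(archive, stream, boundary_ts=2)
```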

  10. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
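    The "1.5% plus 1.5 mm" result above follows the common percentage-plus-absolute recipe for proton range margins. As a quick illustration (the depths below are hypothetical, not measurements from the study):

```python
def range_margin_mm(depth_mm, rel=0.015, abs_mm=1.5):
    """Range uncertainty margin as a relative fraction of the beam range
    plus a fixed absolute term: the 1.5% + 1.5 mm figure from the study."""
    return rel * depth_mm + abs_mm

margin_at_100mm = range_margin_mm(100.0)  # 1.5% of 100 mm + 1.5 mm = 3.0 mm
margin_at_200mm = range_margin_mm(200.0)  # 1.5% of 200 mm + 1.5 mm = 4.5 mm
```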

  11. Telomere dynamics, end-to-end fusions and telomerase activation during the human fibroblast immortalization process.

    Science.gov (United States)

    Ducray, C; Pommier, J P; Martins, L; Boussin, F D; Sabatier, L

    1999-07-22

    Loss of telomeric repeats during cell proliferation could play a role in senescence. It has generally been assumed that activation of telomerase prevents further telomere shortening and is essential for cell immortalization. In this study, we performed a detailed cytogenetic and molecular characterization of four SV40-transformed human fibroblastic cell lines by regularly monitoring the size distribution of terminal restriction fragments (TRF), telomerase activity and the associated chromosomal instability throughout immortalization. The mean TRF lengths progressively decreased in pre-crisis cells during the lifespan of the cultures. At crisis, telomeres reached a critical size, different among the cell lines, contributing to the peak of dicentric chromosomes, which resulted mostly from telomeric associations. We observed a direct correlation between short telomere length at crisis and chromosomal instability. In two immortal cell lines, although telomerase was detected, mean telomere length continued to decrease whereas the number of associated dicentric chromosomes stabilized. Thus telomerase could specifically protect telomeres which have reached a critical size against end-to-end fusions, while long telomeres continue to shorten, although at a slower rate than before crisis. This suggests a balance between elongation by telomerase and telomere shortening, towards a stabilized 'optimal' length.

  12. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    Science.gov (United States)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High-speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing durations cause performance degradation. This requires testers (the persons who perform the tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error-prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
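    Once timestamps at the observation points are synchronized, a node's holding duration for each packet is just the per-packet timestamp difference between the captures on either side of it. A minimal sketch (the packet IDs and field names are hypothetical):

```python
def holding_durations(ingress, egress):
    """ingress/egress: {packet_id: timestamp_seconds} recorded at the two
    observation points surrounding a node. Returns per-packet holding time;
    packets missing from the egress capture (e.g. dropped) are omitted."""
    return {pid: egress[pid] - t_in
            for pid, t_in in ingress.items() if pid in egress}

ingress = {"pkt1": 0.000, "pkt2": 0.010, "pkt3": 0.020}
egress = {"pkt1": 0.004, "pkt2": 0.011}   # pkt3 never left the node
durations = holding_durations(ingress, egress)
```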

  13. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    Science.gov (United States)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and describe Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  14. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    Science.gov (United States)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  15. Design and end-to-end modelling of a deployable telescope

    Science.gov (United States)

    Dolkens, Dennis; Kuiper, Hans

    2017-09-01

    a closed-loop system based on measurements of the image sharpness as well as measurements obtained with edge sensors placed between the mirror segments. In addition, a phase diversity system will be used to recover residual wavefront aberrations. To aid the design of the deployable telescope, an end-to-end performance model was developed. The model is built around a dedicated ray-trace program written in Matlab. This program was built from the ground up for the purpose of modelling segmented telescope systems and allows for surface data computed with Finite Element Models (FEM) to be imported in the model. The program also contains modules which can simulate the closed-loop calibration of the telescope and it can use simulated images as an input for phase diversity and image processing algorithms. For a given thermo-mechanical state, the end-to-end model can predict the image quality that will be obtained after the calibration has been completed and the image has been processed. As such, the model is a powerful systems engineering tool, which can be used to optimize the in-orbit performance of a segmented, deployable telescope.
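    Closing the loop on "measurements of the image sharpness", as described above, typically means maximizing a scalar merit function of the detector image. A common choice is the mean squared gradient, sketched here on plain nested lists; this is illustrative, not the authors' exact metric:

```python
def sharpness(img):
    """Mean squared intensity difference between neighboring pixels:
    higher for crisp edges, lower when the same edge is blurred, so a
    calibration loop can maximize it."""
    rows, cols = len(img), len(img[0])
    total = 0.0
    for r in range(rows):                 # horizontal neighbors
        for c in range(cols - 1):
            total += (img[r][c + 1] - img[r][c]) ** 2
    for r in range(rows - 1):             # vertical neighbors
        for c in range(cols):
            total += (img[r + 1][c] - img[r][c]) ** 2
    return total / (rows * cols)

crisp = [[0, 0, 1, 1] for _ in range(4)]        # sharp vertical edge
blurry = [[0, 1/3, 2/3, 1] for _ in range(4)]   # same edge smeared out
```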

  16. Status report of the end-to-end ASKAP software system: towards early science operations

    Science.gov (United States)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

    300 MHz bandwidth for Array Release 1; followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and is integrated into the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system from preparing observations to data acquisition, processing and archiving; and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.

  17. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture, making them easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  18. jade: An End-To-End Data Transfer and Catalog Tool

    Science.gov (United States)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  19. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    Science.gov (United States)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser-assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without a few supporting single stitches. We present an in vivo experimental study of an innovative end-to-end laser-assisted vascular anastomosis (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and the mechanical properties of ICG-infused chitosan films were preliminarily measured. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed and clamped proximally and distally. The artery was then interrupted by means of a full-thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamp release. The animals then underwent different follow-up periods in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no bleeding signs were documented. Samples of welded vessels underwent histological examination. Results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelization and an optimal vascular healing process.

  20. NCAR Earth Observing Laboratory - An End-to-End Observational Science Enterprise

    Science.gov (United States)

    Rockwell, A.; Baeuerle, B.; Grubišić, V.; Hock, T. F.; Lee, W. C.; Ranson, J.; Stith, J. L.; Stossmeister, G.

    2017-12-01

    Researchers who want to understand and describe the Earth System require high-quality observations of the atmosphere, ocean, and biosphere. Making these observations not only requires capable research platforms and state-of-the-art instrumentation but also benefits from comprehensive in-field project management and data services. NCAR's Earth Observing Laboratory (EOL) is an end-to-end observational science enterprise that provides leadership in observational research to scientists from universities, U.S. government agencies, and NCAR. Deployment: EOL manages the majority of the NSF Lower Atmosphere Observing Facilities, which includes research aircraft, radars, lidars, profilers, and surface and sounding systems. This suite is designed to address a wide range of Earth system science - from microscale to climate process studies and from the planet's surface into the Upper Troposphere/Lower Stratosphere. EOL offers scientific, technical, operational, and logistics support to small and large field campaigns across the globe. Development: By working closely with the scientific community, EOL's engineering and scientific staff actively develop the next generation of observing facilities, staying abreast of emerging trends, technologies, and applications in order to improve our measurement capabilities. Through our Design and Fabrication Services, we also offer high-level engineering and technical expertise, mechanical design, and fabrication to the atmospheric research community. Data Services: EOL's platforms and instruments collect unique datasets that must be validated, archived, and made available to the research community. EOL's Data Management and Services deliver high-quality datasets and metadata in ways that are transparent, secure, and easily accessible. We are committed to the highest standard of data stewardship from collection to validation to archival. Discovery: EOL promotes curiosity about Earth science, and fosters advanced understanding of the

  1. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA / NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  2. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    International Nuclear Information System (INIS)

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-01-01

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg² (or 6-65 deg²) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives.
We conclude with strategies that would

  3. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, we will use WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
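    The in-the-field back end described above stores incoming readings in an ODM-style relational schema. A much-simplified SQLite sketch of that idea; the real DjangODM/PostgreSQL schema is far richer, and the table, station and variable names here are hypothetical:

```python
import sqlite3

# One flat "data values" table standing in for the ODM observation tables.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE datavalues (
    site_code TEXT, variable_code TEXT, ts TEXT, value REAL)""")

# Readings as they might arrive from an Arduino-based sensor node.
readings = [
    ("STATION1", "RainfallTot", "2014-09-01T00:00:00Z", 2.5),
    ("STATION1", "AirTemp",     "2014-09-01T00:00:00Z", 27.1),
]
conn.executemany("INSERT INTO datavalues VALUES (?, ?, ?, ?)", readings)

# A web service layer (WOFpy in the proposed system) would serve queries
# like this one back out as WaterML.
(rain,) = conn.execute(
    "SELECT value FROM datavalues WHERE variable_code = 'RainfallTot'"
).fetchone()
```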

  4. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and capabilities for submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geoscience data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community can be elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and with content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time, and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide reviews or opinions of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  5. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  6. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): on-the-fly type mapping and out-of-band communication. Type conversion takes place both for data and metadata because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful technique, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access is integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C, and Java libraries that provide functions for newly written software; the netCDF library, which enables existing applications to read from servers using an older interface; and simple
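The constraint-expression mechanism mentioned above can be illustrated by composing a DAP2-style subsetting request; the server address, dataset, and variable name below are hypothetical, and only the `[start:stride:stop]` bracket syntax reflects the actual DAP2 convention.

```python
# Sketch: build an OPeNDAP (DAP2) request URL with a constraint expression
# that asks the server to return only a hyperslab of one variable, rather
# than the full array. Server, dataset, and variable names are invented.

def dap_url(base, dataset, var, ranges):
    """Compose a DAP2 request for var[start:stride:stop] per dimension."""
    ce = var + "".join(f"[{a}:{s}:{b}]" for a, s, b in ranges)
    return f"{base}/{dataset}.dods?{ce}"

url = dap_url("http://example.org/opendap", "sst_monthly.nc", "sst",
              [(0, 1, 11), (10, 1, 20), (30, 1, 40)])
print(url)  # http://example.org/opendap/sst_monthly.nc.dods?sst[0:1:11][10:1:20][30:1:40]
```

Pushing the subsetting to the server this way is what lets a client like Kepler change the shape and size of the returned data without any client-side trimming.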

  7. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness and preparedness towards tsunami threats. Additionally, examples will be given on the potential of an operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  8. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning, but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject of the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication with the on-orbit system. But also the operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are subject of the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process which is rigid enough to provide reliable verification of on-board safety and which likewise provides enough

  9. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

10. An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of ...

    African Journals Online (AJOL)

An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of the southern Benguela foodweb: parameterisation, calibration and pattern-oriented validation. ... We also highlight the capacity of this model for tracking indicators at various hierarchical levels. Keywords: individual-based model, model validation, ...

  11. GROWTH OF THE HYPOPLASTIC AORTIC-ARCH AFTER SIMPLE COARCTATION RESECTION AND END-TO-END ANASTOMOSIS

    NARCIS (Netherlands)

    BROUWER, MHJ; CROMMEDIJKHUIS, AH; EBELS, T; EIJGELAAR, A

    Surgical treatment of a hypoplastic aortic arch associated with an aortic coarctation is controversial. The controversy concerns the claimed need to surgically enlarge the diameter of the hypoplastic arch, in addition to resection and end-to-end anastomosis. The purpose of this prospective study is

  12. SciBox, an end-to-end automated science planning and commanding system

    Science.gov (United States)

    Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott

    2014-01-01

    SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.
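The trade-space evaluation under downlink constraints that the abstract describes can be sketched as a toy greedy scheduler; this is not the SciBox algorithm, and every opportunity name, priority, and data volume below is invented for illustration.

```python
# Hypothetical sketch: pick observing opportunities by science priority
# while respecting a downlink budget, in the spirit of automated,
# rule-driven scheduling. Names and numbers are illustrative only.

def schedule(opportunities, downlink_budget_mb):
    """Greedily select opportunities in descending priority order,
    skipping any whose data volume would exceed the remaining budget."""
    chosen, used = [], 0.0
    for opp in sorted(opportunities, key=lambda o: o["priority"], reverse=True):
        if used + opp["data_mb"] <= downlink_budget_mb:
            chosen.append(opp["name"])
            used += opp["data_mb"]
    return chosen, used

opps = [
    {"name": "color_map_obs", "priority": 9, "data_mb": 120.0},
    {"name": "limb_scan_obs", "priority": 7, "data_mb": 40.0},
    {"name": "altimetry_obs", "priority": 8, "data_mb": 300.0},
]
plan, used = schedule(opps, downlink_budget_mb=200.0)
print(plan, used)  # ['color_map_obs', 'limb_scan_obs'] 160.0
```

A real system like SciBox layers many more rules (orbital geometry, recorder state, instrument constraints) on top of such a selection loop and validates the result by simulation.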

  13. Advanced Camera Image Cropping Approach for CNN-Based End-to-End Controls on Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2018-03-01

Recent research on deep learning has been applied to a diversity of fields. In particular, numerous studies have been conducted on self-driving vehicles using end-to-end approaches based on images captured by a single camera. End-to-end controls learn the output vectors of output devices directly from the input vectors of available input devices. In other words, an end-to-end approach learns not by analyzing the meaning of input vectors, but by extracting optimal output vectors based on input vectors. Generally, when end-to-end control is applied to self-driving vehicles, the steering wheel and pedals are controlled autonomously by learning from the images captured by a camera. However, high-resolution images captured from a car cannot be directly used as inputs to Convolutional Neural Networks (CNNs) owing to memory limitations; the image size needs to be efficiently reduced. Therefore, it is necessary to extract features from captured images automatically and to generate input images by merging the parts of the images that contain the extracted features. This paper proposes a learning method for end-to-end control that generates input images for CNNs by extracting road parts from input images, identifying the edges of the extracted road parts, and merging the parts of the images that contain the detected edges. In addition, a CNN model for end-to-end control is introduced. Experiments involving the Open Racing Car Simulator (TORCS), a sustainable computing environment for cars, confirmed the effectiveness of the proposed method for self-driving by comparing the accumulated difference in the angle of the steering wheel in the images generated by it with those of resized images containing the entire captured area and cropped images containing only a part of the captured area. The results showed that the proposed method reduced the accumulated difference by 0.839% and 0.850% compared to those yielded by the resized images and cropped images
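The core preprocessing idea, detecting edges and cropping the frame to the region that contains them, can be sketched in a few lines; this is an illustrative simplification in pure Python, not the paper's actual road-extraction pipeline, and the threshold and synthetic frame are invented.

```python
# Sketch: find strong horizontal-gradient edges in a grayscale frame and
# crop the frame to the bounding box of those edges, so the CNN input is
# smaller than the raw capture. Threshold and test frame are illustrative.

def crop_to_edges(img, threshold=50):
    """img: 2-D list of grayscale values. Returns the sub-image bounding
    all adjacent-pixel differences stronger than `threshold`."""
    coords = [(y, x)
              for y, row in enumerate(img)
              for x in range(len(row) - 1)
              if abs(row[x + 1] - row[x]) > threshold]
    if not coords:
        return img  # no edges detected: fall back to the full frame
    ys = [y for y, _ in coords]
    xs = [x for _, x in coords]
    # An edge at column x involves pixels x and x+1, hence the +2 slice end.
    return [row[min(xs):max(xs) + 2] for row in img[min(ys):max(ys) + 1]]

# A 4x6 synthetic frame with a bright "lane marking" in the middle rows.
frame = [
    [10, 10, 10, 10, 10, 10],
    [10, 10, 200, 200, 10, 10],
    [10, 10, 200, 200, 10, 10],
    [10, 10, 10, 10, 10, 10],
]
cropped = crop_to_edges(frame)
print(len(cropped), len(cropped[0]))  # 2 4
```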

14. End-to-end simulation of the C-ADS Injector II with a 3-D field map

    International Nuclear Information System (INIS)

    Wang Zhijun; He Yuan; Li Chao; Wang Wangsheng; Liu Shuhui; Jia Huan; Xu Xianbo; Chen Ximeng

    2013-01-01

    The Injector II, one of the two parallel injectors of the high-current superconducting proton driver linac for the China Accelerator-Driven System (C-ADS) project, is being designed and constructed by the Institute of Modern Physics. At present, the design work for the injector is almost finished. End-to-end simulation has been carried out using the TRACK multiparticle simulation code to check the match between each acceleration section and the performance of the injector as a whole. Moreover, multiparticle simulations with all kinds of errors and misalignments have been performed to define the requirements of each device. The simulation results indicate that the lattice design is robust. In this paper, the results of end-to-end simulation and error simulation with a 3-D field map are presented. (authors)

  15. Exploring the requirements for multimodal interaction for mobile devices in an end-to-end journey context.

    Science.gov (United States)

    Krehl, Claudia; Sharples, Sarah

    2012-01-01

The paper investigates the requirements for multimodal interaction on mobile devices in an end-to-end journey context. Traditional interfaces are deemed cumbersome and inefficient for exchanging information with the user. Multimodal interaction provides a different, user-centred approach, allowing for more natural and intuitive interaction between humans and computers. It is especially suitable for mobile interaction, as it can overcome additional constraints including small screens, awkward keypads, and continuously changing settings - an inherent property of mobility. This paper is based on end-to-end journeys, where users encounter several contexts during their journeys. Interviews and focus groups explore the requirements for multimodal interaction design for mobile devices by examining journey stages and identifying the users' information needs and sources. Findings suggest that multimodal communication is crucial when users multitask. Choosing suitable modalities depends on the user's context, characteristics, and tasks.

  16. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

This thesis presents a uniquely designed, high-performance I/O stack that minimizes end-to-end interference across multi-level shared buffer cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  17. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    OpenAIRE

    Madani Sajjad; Nazir Babar; Hasbullah Halabi

    2011-01-01

We present a sleep/wake schedule protocol for minimizing end-to-end delay for event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors, (a) the distance of the node from the sink node, (b) the importance of the node's location from connectivity's perspective, and...

  18. Multi-institutional evaluation of end-to-end protocol for IMRT/VMAT treatment chains utilizing conventional linacs.

    Science.gov (United States)

    Loughery, Brian; Knill, Cory; Silverstein, Evan; Zakjevskii, Viatcheslav; Masi, Kathryn; Covington, Elizabeth; Snyder, Karen; Song, Kwang; Snyder, Michael

    2018-03-20

We conducted a multi-institutional assessment of a recently developed end-to-end monthly quality assurance (QA) protocol for external beam radiation therapy treatment chains. This protocol validates the entire treatment chain against a baseline to detect the presence of complex errors not easily found in standard component-based QA methods. Participating physicists from 3 institutions ran the end-to-end protocol on treatment chains that include Imaging and Radiation Oncology Core (IROC)-credentialed linacs. Results were analyzed in the form of American Association of Physicists in Medicine (AAPM) Task Group (TG)-119 so that they may be referenced by future test participants. Optically stimulated luminescent dosimeter (OSLD), EBT3 radiochromic film, and A1SL ion chamber readings were accumulated across 10 test runs. Confidence limits were calculated to determine where 95% of measurements should fall. From the calculated confidence limits, 95% of measurements should be within 5% error for OSLDs, 4% error for ionization chambers, and 4% error (a 96% relative gamma pass rate) for radiochromic film at 3% agreement/3 mm distance to agreement. Data were separated by institution, model of linac, and treatment protocol (intensity-modulated radiation therapy [IMRT] vs volumetric modulated arc therapy [VMAT]). A total of 97% of OSLDs, 98% of ion chambers, and 93% of films were within the confidence limits; measurements were found outside these limits by a maximum of 4%, consistent despite institutional differences in OSLD reading equipment and radiochromic film calibration techniques. Results from this test may be used by clinics for data comparison. Areas of improvement were identified in the end-to-end protocol that can be implemented in an updated version. The consistency of our data demonstrates the reproducibility and ease of use of such tests and suggests a potential role for their use in broad end-to-end QA initiatives. Copyright © 2018 American Association of Medical
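The confidence-limit calculation above follows the TG-119 convention, CL = |mean error| + 1.96 × SD, so that roughly 95% of measurements fall inside the limit. A minimal sketch, where the ten dose-error values are made up for illustration:

```python
import statistics

def tg119_confidence_limit(errors_pct):
    """TG-119 style confidence limit: |mean| + 1.96 * sample SD (%)."""
    mean = statistics.fmean(errors_pct)
    sd = statistics.stdev(errors_pct)
    return abs(mean) + 1.96 * sd

# Illustrative OSLD dose errors (%) over ten monthly runs (invented data).
osld_errors = [1.2, -0.8, 0.5, 1.9, -1.1, 0.3, 2.0, -0.4, 1.5, 0.7]
cl = tg119_confidence_limit(osld_errors)
print(round(cl, 2))  # 2.72
```

A clinic comparing against the multi-institutional baselines would check whether its own confidence limit stays under the published action level (e.g. 5% for OSLDs).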

  19. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

We present a sleep/wake schedule protocol for minimizing end-to-end delay for event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from connectivity's perspective, and (c) if the node is in the proximity where an event occurs. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes the throughput by minimizing the congestion at nodes having heavy traffic load. Simulations are carried out to evaluate the performance of the proposed protocol, by comparing its performance with the S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol has significantly reduced the end-to-end delay, as well as improved other QoS parameters, like average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.
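The three heuristics (a)-(c) can be sketched as a duty-cycle computation; the weights, functional form, and caps below are invented assumptions for illustration, not the protocol's actual rules.

```python
# Hypothetical sketch: a node's wake duty cycle grows when it is close to
# the sink (relays more traffic), is critical for connectivity, or is near
# an event. All weights and caps are illustrative, not from the paper.

def wake_fraction(hops_to_sink, is_cut_vertex, near_event,
                  base=0.1, max_fraction=0.9):
    score = base
    score += 0.4 / max(hops_to_sink, 1)   # (a) closer to sink -> more traffic
    score += 0.2 if is_cut_vertex else 0  # (b) critical for connectivity
    score += 0.3 if near_event else 0     # (c) in the event's proximity
    return min(score, max_fraction)

# A sink-adjacent articulation node near an event stays awake the most;
# a distant, non-critical, idle node sleeps almost all the time.
print(wake_fraction(1, True, True))
print(wake_fraction(5, False, False))
```

Keeping busy nodes awake longer is exactly the congestion-avoidance trade-off the abstract describes: lower delay at the hot spots in exchange for extra energy spent there.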

  20. Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media

    Science.gov (United States)

    Park, Ju-Won; Kim, JongWon

    2004-10-01

As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter, and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, the applicability of a hybrid monitoring scheme that combines active and passive monitoring is investigated. The active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while the passive monitoring checks both application-layer metrics (i.e., user traffic condition, by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve the end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
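The cross-checking step, comparing active-probe results against passive application and system metrics to localize a problem, can be sketched as a small decision rule; the thresholds and labels are invented for illustration and are not from the paper.

```python
# Hypothetical sketch of hybrid diagnosis: active probes give network-layer
# loss, passive RTCP analysis gives application-layer loss, and system
# monitoring gives CPU load. Thresholds and labels are illustrative.

def diagnose(probe_loss_pct, rtcp_loss_pct, cpu_util_pct):
    if cpu_util_pct > 90:
        return "host overload"           # system metric dominates
    if probe_loss_pct > 2 and rtcp_loss_pct > 2:
        return "network congestion"      # both layers see loss
    if rtcp_loss_pct > 2:
        return "application bottleneck"  # only the media stream suffers
    return "healthy"

print(diagnose(0.1, 5.0, 40))  # application bottleneck
```

The point of combining the two monitoring modes is visible in the second and third branches: probe loss alone cannot distinguish a congested path from an overloaded receiver pipeline.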

  1. Debris mitigation measures by satellite design and operational methods - Findings from the DLR space debris End-to-End Service

    Science.gov (United States)

    Sdunnus, H.; Beltrami, P.; Janovsky, R.; Koppenwallner, G.; Krag, H.; Reimerdes, H.; Schäfer, F.

Debris mitigation has been recognised as an issue to be addressed by the space-faring nations around the world. Currently, there are various activities going on, aiming at the establishment of debris mitigation guidelines on various levels, reaching from the UN down to national space agencies. Though guidelines established on the national level already provide concrete information on how things should be done (rather than specifying what should be done or providing fundamental principles), potential users of the guidelines will still have the need to explore the technical, management, and financial implications of the guidelines for their projects. Those questions are addressed by the so-called "Space Debris End-to-End Service" project, which has been initiated as a national initiative of the German Aerospace Centre (DLR). Based on a review of already existing mitigation guidelines or guidelines under development, and following an identification of needs from a circle of industrial users, the "End-to-End Service Guidelines" have been established for designers and operators of spacecraft. The End-to-End Service Guidelines are based on requirements addressed by the mitigation guidelines and provide recommendations on how and when the technical consideration of the mitigation guidelines should take place. By referencing requirements from the mitigation guidelines, the End-to-End Service Guidelines address the consideration of debris mitigation measures by spacecraft design and operational measures. This paper will give an introduction to the End-to-End Service Guidelines. It will focus on the proposals made for mitigation measures by the S/C system design, i.e. on protective design measures inside the spacecraft and on design measures, e.g. innovative protective (shielding) systems. Furthermore, approaches to the analytical optimisation of protective systems will be presented, aiming at the minimisation of shield mass while conserving the protective effects. On the

  2. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

We propose and verify methods based on the slip-spring (SSp) model [Macromolecules 2005, 38, 14] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates, by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [Macromolecules 1991, 24, 3587] and refine the effective chain friction in the thin and fat tubes based on Read et al. [J. Rheol. 2012, 56, 823]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.

  3. Crosstalk in an FDM Laboratory Setup and the Athena X-IFU End-to-End Simulator

    Science.gov (United States)

    den Hartog, R.; Kirsch, C.; de Vries, C.; Akamatsu, H.; Dauser, T.; Peille, P.; Cucchetti, E.; Jackson, B.; Bandler, S.; Smith, S.; Wilms, J.

    2018-04-01

    The impact of various crosstalk mechanisms on the performance of the Athena X-IFU instrument has been assessed with detailed end-to-end simulations. For the crosstalk in the electrical circuit, a detailed model has been developed. In this contribution, we test this model against measurements made with an FDM laboratory setup and discuss the assumption of deterministic crosstalk in the context of the weak link effect in the detectors. We conclude that crosstalk levels predicted by the model are conservative with respect to the observed levels.

  4. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated, end-to-end, multi-agent-based solution to provide the best-fit web service to the service requester based on QoS.
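Best-fit selection over QoS attributes is commonly done by normalizing each attribute and scoring services with a weighted sum; the sketch below uses that simple additive weighting scheme, and the services, attributes, and weights are invented, not the paper's agent architecture.

```python
# Sketch: QoS-based best-fit selection via simple additive weighting.
# Attributes are normalized to [0, 1]; "cost-type" attributes such as
# response time are inverted so larger scores are always better.
# Service data and weights are illustrative.

def best_fit(services, weights):
    def norm(name, value, invert):
        vals = [s[name] for s in services]
        lo, hi = min(vals), max(vals)
        if hi == lo:
            return 1.0
        x = (value - lo) / (hi - lo)
        return 1.0 - x if invert else x

    def score(s):
        return (weights["availability"] * norm("availability", s["availability"], False)
                + weights["response_ms"] * norm("response_ms", s["response_ms"], True))

    return max(services, key=score)["name"]

services = [
    {"name": "WMS-A", "availability": 0.99, "response_ms": 800},
    {"name": "WMS-B", "availability": 0.95, "response_ms": 200},
    {"name": "WMS-C", "availability": 0.90, "response_ms": 300},
]
print(best_fit(services, {"availability": 0.4, "response_ms": 0.6}))  # WMS-B
```

With these weights the fastest service wins despite its lower availability; shifting weight toward availability would favour WMS-A, which is exactly the user-constraint trade-off the abstract describes.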

  5. Comparison of Direct Side-to-End and End-to-End Hypoglossal-Facial Anastomosis for Facial Nerve Repair.

    Science.gov (United States)

    Samii, Madjid; Alimohamadi, Maysam; Khouzani, Reza Karimi; Rashid, Masoud Rafizadeh; Gerganov, Venelin

    2015-08-01

The hypoglossal-facial anastomosis (HFA) is the gold standard for facial reanimation in patients with severe facial nerve palsy. The major drawbacks of the classic HFA technique are lingual morbidities due to hypoglossal nerve transection. The side-to-end HFA is a modification of the classic technique with fewer tongue-related morbidities. In this study we compared the outcome of the classic end-to-end and the direct side-to-end HFA surgeries performed at our center with regard to the facial reanimation success rate and tongue-related morbidities. Twenty-six consecutive cases of HFA were enrolled. In 9 of them end-to-end anastomoses were performed, and 17 had direct side-to-end anastomoses. The House-Brackmann (HB) and Pitty and Tator (PT) scales were used to document surgical outcome. The hemiglossal atrophy, swallowing, and hypoglossal nerve function were assessed at follow-up. The original pathology was vestibular schwannoma in 15, meningioma in 4, brain stem glioma in 4, and other pathologies in 3. The mean interval between facial palsy and HFA was 18 months (range: 0-60). The median follow-up period was 20 months. The PT grade at follow-up was worse in patients with a longer interval from facial palsy to HFA (P value: 0.041). The lesion type was the only other factor that affected PT grade (the best results in vestibular schwannoma and the worst in the other pathologies group, P value: 0.038). The recovery period for facial tonicity was longer in patients with radiation therapy before HFA (13.5 vs. 8.5 months) and those with a longer than 2-year interval from facial palsy to HFA (13.5 vs. 8.5 months). Although no significant difference between the side-to-end and the end-to-end groups was seen in terms of facial nerve functional recovery, patients from the side-to-end group had a significantly lower rate of lingual morbidities (tongue hemiatrophy: 100% vs. 5.8%, swallowing difficulty: 55% vs. 11.7%, speech disorder: 33% vs. 0%). With the side-to-end HFA

  6. Risk Factors for Dehiscence of Stapled Functional End-to-End Intestinal Anastomoses in Dogs: 53 Cases (2001-2012).

    Science.gov (United States)

    Snowdon, Kyle A; Smeak, Daniel D; Chiang, Sharon

    2016-01-01

    To identify risk factors for dehiscence in stapled functional end-to-end anastomoses (SFEEA) in dogs. Retrospective case series. Dogs (n = 53) requiring an enterectomy. Medical records from a single institution for all dogs undergoing an enterectomy (2001-2012) were reviewed. Surgeries were included when gastrointestinal (GIA) and thoracoabdominal (TA) stapling equipment was used to create a functional end-to-end anastomosis between segments of small intestine or small and large intestine in dogs. Information regarding preoperative, surgical, and postoperative factors was recorded. Anastomotic dehiscence was noted in 6 of 53 cases (11%), with a mortality rate of 83%. The only preoperative factor significantly associated with dehiscence was the presence of inflammatory bowel disease (IBD). Surgical factors significantly associated with dehiscence included the presence, duration, and number of intraoperative hypotensive periods, and location of anastomosis, with greater odds of dehiscence in anastomoses involving the large intestine. IBD, location of anastomosis, and intraoperative hypotension are risk factors for intestinal anastomotic dehiscence after SFEEA in dogs. Previously suggested risk factors (low serum albumin concentration, preoperative septic peritonitis, and intestinal foreign body) were not confirmed in this study. © Copyright 2015 by The American College of Veterinary Surgeons.

  7. A new technique for end-to-end ureterostomy in the rat, using an indwelling reabsorbable stent.

    Science.gov (United States)

    Carmignani, G; Farina, F P; De Stefani, S; Maffezzini, M

    1983-01-01

    The restoration of the continuity of the urinary tract represents one of the major problems in rat renal transplantation. End-to-end ureterostomy is the most physiologically effective technique; however, it involves noteworthy technical difficulties because of the extremely thin caliber of the ureter in the rat and the high incidence of postoperative hydronephrosis. We describe a new technique for end-to-end ureterostomy in the rat, in which the use of an absorbable ureteral stent is recommended. A 5-0 plain catgut thread is used as a stent. The anastomosis is performed under an operating microscope at ×25-40 magnification with interrupted sutures of 11-0 Vicryl. The use of the indwelling stent facilitates the performance of the anastomosis and yields optimal results. The macroscopic, radiological, and histological controls in a group of rats operated on with this technique showed a very high percentage of success with no complications, a result undoubtedly superior to that obtained with conventional methods.

  8. A multicentre 'end to end' dosimetry audit of motion management (4DCT-defined motion envelope) in radiotherapy.

    Science.gov (United States)

    Palmer, Antony L; Nash, David; Kearton, John R; Jafari, Shakardokht M; Muscat, Sarah

    2017-12-01

    External dosimetry audit is valuable for the assurance of radiotherapy quality. However, motion management has not been rigorously audited, despite its complexity and importance for accuracy. We describe the first end-to-end dosimetry audit for non-SABR (stereotactic ablative body radiotherapy) lung treatments, measuring dose accumulation in a moving target and assessing the adequacy of target dose coverage. A respiratory-motion lung phantom with a custom-designed insert was used. Dose was measured with radiochromic film, employing triple-channel dosimetry and uncertainty reduction. The host's 4DCT scan, outlining and planning techniques were used. Measurements with the phantom static and then moving at treatment delivery separated inherent treatment uncertainties from motion effects. Calculated and measured dose distributions were compared by isodose overlay and gamma analysis, and we introduce the concept of 'dose plane histograms' for clinically relevant interpretation of film dosimetry. Twelve radiotherapy centres and 19 plans were audited: conformal, IMRT (intensity modulated radiotherapy) and VMAT (volumetric modulated radiotherapy). Excellent agreement between planned and static-phantom results was seen (mean gamma pass rate 98.7% at 3%/2 mm). Dose blurring was evident in the moving-phantom measurements (mean gamma pass rate 88.2% at 3%/2 mm). Planning techniques for motion management were adequate to deliver the intended moving-target dose coverage. A novel, clinically relevant, end-to-end dosimetry audit of motion management strategies in radiotherapy is reported. Copyright © 2017 Elsevier B.V. All rights reserved.
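    A gamma analysis like the one used in this audit combines a dose-difference tolerance with a distance-to-agreement search. The sketch below is a deliberately simplified 1D global-gamma calculation in Python (illustrative only, not the audit's actual software); the tolerances mirror the 3%/2 mm criterion quoted in the abstract, and the profile data are invented.

```python
import numpy as np

def gamma_pass_rate(ref, meas, positions, dose_tol=0.03, dist_tol=2.0):
    """Simplified 1D global gamma: dose_tol is a fraction of the maximum
    reference dose, dist_tol is the distance-to-agreement in mm."""
    dmax = ref.max()
    gammas = []
    for x_m, d_m in zip(positions, meas):
        # For each measured point, take the minimum combined
        # dose-difference / distance metric over all reference points.
        g2 = ((d_m - ref) / (dose_tol * dmax)) ** 2 \
             + ((x_m - positions) / dist_tol) ** 2
        gammas.append(np.sqrt(g2.min()))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

positions = np.arange(0.0, 11.0)   # detector positions, mm
reference = np.full(11, 100.0)     # flat planned dose profile
print(gamma_pass_rate(reference, reference, positions))  # identical profiles -> 100.0
```

A real film analysis works on 2D dose planes and typically interpolates the reference distribution finely before the search, but the pass/fail logic is the same.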

  9. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  10. Analysis of the relationship between end-to-end distance and activity of single-chain antibody against colorectal carcinoma.

    Science.gov (United States)

    Zhang, Jianhua; Liu, Shanhong; Shang, Zhigang; Shi, Li; Yun, Jun

    2012-08-22

    We investigated the relationship between the end-to-end distance of VH and VL connected by different peptide linkers and the activity of single-chain antibodies by computer-aided simulation. First, we used (G4S)n (where n = 1-9) as the linker to connect VH and VL, and estimated the 3D structure of the single-chain Fv antibody (scFv) by homology modeling. After the molecular models were evaluated and optimized, a coordinate system was built for every protein and unified into one shared coordinate system, and end-to-end distances were calculated using 3D space coordinates. After expression and purification of scFv-n with (G4S)n for n = 1, 3, 5, 7 or 9, the immunoreactivity of purified ND-1 scFv-n was determined by ELISA. A multi-factorial relationship model was employed to analyze the structural factors affecting scFv activity: r_n = (AB_n - AB_O)^2 + (CD_n - CD_O)^2 + (BC_n - BC_st)^2. The relationship between immunoreactivity and r-values revealed that the fusion protein structure approached the desired state when the r-value = 3. Immunoreactivity declined as the r-value increased, but when the r-value exceeded a certain threshold, it stabilized. We used a linear relationship to analyze the structural factors affecting scFv immunoreactivity.
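    The core quantity here, the end-to-end distance between the VH and VL termini, is simply the Euclidean norm of the vector between two points once both domains sit in one coordinate system. A minimal illustration in Python with NumPy (the coordinates are invented, not taken from the paper):

```python
import numpy as np

# Invented coordinates (angstroms) for the two termini after both domains
# are placed in one shared coordinate system.
vh_c_terminus = np.array([12.4, 3.1, -7.8])
vl_n_terminus = np.array([28.9, -1.6, 2.3])

# The end-to-end distance is the Euclidean norm of the difference vector.
end_to_end = float(np.linalg.norm(vh_c_terminus - vl_n_terminus))
print(f"end-to-end distance: {end_to_end:.1f} A")
```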

  11. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination, with amplify-and-forward relaying at each stage. The subsets are chosen to maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain than the best-known distributed space-time coding strategies. A distributed compress-and-forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.
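    The selection step of EEAS can be pictured as a small combinatorial search: with low-rate feedback, the destination evaluates antenna subsets and picks the one whose weaker hop supports the highest rate. The following Python sketch uses invented SNR numbers and a crude stand-in for the mutual-information expression; it is not the paper's channel model.

```python
from itertools import combinations
from math import log2

# Invented per-antenna SNRs (linear scale) at a single relay stage.
hop1_snr = [4.0, 9.0, 1.5, 6.5]   # source -> relay antenna i
hop2_snr = [8.0, 2.0, 5.0, 7.0]   # relay antenna i -> destination

def end_to_end_rate(subset):
    # Crude stand-in for end-to-end mutual information: pool the subset's
    # SNRs on each hop and let the weaker hop limit the rate.
    c1 = log2(1 + sum(hop1_snr[i] for i in subset))
    c2 = log2(1 + sum(hop2_snr[i] for i in subset))
    return min(c1, c2)

def select_antennas(k):
    # Exhaustive search over all k-antenna subsets, which the low-rate
    # feedback link would let the destination coordinate.
    return max(combinations(range(len(hop1_snr)), k), key=end_to_end_rate)

best = select_antennas(2)
print(best, round(end_to_end_rate(best), 3))
```

Note how the best pair balances the two hops rather than maximizing either hop alone, which is the intuition behind selecting for the end-to-end metric.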

  12. Weighted-DESYNC and Its Application to End-to-End Throughput Fairness in Wireless Multihop Network

    Directory of Open Access Journals (Sweden)

    Ui-Seong Yu

    2017-01-01

    The end-to-end throughput of a routing path in a wireless multihop network is restricted by a bottleneck node that has the smallest bandwidth among the nodes on the routing path. In this study, we propose a method for resolving the bottleneck-node problem in multihop networks based on the multihop DESYNC (MH-DESYNC) algorithm, a bioinspired resource allocation method developed for multihop environments that enables fair resource allocation among nearby (up to two-hop) neighbors. Based on MH-DESYNC, we newly propose weighted-DESYNC (W-DESYNC) as a tool to artificially control the amount of resources allocated to a specific user and thus to achieve throughput fairness over a routing path. The proposed W-DESYNC employs the weight factor of a link to determine the amount of bandwidth allocated to a node. By letting the weight factor be the link quality of a routing path and making it the same across the routing path via the Cucker-Smale flocking model, we can obtain throughput fairness over the routing path. The simulation results show that the proposed algorithm achieves throughput fairness over a routing path and can increase total end-to-end throughput in wireless multihop networks.
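    The idea of weighting a node's share of channel resources by its link quality can be illustrated with a toy proportional allocator. The node names and weights below are hypothetical, and W-DESYNC itself adjusts firing phases in a desynchronization cycle rather than slot counts; this only shows the proportionality principle.

```python
def weighted_shares(weights, frame_slots=100):
    """Split a frame's slots among nodes in proportion to their weights,
    mimicking how a link-quality weight factor sizes each allocation."""
    total = sum(weights.values())
    return {node: round(frame_slots * w / total) for node, w in weights.items()}

# Hypothetical link-quality weights along a 3-hop path; doubling the
# bottleneck link's weight doubles its share of the frame.
shares = weighted_shares({"A": 1.0, "B": 2.0, "C": 1.0})
print(shares)
```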

  13. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    International Nuclear Information System (INIS)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R.

    2014-01-01

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  14. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  15. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods in predictable structures … on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene … that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule is connecting the two rods …

  16. Increasing gas producer profitability with virtual well visibility via an end-to-end wireless Internet gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M. [Northrock Resources Ltd., Calgary, AB (Canada); Benterud, K. [Zed.i solutions, Calgary, AB (Canada)

    2003-07-01

    This PowerPoint presentation describes how Northrock Resources Ltd. increased profitability using Smart-Alek{sup TM} while avoiding high implementation costs. Smart-Alek is a new type of fully integrated end-to-end electronic gas flow measurement (GFM) system based on Field Intelligence Network and End User Interface (FINE). Smart-Alek can analyze gas production through public wireless communications and a web-browser delivery system. The system has enabled Northrock to increase gas volumes through more accurate measurement and reduced downtime. In addition, operating costs have decreased because the frequency of well visits has been reduced and the administrative procedures of data collection are more efficient. The real-time well visibility of the tool has proven to be very effective in optimizing business profitability. 7 figs.

  17. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantifying its advantages for different operating scenarios and conditions. © 2011 IEEE.
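    The interference-constrained setup can be sketched with a toy Monte Carlo: cap each secondary transmitter's power so the interference received at the primary stays below a budget, then estimate the outage of the weaker DF hop. All channel assumptions below (Rayleigh fading, unit-mean gains) are illustrative, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

def outage_probability(q_max=1.0, rate=1.0, n_trials=100_000):
    """Monte Carlo outage estimate for a toy two-hop decode-and-forward
    link whose transmit powers are capped so that the interference at the
    primary receiver stays below q_max (Rayleigh fading everywhere)."""
    g_sp = rng.exponential(1.0, n_trials)   # source -> primary receiver gain
    g_rp = rng.exponential(1.0, n_trials)   # relay  -> primary receiver gain
    p_s = q_max / g_sp                      # allowed source power
    p_r = q_max / g_rp                      # allowed relay power
    h1 = rng.exponential(1.0, n_trials)     # source -> relay gain
    h2 = rng.exponential(1.0, n_trials)     # relay  -> destination gain
    c1 = np.log2(1 + p_s * h1)
    c2 = np.log2(1 + p_r * h2)
    # The DF end-to-end rate is limited by the weaker hop.
    return float(np.mean(np.minimum(c1, c2) < rate))

p_out = outage_probability()
print(f"outage probability ~ {p_out:.3f}")
```

Tightening the interference budget `q_max` or raising the target `rate` increases the estimated outage, which is the qualitative trade-off the paper quantifies analytically.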

  18. SU-E-T-282: Dose Measurements with An End-To-End Audit Phantom for Stereotactic Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R; Artschan, R [Calvary Mater Newcastle, Newcastle, NSW (Australia); Thwaites, D [University of Sydney, Sydney, NSW (Australia); Lehmann, J [Calvary Mater Newcastle, Newcastle, NSW (Australia); University of Sydney, Sydney, NSW (Australia)

    2015-06-15

    Purpose: Report on dose measurements as part of an end-to-end test for stereotactic radiotherapy, using a new audit tool, which allows audits to be performed efficiently either by an onsite team or as a postal audit. Methods: Film measurements have been performed with a new Stereotactic Cube Phantom. The phantom has been designed to perform Winston Lutz type position verification measurements and dose measurements in one setup. It comprises a plastic cube with a high density ball in its centre (used for MV imaging with film or EPID) and low density markers in the periphery (used for Cone Beam Computed Tomography, CBCT imaging). It also features strategically placed gold markers near the posterior and right surfaces, which can be used to calculate phantom rotations on MV images. Slit-like openings allow insertion of film or other detectors.The phantom was scanned and small field treatment plans were created. The fields do not traverse any inhomogeneities of the phantom on their paths to the measurement location. The phantom was setup at the delivery system using CBCT imaging. The calculated treatment fields were delivered, each with a piece of radiochromic film (EBT3) placed in the anterior film holder of the phantom. MU had been selected in planning to achieve similar exposures on all films. Calibration films were exposed in solid water for dose levels around the expected doses. Films were scanned and analysed following established procedures. Results: Setup of the cube showed excellent suitability for CBCT 3D alignment. MV imaging with EPID allowed for clear identification of all markers. Film based dose measurements showed good agreement for MLC created fields down to 0.5 mm × 0.5 mm. Conclusion: An end-to-end audit phantom for stereotactic radiotherapy has been developed and tested.

  19. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and the dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be examined to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds and SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis in real end-to-end Internet performance measurements. We also provide methods and/or guidance for how to set the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our
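    The paper's key point, adaptive detection rather than user-configured static thresholds, can be illustrated with a trailing-window z-score detector. This is a far simpler scheme than the algorithms the paper evaluates, and the round-trip-time series below is synthetic.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(series, window=20, z_cut=4.0):
    """Flag points deviating from a trailing window by more than z_cut
    standard deviations; the threshold adapts to each path's own history
    instead of relying on a user-configured static limit."""
    recent = deque(maxlen=window)
    flags = []
    for i, x in enumerate(series):
        if len(recent) == window:
            m, s = mean(recent), stdev(recent)
            if s > 0 and abs(x - m) / s > z_cut:
                flags.append(i)
        recent.append(x)
    return flags

# Synthetic round-trip-time series (ms) with a persistent shift at index 60.
rtt = [50.0 + (i % 5) * 0.5 for i in range(60)] + [80.0] * 20
flags = detect_anomalies(rtt)
print(flags)
```

Because the anomalous points enter the trailing window themselves, the alarm self-quenches after the shift; distinguishing a persistent level change from a transient spike is exactly where the paper's more elaborate seasonal-aware algorithms earn their keep.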

  20. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on a pilot's vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume against which systems can be designed and measured. Extended research efforts have been conducted to understand and quantify the system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear of intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot
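    A building block of any DAA evaluation is predicting the closest point of approach (CPA) between ownship and an intruder. The sketch below computes a 2D horizontal CPA from straight-line extrapolation; the threshold constant is a placeholder, not the MOPS DAA Well Clear definition, which also involves time-based (tau) terms and a vertical component.

```python
import math

def horizontal_cpa(ox, oy, ovx, ovy, ix, iy, ivx, ivy):
    """Time (s) and distance (m) of closest point of approach between
    ownship and intruder, given 2D positions (m) and velocities (m/s)."""
    rx, ry = ix - ox, iy - oy          # relative position
    vx, vy = ivx - ovx, ivy - ovy      # relative velocity
    v2 = vx * vx + vy * vy
    # Time minimizing |r + v t|, clamped to the future.
    t = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dx, dy = rx + vx * t, ry + vy * t
    return t, math.hypot(dx, dy)

# Head-on geometry with invented numbers: intruder 4000 m ahead, closing
# at 100 m/s, offset laterally by 200 m.
t_cpa, d_cpa = horizontal_cpa(0, 0, 50, 0, 4000, 200, -50, 0)
HORIZONTAL_THRESHOLD_M = 2200  # placeholder, not the MOPS DAAWC definition
print(t_cpa, d_cpa, d_cpa < HORIZONTAL_THRESHOLD_M)
```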

  1. Double 90 Degrees Counterrotated End-to-End-Anastomosis: An Experimental Study of an Intestinal Anastomosis Technique.

    Science.gov (United States)

    Holzner, Philipp; Kulemann, Birte; Seifert, Gabriel; Glatz, Torben; Chikhladze, Sophia; Höppner, Jens; Hopt, Ulrich; Timme, Sylvia; Bronsert, Peter; Sick, Olivia; Zhou, Cheng; Marjanovic, Goran

    2015-06-01

    The aim of this article is to investigate a new anastomotic technique compared with standardized intestinal anastomotic procedures. A total of 32 male Wistar rats were randomized to three groups. In the Experimental Group (n = 10), the new double 90 degrees inversely rotated anastomosis was used; in the End Group (n = 10), a single-layer end-to-end anastomosis; and in the Side Group (n = 12), a single-layer side-to-side anastomosis. All anastomoses were done using interrupted sutures. On postoperative day 4, rats were relaparotomized. Bursting pressure, hydroxyproline concentration, a semiquantitative adhesion score, and two histological anastomotic healing scores (mucosal healing according to Chiu and overall anastomotic healing according to Verhofstad) were collected. Most data are presented as median (range); p < 0.05 was considered significant. Anastomotic insufficiency occurred only in one rat of the Side Group. Median bursting pressure in the Experimental Group was 105 mm Hg (range = 72-161 mm Hg); it was significantly higher in the End Group (164 mm Hg; range = 99-210 mm Hg; p = 0.021) and lower in the Side Group by trend (81 mm Hg; range = 59-122 mm Hg; p = 0.093). Hydroxyproline concentration did not differ significantly between the groups. The adhesion score was 2.5 (range = 1-3) in the Experimental Group and 2 (range = 1-2) in the End Group, but there were significantly more adhesions in the Side Group (range = 3-4); p = 0.020 versus Experimental Group, p < 0.001 versus End Group. The Chiu score showed the worst mucosal healing in the Experimental Group. The overall Verhofstad score was significantly worse in the Experimental Group (mean = 2.032; standard deviation [SD] = 0.842) compared with the Side Group (mean = 1.729; SD = 0.682; p = 0.031) and the End Group (mean = 1.571; SD = 0.612; p = 0.002). The new anastomotic technique is feasible and did not show any relevant complication. Even though it was superior to the side-to-side anastomosis by trend with

  2. A novel end-to-end classifier using domain transferred deep convolutional neural networks for biomedical images.

    Science.gov (United States)

    Pang, Shuchao; Yu, Zhezhou; Orgun, Mehmet A

    2017-03-01

    Highly accurate classification of biomedical images is an essential task in the clinical diagnosis of numerous medical diseases identified from those images. Traditional image classification methods combined with hand-crafted image feature descriptors and various classifiers are not able to effectively improve the accuracy rate and meet the high requirements of classification of biomedical images. The same also holds true for artificial neural network models directly trained with limited biomedical images used as training data, or directly used as a black box to extract deep features based on another distant dataset. In this study, we propose a highly reliable and accurate end-to-end classifier for all kinds of biomedical images via deep learning and transfer learning. We first apply a domain-transferred deep convolutional neural network to build a deep model, and then develop an overall deep learning architecture based on the raw pixels of the original biomedical images using supervised training. In our model, we do not need to manually design the feature space, seek an effective feature-vector classifier, or segment specific detection objects and image patches, which are the main technological difficulties in the adoption of traditional image classification methods. Moreover, we do not need to be concerned with whether there are large training sets of annotated biomedical images, affordable parallel computing resources featuring GPUs, or long waits to train a perfect deep model, which are the main obstacles to training deep neural networks for biomedical image classification as observed in recent works. With the utilization of a simple data augmentation method and fast convergence speed, our algorithm can achieve the best accuracy rate and outstanding classification ability for biomedical images. We have evaluated our classifier on several well-known public biomedical datasets and compared it with several state-of-the-art approaches. We propose a robust
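    The essence of the domain-transfer approach, reusing a frozen feature extractor trained elsewhere and training only a small classifier head, can be mimicked in a few lines. In the sketch below a fixed random projection stands in for the pretrained CNN and logistic regression is the trained head; the data are synthetic, so this is a schematic of transfer learning, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a pretrained backbone: a *frozen* layer whose weights are
# never updated. In the paper's setting this would be a deep CNN
# transferred from a large natural-image dataset.
W_frozen = rng.normal(size=(64, 16))

def features(x):
    return np.maximum(x @ W_frozen, 0.0)   # frozen projection + ReLU

# Tiny synthetic two-class "image" dataset (64 raw pixels per sample).
x0 = rng.normal(0.0, 1.0, (100, 64))
x1 = rng.normal(1.0, 1.0, (100, 64))
X = features(np.vstack([x0, x1]))
y = np.array([0] * 100 + [1] * 100)

# Only the small classification head is trained (logistic regression by
# plain gradient descent) -- the core idea of transfer learning.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    z = np.clip(X @ w + b, -30.0, 30.0)    # clip to avoid exp overflow
    p = 1.0 / (1.0 + np.exp(-z))
    w -= 0.1 * (X.T @ (p - y) / len(y))
    b -= 0.1 * float(np.mean(p - y))

acc = float(np.mean(((X @ w + b) > 0) == (y == 1)))
print(f"training accuracy: {acc:.2f}")
```

Because only the head's few parameters are fit, the scheme trains quickly and tolerates small labeled sets, which is why transfer learning suits biomedical imaging where annotations are scarce.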

  3. SU-F-J-177: A Novel Image Analysis Technique (center Pixel Method) to Quantify End-To-End Tests

    Energy Technology Data Exchange (ETDEWEB)

    Wen, N; Chetty, I [Henry Ford Health System, Detroit, MI (United States); Snyder, K [Henry Ford Hospital System, Detroit, MI (United States); Scheib, S [Varian Medical System, Barton (Switzerland); Qin, Y; Li, H [Henry Ford Health System, Detroit, Michigan (United States)

    2016-06-15

    Purpose: To implement a novel image analysis technique, “center pixel method”, to quantify the end-to-end test accuracy of a frameless, image guided stereotactic radiosurgery system. Methods: The localization accuracy was determined by delivering radiation to an end-to-end prototype phantom. The phantom was scanned with 0.8 mm slice thickness. The treatment isocenter was placed at the center of the phantom. In the treatment room, CBCT images of the phantom (kVp=77, mAs=1022, slice thickness 1 mm) were acquired and registered to the reference CT images. 6D couch corrections were applied based on the registration results. Electronic Portal Imaging Device (EPID)-based Winston Lutz (WL) tests were performed to quantify the targeting accuracy of the system at 15 combinations of gantry, collimator and couch positions. The images were analyzed using two different methods. a) The classic method: the deviation was calculated by measuring the radial distance between the center of the central BB and the full width at half maximum of the radiation field. b) The center pixel method: since the imager projection offset from the treatment isocenter was known from the IsoCal calibration, the deviation was determined between the center of the BB and the central pixel of the imager panel. Results: Using the automatic registration method to localize the phantom and the classic method of measuring the deviation of the BB center, the mean and standard deviation of the radial distance were 0.44 ± 0.25, 0.47 ± 0.26, and 0.43 ± 0.13 mm for the jaw-, MLC- and cone-defined field sizes respectively. When the center pixel method was used, the mean and standard deviation were 0.32 ± 0.18, 0.32 ± 0.17, and 0.32 ± 0.19 mm respectively. Conclusion: Our results demonstrated that the center pixel method accurately analyzes the WL images to evaluate the targeting accuracy of the radiosurgery system. The work was supported by a Research Scholar Grant, RSG-15-137-01-CCE from the American
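    The center pixel method reduces to a point-to-point distance on the imager: compare the detected BB centroid with the panel's central pixel shifted by the known projection offset. A minimal Python illustration follows; the pixel pitch, panel size, and offset values are invented, not the system's actual calibration.

```python
import math

PIXEL_PITCH_MM = 0.336           # invented pixel size at the imager plane
PANEL_CENTER = (512, 512)        # central pixel of a hypothetical 1024x1024 panel
PROJECTION_OFFSET = (1.5, -2.0)  # known imager offset from isocenter, in pixels

def radial_deviation(bb_center_px):
    """Center pixel method: distance from the detected BB centroid to the
    panel's central pixel, shifted by the known projection offset."""
    dx = bb_center_px[0] - (PANEL_CENTER[0] + PROJECTION_OFFSET[0])
    dy = bb_center_px[1] - (PANEL_CENTER[1] + PROJECTION_OFFSET[1])
    return math.hypot(dx, dy) * PIXEL_PITCH_MM

print(round(radial_deviation((514.5, 509.2)), 3))
```

Unlike the classic method, no edge detection of the radiation field is needed, which is one plausible reason the reported deviations are smaller and less noisy.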

  4. End-to-end probability for an interacting center vortex world line in Yang-Mills theory

    International Nuclear Information System (INIS)

    Teixeira, Bruno F.I.; Lemos, Andre L.L. de; Oxman, Luis E.

    2011-01-01

    Full text: The understanding of quark confinement is a very important open problem in Yang-Mills theory. In this regard, nontrivial topological defects are expected to play a relevant role in achieving a solution. Here we are interested in how to deal with these structures, relying on the Cho-Faddeev-Niemi decomposition and the possibility it offers to describe defects in terms of a local color frame. In particular, the path integral for a single center vortex is a fundamental object to handle the ensemble integration. As is well known, in three dimensions center vortices are string-like and the associated physics is closely related to that of polymers. Using recent techniques developed in the latter context, we present in this work a detailed derivation of the equation for the end-to-end probability for a center vortex world line, including the effects of interactions. Its solution can be associated with a Green function that depends on the position and orientation at the boundaries, where monopole-like instantons are placed. In the limit of semiflexible polymers, an expansion keeping only the lower angular momenta for the final orientation leads to a reduced Green function for a complex vortex field minimally coupled to the dual Yang-Mills fields. This constitutes a key ingredient in proposing an effective model for correlated monopoles, center vortices and the dual fields. (author)

  5. Minimizing Barriers in Learning for On-Call Radiology Residents-End-to-End Web-Based Resident Feedback System.

    Science.gov (United States)

    Choi, Hailey H; Clark, Jennifer; Jay, Ann K; Filice, Ross W

    2018-02-01

    Feedback is an essential part of medical training, where trainees are provided with information regarding their performance and further directions for improvement. In diagnostic radiology, feedback entails a detailed review of the differences between the residents' preliminary interpretation and the attendings' final interpretation of imaging studies. While the on-call experience of independently interpreting complex cases is important to resident education, the more traditional synchronous "read-out" or joint review is impossible due to multiple constraints. Without an efficient method to compare reports, grade discrepancies, convey salient teaching points, and view images, valuable lessons in image interpretation and report construction are lost. We developed a streamlined web-based system, including report comparison and image viewing, to minimize barriers in asynchronous communication between attending radiologists and on-call residents. Our system provides real-time, end-to-end delivery of case-specific and user-specific feedback in a streamlined, easy-to-view format. We assessed quality improvement subjectively through surveys and objectively through participation metrics. Our web-based feedback system improved user satisfaction for both attending and resident radiologists, and increased attending participation, particularly with regard to cases where substantive discrepancies were identified.

  6. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    International Nuclear Information System (INIS)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen; Jaekel, Oliver

    2015-01-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As a basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  7. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    Science.gov (United States)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

    In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to directions in supervised mode. The images of the data sets are collected in a wide variety of weather and lighting conditions. Besides, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments: line tracking and obstacle avoidance. The line tracking experiment is conducted to track a desired path composed of straight and curved lines. The goal of the obstacle avoidance experiment is to avoid obstacles indoors. Finally, we obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and avoid obstacles in the room accurately. The results confirm the effectiveness of the algorithm and of our improvements in the network structure and training parameters.
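    The noise-based augmentation described above can be sketched with the standard library alone; image size, noise levels, and seeds below are arbitrary choices for illustration:

```python
import random

def add_gaussian_noise(img, sigma=10.0, seed=0):
    """Additive Gaussian noise per pixel, clipped to the 8-bit range."""
    rng = random.Random(seed)
    return [[min(255, max(0, int(round(p + rng.gauss(0.0, sigma)))))
             for p in row] for row in img]

def add_salt_and_pepper(img, amount=0.05, seed=1):
    """Flip a fraction of pixels to pure black (pepper) or white (salt)."""
    rng = random.Random(seed)
    out = []
    for row in img:
        new_row = []
        for p in row:
            r = rng.random()
            if r < amount / 2:
                new_row.append(0)          # pepper
            elif r > 1 - amount / 2:
                new_row.append(255)        # salt
            else:
                new_row.append(p)
        out.append(new_row)
    return out

img = [[128] * 32 for _ in range(32)]      # flat gray test image
noisy_g = add_gaussian_noise(img)
noisy_sp = add_salt_and_pepper(img)
```

    Each training image would typically yield several noisy variants, enlarging the data set without new collection effort.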

  8. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

    Full Text Available Convolutional Neural Networks (CNNs) have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed comparable performance to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, considering that the sample-level model requires much longer training time, we progressively downsample the input signals and examine how this affects performance. Second, we extend the model using a multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
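    The core idea of sample-level filters, very small (width-3, stride-3) 1D convolutions applied directly to the raw waveform and stacked until the time axis collapses, can be illustrated with a plain-Python sketch; the filter values and input length are arbitrary, and a real model learns many filters per layer:

```python
def conv1d(x, kernel, stride):
    """Valid 1D convolution with a ReLU; x and kernel are flat lists."""
    w = len(kernel)
    out = []
    for i in range(0, len(x) - w + 1, stride):
        s = sum(k * v for k, v in zip(kernel, x[i:i + w]))
        out.append(max(0.0, s))            # ReLU non-linearity
    return out

# A toy raw "waveform" of 3^6 = 729 samples, repeatedly reduced by
# width-3, stride-3 sample-level filters (the SampleCNN building block).
x = [float(i % 7) for i in range(729)]
kernel = [0.2, 0.5, 0.3]
depth = 0
while len(x) > 1:
    x = conv1d(x, kernel, stride=3)        # 729 -> 243 -> ... -> 1
    depth += 1
```

    Six such layers collapse 729 samples to a single activation, which is why the architecture scales its depth with the input length.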

  9. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As a basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  10. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

    Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated whether it also improved the rate and speed of physician intervention to CLR and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of the CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and when manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the acknowledgement rate (100% vs 84.2%). In the automation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
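    The reported pathway, SMS with auto-escalation backed by a manual call centre, is essentially a small piece of control flow. A sketch; the function names, attempt count, and simulated physician below are hypothetical, not from the study:

```python
from dataclasses import dataclass, field

@dataclass
class CriticalResult:
    result_id: str
    acknowledged: bool = False
    log: list = field(default_factory=list)

def report_clr(result, send_sms, call_physician, max_sms_attempts=2):
    """Automated pathway: SMS with auto-escalation, then a manual
    telephone back-up if the result is still unacknowledged."""
    for attempt in range(max_sms_attempts):
        result.log.append(f"sms attempt {attempt + 1}")
        if send_sms(result):               # True = physician acknowledged
            result.acknowledged = True
            return "sms"
    result.log.append("manual call")
    result.acknowledged = call_physician(result)
    return "call"

# Simulated physician who misses the SMS texts but answers the phone.
r = CriticalResult("K+ 6.8 mmol/L")
channel = report_clr(r, send_sms=lambda _: False,
                     call_physician=lambda _: True)
```

    The end-to-end property the authors stress is visible here: the manual call is a branch of the same pathway, not a separate process, so acknowledgement is tracked regardless of channel.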

  11. Delayed primary end-to-end anastomosis for traumatic long segment urethral stricture and its short-term outcomes

    Directory of Open Access Journals (Sweden)

    Rajarshi Kumar

    2017-01-01

    Full Text Available Background: The purpose of this study is to evaluate the aetiology of posterior urethral stricture in children and analyse the results after delayed primary repair with extensive distal urethral mobilisation. Materials and Methods: This was a retrospective study carried out in a tertiary care centre from January 2009 to December 2013. Results: Eight children with median age 7.5 years (range 4–11 years) underwent delayed anastomotic urethroplasty: six through a perineal and two through a combined perineal and transpubic approach. All eight children had a long-segment (>2 cm) stricture: three posterior and five anterior urethral strictures. Over a mean follow-up period of 33 months (range 24–48 months), all were passing urine with good flow and stream. Conclusion: End-to-end anastomosis in post-traumatic long-segment posterior urethral stricture between the prostatic and penile urethra in children is possible by a perineal or combined perineal and transpubic approach, with good results and without any urethral replacement.

  12. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    Science.gov (United States)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  13. Increasing gas producer profitability with virtual well visibility via an end-to-end, wireless Internet gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M.; Coleman, K.; Beck, R.; Lyon, R.; Potts, R. [Northrock Resources Ltd., Calgary, AB (Canada); Benterud, K. [Zed.i solutions, Calgary, AB (Canada)

    2003-07-01

    Most gas producing companies still use 100-year-old technology to measure gas volumes because of the prohibitive costs of implementing corporate-wide electronic information systems to replace circular mechanical chart technology. This paper describes how Northrock Resources Ltd. increased profitability using Smart-Alek™ while avoiding high implementation costs. Smart-Alek is a new type of fully integrated end-to-end electronic gas flow measurement (GFM) system based on Field Intelligence Network and End User Interface (FINE). Smart-Alek can analyze gas production through public wireless communications and a web-browser delivery system. The system has enabled Northrock to increase gas volumes with more accurate measurement and reduced downtime. In addition, operating costs were also decreased because the frequency of well visits was reduced and the administrative procedures of data collection were more efficient. The real-time well visibility of the tool has proven to be very effective in optimizing business profitability. 9 refs., 1 tab., 9 figs.

  14. End-to-end gene fusions and their impact on the production of multifunctional biomass degrading enzymes

    International Nuclear Information System (INIS)

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2012-01-01

    Highlights: ► Multifunctional enzymes offer an interesting approach for biomass degradation. ► Size and conformation of the separate constructs play a role in the effectiveness of chimeras. ► A connecting linker allows for maximal flexibility and increased thermostability. ► Genes with functional similarities are the best choice for fusion candidates. -- Abstract: The depletion of fossil fuels, coupled with rising prices, has made the search for alternative energy resources more pressing. One of the topics rapidly gaining interest is the utilization of lignocellulose, the main component of plants. Its primary constituents, cellulose and hemicellulose, can be degraded by a series of enzymes present in microorganisms into simple sugars, later used for bioethanol production. Thermophilic bacteria have proven to be an interesting source of enzymes required for hydrolysis since they can withstand the high and denaturing temperatures usually required for processes involving biomass degradation. However, the cost associated with the whole enzymatic process is staggering. A solution for cost-effective and highly active production is the construction of multifunctional enzyme complexes harboring the functions of more than one enzyme needed for the hydrolysis process. There are various strategies for the degradation of complex biomass, ranging from the regulation of the enzymes involved, to cellulosomes, to proteins harboring more than one enzymatic activity. In this review, the construction of multifunctional biomass-degrading enzymes through end-to-end gene fusions, and its impact on production and activity through the choice of enzymes and linkers, is assessed.
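    At the sequence level, an end-to-end gene fusion amounts to concatenating two open reading frames with an in-frame linker after removing the upstream stop codon. A toy sketch; the sequences and linker are made up, and real constructs additionally need codon-usage and restriction-site checks:

```python
def fuse_genes(gene_a, gene_b, linker="GGTGGAGGTGGATCT"):
    """End-to-end fusion: gene A (stop codon removed), a flexible
    glycine/serine-style linker, then gene B in the same reading frame."""
    STOP = {"TAA", "TAG", "TGA"}
    if gene_a[-3:] in STOP:
        gene_a = gene_a[:-3]               # drop A's stop codon
    fused = gene_a + linker + gene_b
    assert len(fused) % 3 == 0, "fusion out of frame"
    return fused

# Toy ORFs standing in for, e.g., a cellulase and a xylanase gene.
construct = fuse_genes("ATGGCTGCATAA", "ATGTCTTCTTGA")
```

    The linker length is one of the tunable parameters the review discusses: too short and the fused domains sterically hinder each other, longer flexible linkers preserve each activity.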

  15. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    Science.gov (United States)

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As a basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. Copyright © 2015. Published by Elsevier GmbH.

  16. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    Full Text Available The aim of this work is to analyze the available possibilities for improving secure messaging with end-to-end connections under conditions of external violator actions and a distrusted service provider. We made a comparative analysis of the cryptographic security mechanisms of two widely used messengers: Telegram and WhatsApp. It was found that Telegram is based on the MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific features of the messengers' implementations associated with random number generation on the most popular Android mobile platform. It was shown that Signal has better security properties. It is used in several other popular messengers such as TextSecure, RedPhone, Google Allo, Facebook Messenger, and Signal, along with WhatsApp. A number of possible attacks on both messengers were analyzed in detail. In particular, we demonstrate that metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  17. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

    Under spectrum-sharing constraints, we consider a secondary link exploiting cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We derive the optimal boundary points in closed form for choosing the AMC transmission modes, taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, without any cost in the transmitter/receiver design or in the end-to-end delay, the scheme with aggressive AMC outperforms that with conventional AMC. The main reason is that, with aggressive AMC, the different transmission modes utilized in the initial packet transmission and the following retransmissions match the time-varying channel conditions better than the basic pattern. © 2012 IEEE.
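    Once the closed-form boundary points are known, AMC mode selection reduces to a threshold lookup on the instantaneous SNR. A sketch; the thresholds and mode names below are illustrative placeholders, not the paper's derived boundaries:

```python
def select_amc_mode(snr_db, boundaries):
    """Pick the highest-rate AMC mode whose SNR boundary point is met.
    boundaries: ascending list of (threshold_dB, mode_name) pairs."""
    chosen = "no-transmission"             # below the lowest boundary
    for threshold, mode in boundaries:
        if snr_db >= threshold:
            chosen = mode
    return chosen

# Illustrative boundary points for three modes.
modes = [(5.0, "BPSK-1/2"), (10.0, "QPSK-1/2"), (15.0, "16QAM-3/4")]
picked = select_amc_mode(12.3, modes)
```

    In the aggressive variant described above, retransmissions would consult a different (more optimistic) boundary table than the initial transmission, which is what lets the scheme track the time-varying channel more closely.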

  18. Chinese Medical Question Answer Matching Using End-to-End Character-Level Multi-Scale CNNs

    Directory of Open Access Journals (Sweden)

    Sheng Zhang

    2017-07-01

    Full Text Available This paper focuses mainly on the problem of Chinese medical question answer matching, which is arguably more challenging than open-domain question answer matching in English due to the combination of its domain-restricted nature and the language-specific features of Chinese. We present an end-to-end character-level multi-scale convolutional neural framework in which character embeddings instead of word embeddings are used to avoid Chinese word segmentation in text preprocessing, and multi-scale convolutional neural networks (CNNs) are then introduced to extract contextual information from either question or answer sentences over different scales. The proposed framework can be trained with minimal human supervision and does not require any handcrafted features, rule-based patterns, or external resources. To validate our framework, we create a new text corpus, named cMedQA, by harvesting questions and answers from an online Chinese health and wellness community. The experimental results on the cMedQA dataset show that our framework significantly outperforms several strong baselines, and achieves an improvement in top-1 accuracy of up to 19%.
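    Character-level input avoids word segmentation entirely: each distinct character gets an integer id and every sentence becomes a fixed-length id sequence fed to the embedding layer. A minimal sketch; the vocabulary-building scheme, padding id, and maximum length are illustrative:

```python
def build_char_vocab(sentences):
    """Map each distinct character to an integer id; id 0 is padding."""
    vocab = {"<pad>": 0}
    for s in sentences:
        for ch in s:
            vocab.setdefault(ch, len(vocab))
    return vocab

def encode(sentence, vocab, max_len=10):
    """Character-level encoding: no word segmentation step needed."""
    ids = [vocab.get(ch, 0) for ch in sentence[:max_len]]
    return ids + [0] * (max_len - len(ids))   # right-pad to max_len

corpus = ["胃疼怎么办", "头疼吃什么药"]
vocab = build_char_vocab(corpus)
row = encode("胃疼", vocab, max_len=4)
```

    Because the unit is the character, out-of-vocabulary *words* cannot occur; only unseen characters fall back to the padding id here (a real system would use a dedicated unknown-character id).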

  19. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  20. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  1. West Coast fish, mammal, bird life history and abundance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  2. Poly(ethyl glyoxylate)-Poly(ethylene oxide) Nanoparticles: Stimuli-Responsive Drug Release via End-to-End Polyglyoxylate Depolymerization.

    Science.gov (United States)

    Fan, Bo; Gillies, Elizabeth R

    2017-08-07

    The ability to disrupt polymer assemblies in response to specific stimuli provides the potential to release drugs selectively at certain sites or conditions in vivo. However, most stimuli-responsive delivery systems require many stimuli-initiated events to release drugs. "Self-immolative polymers" offer the potential to provide amplified responses to stimuli as they undergo complete end-to-end depolymerization following the cleavage of a single end-cap. Herein, linker end-caps were developed to conjugate self-immolative poly(ethyl glyoxylate) (PEtG) with poly(ethylene oxide) (PEO) to form amphiphilic block copolymers. These copolymers were self-assembled to form nanoparticles in aqueous solution. Cleavage of the linker end-caps was triggered by a thiol reducing agent, UV light, H2O2, and combinations of these stimuli, resulting in nanoparticle disintegration. Low stimuli concentrations were effective in rapidly disrupting the nanoparticles. Nile red, doxorubicin, and curcumin were encapsulated into the nanoparticles and were selectively released upon application of the appropriate stimulus. The ability to tune the stimuli-responsiveness simply by changing the linker end-cap makes this new platform highly attractive for applications in drug delivery.

  3. Adaptation and validation of a commercial head phantom for cranial radiosurgery dosimetry end-to-end audit.

    Science.gov (United States)

    Dimitriadis, Alexis; Palmer, Antony L; Thomas, Russell A S; Nisbet, Andrew; Clark, Catharine H

    2017-06-01

    To adapt and validate an anthropomorphic head phantom for use in a cranial radiosurgery audit. Two bespoke inserts were produced for the phantom: one providing the target and organ at risk for delineation and the other for performing dose measurements. The inserts were tested to assess their positional accuracy. A basic treatment plan dose verification with an ionization chamber was performed to establish a baseline accuracy for the phantom and beam model. The phantom and inserts were then used to perform dose verification measurements of a radiosurgery plan. The dose was measured with alanine pellets, EBT extended dose film and a plastic scintillation detector (PSD). Both inserts showed reproducible positioning (±0.5 mm) and good positional agreement between them (±0.6 mm). The basic treatment plan measurements showed agreement with the treatment planning system (TPS) within 0.5%. Repeated film measurements showed consistent gamma passing rates with good agreement with the TPS. For 2%-2 mm global gamma, the mean passing rate was 96.7% and the variation in passing rates did not exceed 2.1%. The alanine pellets and PSD showed good agreement with the TPS (-0.1% and 0.3% dose difference in the target) and good agreement with each other (within 1%). The adaptations to the phantom showed acceptable accuracies. The presence of alanine and the PSD does not affect film measurements significantly, enabling simultaneous measurements by all three detectors. Advances in knowledge: A novel method for a thorough end-to-end test of radiosurgery, with the capability to incorporate all steps of the clinical pathway in a time-efficient and reproducible manner, suitable for a national audit.

  4. An end-to-end examination of geometric accuracy of IGRT using a new digital accelerator equipped with onboard imaging system.

    Science.gov (United States)

    Wang, Lei; Kielar, Kayla N; Mok, Ed; Hsu, Annie; Dieterich, Sonja; Xing, Lei

    2012-02-07

    Varian's new digital linear accelerator (LINAC), TrueBeam STx, is equipped with a high-dose-rate flattening filter free (FFF) mode (6 MV and 10 MV), a high-definition multileaf collimator (2.5 mm leaf width), and onboard imaging capabilities. A series of end-to-end phantom tests of TrueBeam-based image-guided radiation therapy (IGRT) were performed to determine the geometric accuracy of the image-guided setup and dose delivery process for all beam modalities delivered using intensity modulated radiation therapy (IMRT) and RapidArc. In these tests, an anthropomorphic phantom with a Ball Cube II insert and the analysis software (FilmQA (3cognition)) were used to evaluate the accuracy of TrueBeam image-guided setup and dose delivery. Laser-cut EBT2 films with 0.15 mm accuracy were embedded into the phantom. The phantom with the film inserted was first scanned with a GE Discovery-ST CT scanner, and the images were then imported into the planning system. Plans with steep dose fall-off surrounding hypothetical targets of different sizes were created using RapidArc and IMRT with FFF and WFF (with flattening filter) beams. Four RapidArc plans (6 MV and 10 MV FFF) and five IMRT plans (6 MV and 10 MV FFF; 6 MV, 10 MV and 15 MV WFF) were studied. The RapidArc plans with 6 MV FFF were planned with target diameters of 1 cm (0.52 cc), 2 cm (4.2 cc) and 3 cm (14.1 cc), and all other plans with a target diameter of 3 cm. Both onboard planar and volumetric imaging procedures were used for phantom setup and target localization. The IMRT and RapidArc plans were then delivered, and the film measurements were compared with the original treatment plans using gamma criteria of 3%/1 mm and 3%/2 mm. The shift required to align the film-measured dose with the calculated dose distribution was taken as the targeting error. Targeting accuracy of image-guided treatment using TrueBeam was found to be within 1 mm. For irradiation of the 3 cm target, the gammas (3%, 1
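    The gamma comparison used in such film-versus-plan tests combines a dose-difference criterion with a distance-to-agreement (DTA) criterion: a point passes if some nearby evaluated point is close enough in both dose and position. A deliberately simplified 1D global-gamma sketch (brute-force search over illustrative profiles; clinical tools interpolate and work in 2D/3D):

```python
import math

def gamma_index(dose_eval, dose_ref, spacing_mm, dd=0.03, dta_mm=1.0):
    """Simplistic 1D global gamma: for each reference point, search all
    evaluated points for the minimum combined dose/distance metric."""
    d_max = max(dose_ref)
    gammas = []
    for i, dr in enumerate(dose_ref):
        best = float("inf")
        for j, de in enumerate(dose_eval):
            dist = (j - i) * spacing_mm
            ddiff = (de - dr) / d_max      # global normalization
            best = min(best,
                       math.sqrt((dist / dta_mm) ** 2 + (ddiff / dd) ** 2))
        gammas.append(best)                # point passes if gamma <= 1
    return gammas

ref = [0.0, 0.5, 1.0, 0.5, 0.0]
shifted = [0.0, 0.0, 0.5, 1.0, 0.5]        # profile shifted by one 0.5 mm step
g = gamma_index(shifted, ref, spacing_mm=0.5)
passing = sum(1 for x in g if x <= 1.0) / len(g)
```

    With a 3%/1 mm criterion, a pure 0.5 mm shift passes almost everywhere in this toy profile, which is why the residual shift itself, rather than the passing rate alone, is reported as the targeting error.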

  5. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.
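    PAM-4 carries two bits per symbol, which is how a symbol stream from 10G-class components can reach 40 Gb/s. A sketch of a Gray-coded bit-to-symbol mapping using the conventional ±1, ±3 amplitude levels (the mapping shown is a common convention, not necessarily the one used in the experiment):

```python
# Gray-coded PAM-4: two bits per symbol, four amplitude levels, and
# adjacent levels differ in only one bit (limits bit errors per slip).
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_modulate(bits):
    """Map an even-length bit stream onto PAM-4 symbols, halving the
    symbol rate relative to on-off keying at the same bit rate."""
    assert len(bits) % 2 == 0
    return [GRAY_MAP[(bits[i], bits[i + 1])]
            for i in range(0, len(bits), 2)]

symbols = pam4_modulate([0, 0, 0, 1, 1, 1, 1, 0])
```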

  6. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang,; Hao Ma,; Aissa, Sonia

    2014-01-01

    it with transmission constraints imposed on the transmit power budget and interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario
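
    The closed-form end-to-end packet loss rate (PLR) over a two-hop link, as analyzed above, can be illustrated with a small Monte-Carlo sketch; the Rayleigh-fading assumption and all numbers below are illustrative stand-ins, not the paper's spectrum-sharing model:

    ```python
    import math
    import random

    def end_to_end_plr(snr1_avg, snr2_avg, threshold, trials=100_000, seed=1):
        """Monte-Carlo PLR over a two-hop relay link: a packet is lost when
        the received SNR on either hop falls below the decoding threshold.
        Exponentially distributed SNRs model Rayleigh fading (an assumption
        made here purely for illustration)."""
        rng = random.Random(seed)
        losses = sum(
            1 for _ in range(trials)
            if min(rng.expovariate(1.0 / snr1_avg),
                   rng.expovariate(1.0 / snr2_avg)) < threshold
        )
        return losses / trials

    # Under this fading model the end-to-end outage has a closed form,
    # P[min(g1, g2) < t] = 1 - exp(-t/avg1 - t/avg2), a useful sanity check.
    analytic = 1.0 - math.exp(-1.0 / 10.0 - 1.0 / 20.0)
    simulated = end_to_end_plr(10.0, 20.0, 1.0)
    ```

    Comparing the simulated rate against the closed form mirrors how such derivations are typically validated.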

  7. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Directory of Open Access Journals (Sweden)

    V. Proschek

    2011-10-01

    Full Text Available Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept adding an infrared-laser part to the already well studied microwave occultation technique exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables the retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals, and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling is not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points

  8. Electronic remote blood issue: a combination of remote blood issue with a system for end-to-end electronic control of transfusion to provide a "total solution" for a safe and timely hospital blood transfusion service.

    Science.gov (United States)

    Staves, Julie; Davies, Amanda; Kay, Jonathan; Pearson, Oliver; Johnson, Tony; Murphy, Michael F

    2008-03-01

    The rapid provision of red cell (RBC) units to patients needing blood urgently is an issue of major importance in transfusion medicine. The development of electronic issue (sometimes termed "electronic crossmatch") has facilitated rapid provision of RBC units by avoiding the serologic crossmatch in eligible patients. A further development is the issue of blood under electronic control at blood refrigerators remote from the blood bank. This study evaluated a system for electronic remote blood issue (ERBI) developed as an enhancement of a system for end-to-end electronic control of hospital transfusion. Practice was evaluated before and after its introduction in cardiac surgery. Before the implementation of ERBI, the median time to deliver urgently required RBC units to the patient was 24 minutes. After its implementation, RBC units were obtained from the nearby blood refrigerator in a median time of 59 seconds (range, 30 sec to 2 min). The study also found that unused requests were reduced significantly from 42 to 20 percent, the number of RBC units issued was reduced by 52 percent, the proportion of issued units that were transfused increased from 40 to 62 percent, and there was a significant reduction in the workload of both blood bank and clinical staff. This study evaluated the combination of remote blood issue with an end-to-end electronically controlled hospital transfusion process (ERBI). ERBI reduced the time to make blood available for surgical patients and improved the efficiency of hospital transfusion.

  9. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    Directory of Open Access Journals (Sweden)

    Greg Finak

    2014-08-01

    Full Text Available Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption.
It can rapidly leverage new developments in
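
    OpenCyto itself is an R/BioConductor framework; the csv-defined pipeline idea described above can nevertheless be illustrated with a small language-neutral sketch (the column names below are invented for illustration and are not OpenCyto's actual template schema):

    ```python
    import csv
    import io

    # Toy gating template: each row declares a parent population, the child
    # population to derive, the gating method, and the channel it acts on.
    TEMPLATE = """parent,population,method,channel
    root,lymphocytes,flowClust,FSC-A
    lymphocytes,CD3+,mindensity,CD3
    CD3+,CD8+,mindensity,CD8
    """

    def load_gating_tree(text):
        """Parse the csv template into parent -> [(child, method, channel)],
        capturing the hierarchical relationships among cell populations."""
        tree = {}
        for row in csv.DictReader(io.StringIO(text)):
            tree.setdefault(row["parent"].strip(), []).append(
                (row["population"], row["method"], row["channel"]))
        return tree

    tree = load_gating_tree(TEMPLATE)
    ```

    Keeping the gating hierarchy in data rather than code is what makes such a pipeline data-agnostic and reusable across panels.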

  10. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  11. Influence of suture technique and suture material selection on the mechanics of end-to-end and end-to-side anastomoses.

    Science.gov (United States)

    Baumgartner, N; Dobrin, P B; Morasch, M; Dong, Q S; Mrkvicka, R

    1996-05-01

    Experiments were performed in dogs to evaluate the mechanics of 26 end-to-end and 42 end-to-side artery-vein graft anastomoses constructed with continuous polypropylene sutures (Surgilene; Davis & Geck, Division of American Cyanamid Co., Danbury, Conn.), continuous polybutester sutures (Novafil; Davis & Geck), and interrupted stitches with either suture material. After construction, the grafts and adjoining arteries were excised, mounted in vitro at in situ length, filled with a dilute barium sulfate suspension, and pressurized in 25 mm Hg steps up to 200 mm Hg. Radiographs were obtained at each pressure. The computed cross-sectional areas of the anastomoses were compared with those of the native arteries at corresponding pressures. Results showed that for the end-to-end anastomoses at 100 mm Hg the cross-sectional areas of the continuous Surgilene anastomoses were 70% of the native artery cross-sectional areas, the cross-sectional areas of the continuous Novafil anastomoses were 90% of the native artery cross-sectional areas, and the cross-sectional areas of the interrupted anastomoses were 107% of the native artery cross-sectional areas (p anastomoses demonstrated no differences in cross-sectional areas or compliance for the three suture techniques. This suggests that, unlike with end-to-end anastomoses, when constructing an end-to-side anastomosis in patients any of the three suture techniques may be acceptable.

  12. One stage functional end-to-end stapled intestinal anastomosis and resection performed by nonexpert surgeons for the treatment of small intestinal obstruction in 30 dogs.

    Science.gov (United States)

    Jardel, Nicolas; Hidalgo, Antoine; Leperlier, Dimitri; Manassero, Mathieu; Gomes, Aymeric; Bedu, Anne Sophie; Moissonnier, Pierre; Fayolle, Pascal; Begon, Dominique; Riquois, Elisabeth; Viateau, Véronique

    2011-02-01

    To describe stapled 1-stage functional end-to-end intestinal anastomosis for treatment of small intestinal obstruction in dogs and to evaluate outcome when the technique is performed by nonexpert surgeons after limited training in the technique. Case series. Dogs (n=30) with intestinal lesions requiring an enterectomy. Stapled 1-stage functional end-to-end anastomosis and resection using GIA-60 and TA-55 stapling devices were performed under supervision of senior residents and faculty surgeons by junior surgeons previously trained in the technique on pigs. Procedure duration and technical problems were recorded. Short-term results were collected during hospitalization and at suture removal. Long-term outcome was established by clinical and ultrasonographic examinations at least 2 months after surgery and from written questionnaires completed by owners. Mean±SD procedure duration was 15±12 minutes. Postoperative recovery was uneventful in 25 dogs. One dog had anastomotic leakage, 1 had a localized abscess at the transverse staple line, and 3 dogs developed an incisional abdominal wall abscess. No long-term complications occurred (follow-up, 2-32 months). Stapled 1-stage functional end-to-end anastomosis and resection is a fast and safe procedure in the hands of nonexpert but trained surgeons. © Copyright 2011 by The American College of Veterinary Surgeons.

  13. Primary and secondary structure dependence of peptide flexibility assessed by fluorescence-based measurement of end-to-end collision rates.

    Science.gov (United States)

    Huang, Fang; Hudgins, Robert R; Nau, Werner M

    2004-12-22

    The intrachain fluorescence quenching of the fluorophore 2,3-diazabicyclo[2.2.2]oct-2-ene (DBO) is measured in short peptide fragments, namely the two strands and the turn of the N-terminal beta-hairpin of ubiquitin. The investigated peptides adopt a random-coil conformation in aqueous solution according to CD and NMR experiments. The combination of quenchers with different quenching efficiencies, namely tryptophan and tyrosine, allows the extrapolation of the rate constants for end-to-end collision rates as well as the dissociation of the end-to-end encounter complex. The measured activation energies for fluorescence quenching demonstrate that the end-to-end collision process in peptides is partially controlled by internal friction within the backbone, while measurements in solvents of different viscosities (H2O, D2O, and 7.0 M guanidinium chloride) suggest that solvent friction is an additional important factor in determining the collision rate. The extrapolated end-to-end collision rates, which are only slightly larger than the experimental rates for the DBO/Trp probe/quencher system, provide a measure of the conformational flexibility of the peptide backbone. The chain flexibility is found to be strongly dependent on the type of secondary structure that the peptides represent. The collision rates for peptides derived from the beta-strand motifs (ca. 1 × 10^7 s^-1) are ca. 4 times slower than that derived from the beta-turn. The results provide further support for the hypothesis that chain flexibility is an important factor in the preorganization of protein fragments during protein folding. Mutations to the beta-turn peptide show that subtle sequence changes strongly affect the flexibility of peptides as well. The protonation and charge status of the peptides, however, are shown to have no significant effect on the flexibility of the investigated peptides. 
The meaning and definition of end-to-end collision rates in the context of protein folding are critically

  14. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    Science.gov (United States)

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since the patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic to handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based features vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries on a dataset of 41 volumetric CT scans from the 2014 Image CLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without/with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather than on disease diagnosis, may help radiologists to better diagnose liver lesions.
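
    The retrieval pipeline described above (histogram features over a segmented region, ranked by a learned importance measure) can be sketched as follows; the bin range, weighting scheme, and synthetic data are illustrative stand-ins, not the authors' method:

    ```python
    import numpy as np

    def histogram_features(intensities, bins=32, value_range=(-100, 300)):
        """Normalized intensity histogram (Hounsfield units) of a region
        that is assumed to have been segmented already."""
        hist, _ = np.histogram(intensities, bins=bins, range=value_range)
        return hist / max(hist.sum(), 1)

    def rank_database(query_feat, db_feats, weights=None):
        """Order database entries by weighted L1 distance to the query;
        `weights` stands in for the learned per-feature importance."""
        if weights is None:
            weights = np.ones_like(query_feat)
        dists = [float(np.sum(weights * np.abs(query_feat - f))) for f in db_feats]
        return np.argsort(dists)                 # most similar first

    # Synthetic "scans": intensity samples drawn around different means
    rng = np.random.default_rng(0)
    query = histogram_features(rng.normal(60, 15, 5000))          # lesion-like HU
    db = [histogram_features(rng.normal(mu, 15, 5000)) for mu in (55, 60, 120)]
    order = rank_database(query, db)
    ```

    Histogram features keep the comparison cheap and scalable, which is the design point the abstract emphasizes for large case databases.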

  15. Experience of using MOSFET detectors for dose verification measurements in an end-to-end 192Ir brachytherapy quality assurance system.

    Science.gov (United States)

    Persson, Maria; Nilsson, Josef; Carlsson Tedgren, Åsa

    Establishment of an end-to-end system for the brachytherapy (BT) dosimetric chain could be valuable in clinical quality assurance. Here, the development of such a system using MOSFET (metal oxide semiconductor field effect transistor) detectors and experience gained during 2 years of use are reported, with focus on the performance of the MOSFET detectors. A bolus phantom was constructed with two implants, mimicking prostate and head & neck treatments, using steel needles and plastic catheters to guide the 192Ir source and house the MOSFET detectors. The phantom was taken through the BT treatment chain from image acquisition to dose evaluation. During the 2-year evaluation period, delivered doses were verified a total of 56 times using MOSFET detectors which had been calibrated in an external 60Co beam. An initial experimental investigation on beam quality differences between 192Ir and 60Co is reported. The standard deviation in repeated MOSFET measurements was below 3% in the six measurement points with dose levels above 2 Gy. MOSFET measurements overestimated treatment planning system doses by 2-7%. Distance-dependent experimental beam quality correction factors derived in a phantom of similar size as that used for end-to-end tests, applied on a time-resolved measurement, improved the agreement. MOSFET detectors provide values stable over time and function well for use as detectors for end-to-end quality assurance purposes in 192Ir BT. Beam quality correction factors should address not only distance from source but also phantom dimensions. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
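
    The dose-verification arithmetic implied above (a 60Co-derived sensitivity plus a distance-dependent 192Ir beam-quality correction) can be sketched as follows; all numerical values are illustrative placeholders, not the paper's calibration data:

    ```python
    def mosfet_dose(delta_v_mv, cal_mv_per_cgy, quality_corr):
        """Convert a MOSFET threshold-voltage shift (mV) into dose (cGy):
        divide by the 60Co-derived sensitivity, then apply the
        distance-dependent 192Ir beam-quality correction factor."""
        return delta_v_mv / cal_mv_per_cgy * quality_corr

    def percent_deviation(measured_cgy, planned_cgy):
        """Measured-vs-planned deviation in percent."""
        return 100.0 * (measured_cgy - planned_cgy) / planned_cgy

    # Hypothetical check point: planned 200 cGy, raw shift 540 mV,
    # sensitivity 2.7 mV/cGy, correction factor 0.96 at this source distance.
    measured = mosfet_dose(540.0, 2.7, 0.96)
    deviation = percent_deviation(measured, 200.0)
    ```

    Without the correction factor this hypothetical reading would overestimate the planned dose, which mirrors the 2-7% overestimation the study reports before applying beam-quality corrections.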

  16. Interoperable End-to-End Remote Patient Monitoring Platform Based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2018-05-01

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7, and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  17. User-oriented end-to-end transport protocols for the real-time distribution of telemetry data from NASA spacecraft

    Science.gov (United States)

    Hooke, A. J.

    1979-01-01

    A set of standard telemetry protocols for downlink data flow facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus formatted on a message rather than a word basis, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol contains identification of both the sequence number of the packet as it is generated by the source and the total length of the packet, while the transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
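
    The packet concept described above (a source-assigned sequence number plus a total-length field in a standardized header) can be sketched as follows; this 6-byte field layout is invented for illustration and is not the actual NEEDS source packet or transport frame format:

    ```python
    import struct

    # Invented header: application ID, 14-bit source sequence count, and
    # total packet length (header + data), all big-endian 16-bit fields.
    HEADER = struct.Struct(">HHH")

    def make_source_packet(apid: int, seq_count: int, data: bytes) -> bytes:
        """Assemble a packet the way a source instrument would: the header
        carries the source-assigned sequence number and the total length."""
        return HEADER.pack(apid, seq_count & 0x3FFF, HEADER.size + len(data)) + data

    def parse_source_packet(packet: bytes):
        """Recover the header fields and payload on the receiving side."""
        apid, seq, total_len = HEADER.unpack_from(packet)
        return apid, seq, packet[HEADER.size:total_len]

    pkt = make_source_packet(0x1A, 42, b"telemetry-sample")
    apid, seq, payload = parse_source_packet(pkt)
    ```

    Because the length travels in the header, a switch can forward each packet without knowing the instrument's word format, which is the point of message-based rather than word-based telemetry.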

  18. Ferromagnetic interaction in an asymmetric end-to-end azido double-bridged copper(II) dinuclear complex: a combined structure, magnetic, polarized neutron diffraction and theoretical study.

    Science.gov (United States)

    Aronica, Christophe; Jeanneau, Erwann; El Moll, Hani; Luneau, Dominique; Gillon, Béatrice; Goujon, Antoine; Cousson, Alain; Carvajal, Maria Angels; Robert, Vincent

    2007-01-01

    A new end-to-end azido double-bridged copper(II) complex [Cu2L2(N3)2] (1) was synthesized and characterized (L=1,1,1-trifluoro-7-(dimethylamino)-4-methyl-5-aza-3-hepten-2-onato). Despite the rather long Cu-Cu distance (5.105(1) Å), the magnetic interaction is ferromagnetic with J = +16 cm^-1 (H = -JS1S2), a value that has been confirmed by DFT and high-level correlated ab initio calculations. The spin distribution was studied by using the results from polarized neutron diffraction. This is the first such study on an end-to-end system. The experimental spin density was found to be localized mainly on the copper(II) ions, with a small degree of delocalization on the ligand (L) and terminal azido nitrogens. There was zero delocalization on the central nitrogen, in agreement with DFT calculations. Such a picture corresponds to an important contribution of the d(x^2-y^2) orbital and a small population of the d(z^2) orbital, in agreement with our calculations. Based on a correlated wavefunction analysis, the ferromagnetic behavior results from a dominant double spin polarization contribution and vanishingly small ionic forms.

  19. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    Science.gov (United States)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

    The pricing for wireless networks is developed by considering linearity factors, price elasticity and price factors. A mixed integer nonlinear programming wireless pricing model is proposed as a nonlinear programming problem that can be solved optimally using LINGO 13.0. The solutions are expected to give some information about the connections between the acceptance factor and the price. Previous models focused on bandwidth as the sole QoS attribute. The present models attempt to maximize the total price for a connection based on QoS parameters; the QoS attributes used are the bandwidth and the end-to-end delay that affect the traffic. The maximum price is achieved when the provider determines the required increment or decrement of the price due to QoS changes and the amount of the QoS value.
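
    Since LINGO is a proprietary solver, the flavor of such a model (integer QoS levels, a nonlinear price objective, an acceptance-factor constraint) can be illustrated by brute-force enumeration over a toy search space; every coefficient and functional form below is invented for illustration:

    ```python
    from itertools import product

    BANDWIDTH_LEVELS = [1, 2, 4, 8]     # Mbps, integer decision variable
    DELAY_LEVELS = [150, 100, 50, 25]   # ms, end-to-end delay targets

    def price(bw, delay, base=1.0, a=0.5, b=0.4):
        """Nonlinear price: grows sublinearly with bandwidth and inversely
        with end-to-end delay (illustrative coefficients)."""
        return base + a * bw ** 0.8 + b * (200.0 / delay)

    def acceptance(bw, delay):
        """Toy acceptance factor in [0, 1]: better QoS, higher acceptance."""
        return min(1.0, 0.2 + 0.08 * bw + 0.002 * (200 - delay))

    # Enumerate the small integer search space instead of calling a MINLP
    # solver: pick the QoS pair maximizing price subject to a minimum
    # acceptance constraint.
    best = max(
        (p for p in product(BANDWIDTH_LEVELS, DELAY_LEVELS) if acceptance(*p) >= 0.6),
        key=lambda p: price(*p),
    )
    ```

    With realistically sized decision spaces the enumeration becomes infeasible, which is why the authors turn to a MINLP solver.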

  20. Crystal structure of Aquifex aeolicus gene product Aq1627: a putative phosphoglucosamine mutase reveals a unique C-terminal end-to-end disulfide linkage.

    Science.gov (United States)

    Sridharan, Upasana; Kuramitsu, Seiki; Yokoyama, Shigeyuki; Kumarevel, Thirumananseri; Ponnuraj, Karthe

    2017-06-27

    The Aq1627 gene from Aquifex aeolicus, a hyperthermophilic bacterium has been cloned and overexpressed in Escherichia coli. The protein was purified to homogeneity and its X-ray crystal structure was determined to 1.3 Å resolution using multiple wavelength anomalous dispersion phasing. The structural and sequence analysis of Aq1627 is suggestive of a putative phosphoglucosamine mutase. The structural features of Aq1627 further indicate that it could belong to a new subclass of the phosphoglucosamine mutase family. Aq1627 structure contains a unique C-terminal end-to-end disulfide bond, which links two monomers and this structural information can be used in protein engineering to make proteins more stable in different applications.

  1. Reconstruction after ureteral resection during HIPEC surgery: Re-implantation with uretero-neocystostomy seems safer than end-to-end anastomosis.

    Science.gov (United States)

    Pinar, U; Tremblay, J-F; Passot, G; Dazza, M; Glehen, O; Tuech, J-J; Pocard, M

    2017-09-01

    Resection of the pelvic ureter may be necessary in cytoreductive surgery for peritoneal carcinomatosis in combination with hyperthermic intraperitoneal chemotherapy (HIPEC). As the morbidity of cytoreductive surgery with HIPEC has decreased, expert teams have begun to perform increasingly complex surgical procedures associated with HIPEC, including pelvic reconstructions. After ureteral resection, two types of reconstruction are possible: uretero-ureteral end-to-end anastomosis and uretero-vesical re-implantation or uretero-neocystostomy (the so-called psoas hitch technique). By compiling the experience of three surgical teams that perform HIPEC surgeries, we have tried to compare the effectiveness of these two techniques. A retrospective comparative case-matched multicenter study was conducted for patients undergoing operation between 2005 and 2014. Patients included had undergone resection of the pelvic ureter during cytoreductive surgery with HIPEC for peritoneal carcinomatosis; ureteral reconstruction was by either end-to-end anastomosis (EEA group) or re-implantation uretero-neocystostomy (RUC group). The primary endpoint was the occurrence of urinary fistula in postoperative follow-up. There were 14 patients in the EEA group and 14 in the RUC group. The groups were comparable for age, extent of carcinomatosis (PCI index) and operative duration. Four urinary fistulas occurred in the EEA group (28.5%) versus zero fistulas in the RUC group (0%) (P=0.0308). Re-implantation with uretero-neocystostomy during cytoreductive surgery with HIPEC is the preferred technique for reconstruction after ureteral resection in cases of renal conservation. Copyright © 2017. Published by Elsevier Masson SAS.

  2. Poster - 44: Development and implementation of a comprehensive end-to-end testing methodology for linac-based frameless SRS QA using a modified commercial stereotactic anthropomorphic phantom

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Derek; Mutanga, Theodore [University of Toronto, Carlo Fidani Peel Regional Cancer Center (Canada)

    2016-08-15

    Purpose: An end-to-end testing methodology was designed to evaluate the overall SRS treatment fidelity, incorporating all steps in the linac-based frameless radiosurgery treatment delivery process. The study details our commissioning experience with the Steev (CIRS, Norfolk, VA) stereotactic anthropomorphic head phantom, including modification, test design, and baseline measurements. Methods: Repeated MR and CT scans were performed with interchanging inserts. MR-CT fusion accuracy was evaluated and the insert spatial coincidence was verified on CT. Five non-coplanar arcs delivered a prescription dose to a 15 mm spherical CTV with a 2 mm PTV margin. Following setup, CBCT-based shifts were applied as per protocol. Sequential measurements were performed by interchanging inserts without disturbing the setup. Spatial and dosimetric accuracy was assessed by a combination of CBCT hidden target, radiochromic film, and ion chamber measurements. To facilitate film registration, the film insert was modified in-house by etching marks. Results: MR fusion error and insert spatial coincidences were within 0.3 mm. Both CBCT and film measurements showed spatial displacements of 1.0 mm in similar directions. Both coronal and sagittal films reported 2.3% higher target dose relative to the treatment plan. The corrected ion chamber measurement was similarly greater by 1.0%. The 3%/2 mm gamma pass rate was 99% for both films. Conclusions: A comprehensive end-to-end testing methodology was implemented for our SRS QA program. The Steev phantom enabled realistic evaluation of the entire treatment process. Overall spatial and dosimetric accuracy of the delivery were 1 mm and 3%, respectively.

  3. A Validation Approach of an End-to-End Whole Genome Sequencing Workflow for Source Tracking of Listeria monocytogenes and Salmonella enterica

    Directory of Open Access Journals (Sweden)

    Anne-Catherine Portmann

    2018-03-01

    Full Text Available Whole genome sequencing (WGS), using high throughput sequencing technology, reveals the complete sequence of the bacterial genome in a few days. WGS is increasingly being used for source tracking, pathogen surveillance and outbreak investigation due to its high discriminatory power. In the food industry, WGS used for source tracking is beneficial to support contamination investigations. Despite its increased use, no standards or guidelines are available today for the use of WGS in outbreak and/or trace-back investigations. Here we present a validation of our complete (end-to-end) WGS workflow for Listeria monocytogenes and Salmonella enterica including: subculture of isolates, DNA extraction, sequencing and bioinformatics analysis. This end-to-end WGS workflow was evaluated according to the following performance criteria: stability, repeatability, reproducibility, discriminatory power, and epidemiological concordance. The current study showed that few single nucleotide polymorphisms (SNPs) were observed for L. monocytogenes and S. enterica when comparing genome sequences from five independent colonies from the first subculture and five independent colonies after the tenth subculture. Consequently, the stability of the WGS workflow for L. monocytogenes and S. enterica was demonstrated despite the few genomic variations that can occur during subculturing steps. Repeatability and reproducibility were also demonstrated. The WGS workflow was shown to have a high discriminatory power and has the ability to show genetic relatedness. Additionally, the WGS workflow was able to reproduce published outbreak investigation results, illustrating its capability of showing epidemiological concordance. The current study proposes a validation approach comprising all steps of a WGS workflow and demonstrates that the workflow can be applied to L. monocytogenes or S. enterica.
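
    The SNP comparisons underlying the stability and repeatability criteria above reduce to counting nucleotide differences between aligned genome sequences; a minimal sketch, with invented toy sequences standing in for real assemblies, might look like:

    ```python
    def snp_distance(seq_a: str, seq_b: str) -> int:
        """Count single-nucleotide differences between two aligned,
        equal-length sequences; gaps ('-') and ambiguous calls ('N')
        are skipped rather than counted as differences."""
        if len(seq_a) != len(seq_b):
            raise ValueError("sequences must be aligned to equal length")
        skip = {"N", "-"}
        return sum(
            1 for a, b in zip(seq_a.upper(), seq_b.upper())
            if a != b and a not in skip and b not in skip
        )

    # Toy stand-ins for a first- vs tenth-subculture genome fragment.
    first_passage = "ATGGCTAACGTTAGC"
    tenth_passage = "ATGGCTAACATTAGC"   # a single SNP arose during subculturing
    ```

    A workflow is judged stable when this distance stays near zero across subcultures of the same isolate, while remaining large between unrelated strains (discriminatory power).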

  4. SU-F-T-76: Total Skin Electron Therapy: An-End-To-End Examination of the Absolute Dosimetry with a Rando Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Cui, G; Ha, J; Zhou, S; Cui, J; Shiu, A [University Southern California, Los Angeles, CA (United States)

    2016-06-15

    Purpose: To examine and validate the absolute dose for total skin electron therapy (TSET) through an end-to-end test with a Rando phantom using optically stimulated luminescent dosimeters (OSLDs) and EBT3 radiochromic films. Methods: A Varian Trilogy linear accelerator equipped with the special procedure 6 MeV HDTSe- mode was used to perform TSET irradiations using a modified Stanford 6-dual-field technique. The absolute dose was calibrated using a Markus ion chamber at a reference depth of 1.3 cm at 100 cm SSD with a field size of 36 × 36 cm at the isocenter in solid water slabs, and cross-validated with a Farmer ion chamber. The dose rate in units of cGy/MU was then calibrated using the Markus chamber at the treatment position. OSLDs were used to independently verify the dose using the calibrated dose rate. Finally, a patient treatment plan (200 cGy/cycle) was delivered in the QA mode to a Rando phantom, which had 16 pairs of OSLDs and EBT3 films taped onto its surface at different anatomical positions. The doses recorded were read out to validate the absolute dosimetry for TSET. Results: The OSLD measurements were within 7% agreement with the planned dose except in the shoulder areas, where the doses recorded were 23% lower on average than planned. The EBT3 film measurements were within 10% agreement with the planned dose except in the shoulder and scalp vertex areas, where the respective doses recorded were 18% and 14% lower on average than planned. The OSLDs gave more consistent dose measurements than the EBT3 films. Conclusion: The absolute dosimetry for TSET was validated by an end-to-end test with a Rando phantom using the OSLDs and EBT3 films. The beam calibration and monitor unit calculations were confirmed.

  5. Rapid Design and Navigation Tools to Enable Small-Body Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — Rapid design and navigation tools broaden the number and scope of available missions by making the most of advances in astrodynamics and in computer software and...

  6. Mission Engineering of a Rapid Cycle Spacecraft Logistics Fleet

    Science.gov (United States)

    Holladay, Jon; McClendon, Randy (Technical Monitor)

    2002-01-01

    The requirement for logistics re-supply of the International Space Station has provided a unique opportunity for engineering the implementation of NASA's first dedicated pressurized logistics carrier fleet. The NASA fleet comprises three Multi-Purpose Logistics Modules (MPLM) provided to NASA by the Italian Space Agency in return for operations time aboard the International Space Station. Marshall Space Flight Center was responsible for oversight of the hardware development from preliminary design through acceptance of the third flight unit, and currently manages the flight hardware sustaining engineering and mission engineering activities. The MPLM mission itself began prior to NASA acceptance of the first flight unit in 1999 and will continue until the decommissioning of the International Space Station, planned for 20xx. Mission engineering of the MPLM program requires a broad focus on three distinct yet inter-related operations processes: pre-flight, flight operations, and post-flight turnaround. Within each primary area exist several complex subsets of distinct and inter-related activities. Pre-flight processing includes the evaluation of carrier hardware readiness for space flight, covering integration of the payload into the carrier, integration of the carrier into the launch vehicle, and integration of the carrier onto the orbital platform. Flight operations include the actual carrier operations during flight and any required real-time ground support. Post-flight processing includes de-integration of the carrier hardware from the launch vehicle, de-integration of the payload, and preparation for returning the carrier to pre-flight staging. Typical space operations are engineered around the requirements and objectives of a dedicated mission on a dedicated operational platform (i.e., launch or orbiting vehicle). The MPLM, however, has expanded this envelope by requiring operations with both vehicles during flight as well as pre-launch and post

  7. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study.

    Science.gov (United States)

    Bowen, S R; Nyflot, M J; Herrmann, C; Groh, C M; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-05-07

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [(18)F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the magnitude
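    The SUV-threshold segmentation step described above can be sketched on a synthetic activity map. The 40%-of-maximum threshold and voxel volume below are illustrative assumptions; the study tuned thresholds to recover the known ground-truth lesion volume:

```python
import numpy as np

# Sketch: SUV-threshold target segmentation on a synthetic activity map.
# The 40% threshold and 0.1 ml voxel volume are assumptions for illustration.

def suv_threshold_volume(suv, fraction, voxel_volume_ml):
    """Return the target volume (ml) of voxels at or above fraction * max SUV."""
    mask = suv >= fraction * suv.max()
    return mask.sum() * voxel_volume_ml

suv = np.zeros((20, 20, 20))
suv[8:12, 8:12, 8:12] = 10.0   # hot "lesion": 4 x 4 x 4 = 64 voxels
suv += 1.0                      # uniform background activity

volume_ml = suv_threshold_volume(suv, 0.40, voxel_volume_ml=0.1)
```

    Here the threshold (0.40 x 11.0 = 4.4) cleanly separates the 64 lesion voxels from the background, giving 6.4 ml; in motion-blurred 3D images the same threshold would capture a smeared, larger apparent volume.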

  8. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    International Nuclear Information System (INIS)

    Bowen, S R; Nyflot, M J; Meyer, J; Sandison, G A; Herrmann, C; Groh, C M; Wollenweber, S D; Stearns, C W; Kinahan, P E

    2015-01-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow.

  9. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    Science.gov (United States)

    Bowen, S R; Nyflot, M J; Herrmann, C; Groh, C; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by 6 different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy (VMAT) were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses (EUD), and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the

  10. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    International Nuclear Information System (INIS)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R

    2016-01-01

    Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment, in order to identify and correct possible sources of deviation, and to establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy process (imaging, planning, and delivery) was designed. The test includes analysis of point-dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc, and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber was used with a Keithley 35617EBS electrometer for point dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and a Varian aS1000 EPID. Gamma analysis was performed to evaluate 2D dose distribution agreement using MapCheck software and the Varian Portal Dosimetry application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% (SD 0.93%), 0.507% (SD 0.82%), 0.246% (SD 1.39%), and 0.012% (SD 0.01%) for LoX-3DCRT, HiX-3DCRT, IMRT, and RapidArc plans, respectively. Planar doses passed gamma 3%/3 mm in all cases and 2%/2 mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose and evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical
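    The gamma analysis used for the planar-dose comparisons combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D global-gamma sketch (clinical tools such as MapCheck operate on 2-D planes; the profiles below are synthetic):

```python
import numpy as np

# Sketch: 1-D global gamma analysis with a 3% dose-difference / 3 mm
# distance-to-agreement criterion, the pass/fail metric used to compare
# measured vs. TPS-calculated planar doses. Profiles are synthetic.

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    x = np.arange(len(ref)) * spacing_mm
    dd = dose_tol * ref.max()                      # global dose criterion
    passes = []
    for xi, mi in zip(x, meas):
        # gamma for one measured point: minimum over all reference points
        g = np.sqrt(((mi - ref) / dd) ** 2 + ((xi - x) / dist_mm) ** 2).min()
        passes.append(g <= 1.0)
    return 100.0 * np.mean(passes)

ref = np.exp(-((np.arange(50) - 25) / 10.0) ** 2)   # synthetic reference profile
meas = ref * 1.01                                    # 1% uniform scaling error
rate = gamma_pass_rate(ref, meas, spacing_mm=1.0)
```

    A uniform 1% error passes everywhere under a 3% global criterion; a points-with-gamma-above-one count is what drops below 100% when measured and planned distributions genuinely disagree.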

  11. Presence of calcium in the vessel walls after end-to-end arterial anastomoses with polydioxanone and polypropylene sutures in growing dogs.

    Science.gov (United States)

    Gersak, B

    1993-10-01

    The presence of calcium in the vessel walls after end-to-end arterial anastomoses performed with polydioxanone and polypropylene interrupted sutures was studied in 140 anastomoses in 35 10-week-old German shepherd dogs. Histologic examination with hematoxylin and eosin, van Gieson, and von Kossa staining techniques was performed after the animals were killed 6 months after the operation. Ketamine hydrochloride was used as an anesthetic agent. At the start of the investigation the dogs weighed 14.5 +/- 2.6 kg (mean +/- standard deviation, n = 35), and after 6 months they weighed 45.3 +/- 3.1 kg (mean +/- standard deviation, n = 35). The diameter of the sutured arteries in the first operation was 2.6 +/- 0.5 mm (mean +/- standard deviation, n = 140). With each dog, both brachial and both femoral arteries were used--one artery for each different type of suture. In different dogs, different arteries were used for the same type of suture. The prevalence of calcifications after 6 months was determined from the numeric density of calcifications with standard stereologic techniques. The sutured and sutureless parts taken from longitudinal sections from each artery were studied, and t test values were calculated as follows: In paired samples, statistically significant differences in numerical density of calcifications were seen between sutured and sutureless arterial parts for both materials (sutureless part versus part with polydioxanone sutures, p < 0.05, n = 70) and sutureless parts (p > 0.05, n = 70).

  12. System for Informatics in the Molecular Pathology Laboratory: An Open-Source End-to-End Solution for Next-Generation Sequencing Clinical Data Management.

    Science.gov (United States)

    Kang, Wenjun; Kadri, Sabah; Puranik, Rutika; Wurst, Michelle N; Patil, Sushant A; Mujacic, Ibro; Benhamed, Sonia; Niu, Nifang; Zhen, Chao Jie; Ameti, Bekim; Long, Bradley C; Galbo, Filipo; Montes, David; Iracheta, Crystal; Gamboa, Venessa L; Lopez, Daisy; Yourshaw, Michael; Lawrence, Carolyn A; Aisner, Dara L; Fitzpatrick, Carrie; McNerney, Megan E; Wang, Y Lynn; Andrade, Jorge; Volchenboum, Samuel L; Furtado, Larissa V; Ritterhouse, Lauren L; Segal, Jeremy P

    2018-04-24

    Next-generation sequencing (NGS) diagnostic assays increasingly are becoming the standard of care in oncology practice. As the scale of an NGS laboratory grows, management of these assays requires organizing large amounts of information, including patient data, laboratory processes, genomic data, as well as variant interpretation and reporting. Although several Laboratory Information Systems and/or Laboratory Information Management Systems are commercially available, they may not meet all of the needs of a given laboratory, in addition to being frequently cost-prohibitive. Herein, we present the System for Informatics in the Molecular Pathology Laboratory, a free and open-source Laboratory Information System/Laboratory Information Management System for academic and nonprofit molecular pathology NGS laboratories, developed at the Genomic and Molecular Pathology Division at the University of Chicago Medicine. The System for Informatics in the Molecular Pathology Laboratory was designed as a modular end-to-end information system to handle all stages of the NGS laboratory workload from test order to reporting. We describe the features of the system, its clinical validation at the Genomic and Molecular Pathology Division at the University of Chicago Medicine, and its installation and testing within a different academic center laboratory (University of Colorado), and we propose a platform for future community co-development and interlaboratory data sharing.

  13. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yang, Yuli

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. To evaluate the performance of the proposed scheme, we study it under transmission constraints imposed on the transmit power budget and on interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive closed-form PLR expressions for the partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
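    The selection rule the abstract describes can be sketched directly: qualify relays by first-hop PLR, then pick the qualified relay with the best second-hop SNR. The relay table below is hypothetical:

```python
# Sketch: partial QoS-aware opportunistic relay selection.
# Qualify on first-hop packet loss rate, then maximize second-hop SNR.
# Relay measurements below are hypothetical illustration values.

def select_relay(relays, plr_target):
    """relays: list of dicts with 'id', 'plr_hop1', 'snr_hop2' (dB)."""
    qualified = [r for r in relays if r["plr_hop1"] <= plr_target]
    if not qualified:
        return None                      # no relay meets the QoS level
    return max(qualified, key=lambda r: r["snr_hop2"])["id"]

relays = [
    {"id": "R1", "plr_hop1": 0.002, "snr_hop2": 12.5},
    {"id": "R2", "plr_hop1": 0.050, "snr_hop2": 18.0},  # best SNR, fails first-hop QoS
    {"id": "R3", "plr_hop1": 0.008, "snr_hop2": 15.1},
]
chosen = select_relay(relays, plr_target=0.01)
```

    Note the feedback saving the paper targets: only relays that already satisfy the first-hop QoS threshold need to report their second-hop SNR, rather than every relay in the network.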

  14. Albert-Lembert versus hybrid-layered suture in hand sewn end-to-end cervical esophagogastric anastomosis after esophageal squamous cell carcinoma resection.

    Science.gov (United States)

    Feng, Fan; Sun, Li; Xu, Guanghui; Hong, Liu; Yang, Jianjun; Cai, Lei; Li, Guocai; Guo, Man; Lian, Xiao; Zhang, Hongwei

    2015-11-01

    Hand sewn cervical esophagogastric anastomosis (CEGA) is regarded as the preferred technique by surgeons after esophagectomy. However, considering anastomotic leakage and stricture, the optimal technique for performing this anastomosis is still under debate. Between November 2010 and September 2012, 230 patients who underwent esophagectomy with hand sewn end-to-end (ETE) CEGA for esophageal squamous cell carcinoma (ESCC) were analyzed retrospectively, including 111 patients who underwent Albert-Lembert suture anastomosis and 119 patients who underwent hybrid-layered suture anastomosis. Anastomosis construction time was recorded during the operation. Anastomotic leakage was recorded through upper gastrointestinal water-soluble contrast examination. Anastomotic stricture was recorded during follow-up. The hybrid-layered suture was faster to construct than the Albert-Lembert suture (29.40±1.24 min vs. 33.83±1.41 min, P=0.02). The overall anastomotic leak rate was 7.82%; the leak rate in the hybrid-layered suture group was significantly lower than that in the Albert-Lembert suture group (3.36% vs. 12.61%, P=0.01). The overall anastomotic stricture rate was 9.13%; the stricture rate in the hybrid-layered suture group was significantly lower than that in the Albert-Lembert suture group (5.04% vs. 13.51%, P=0.04). Hand sewn ETE CEGA with hybrid-layered suture is associated with lower anastomotic leakage and stricture rates compared to hand sewn ETE CEGA with Albert-Lembert suture.

  15. Stapled side-to-side anastomosis might be better than handsewn end-to-end anastomosis in ileocolic resection for Crohn's disease: a meta-analysis.

    Science.gov (United States)

    He, Xiaosheng; Chen, Zexian; Huang, Juanni; Lian, Lei; Rouniyar, Santosh; Wu, Xiaojian; Lan, Ping

    2014-07-01

    Ileocolic anastomosis is an essential step in the treatment to restore continuity of the gastrointestinal tract following ileocolic resection in patients with Crohn's disease (CD). However, the association between anastomotic type and surgical outcome is controversial. The aim of this meta-analysis is to compare surgical outcomes between stapled side-to-side anastomosis (SSSA) and handsewn end-to-end anastomosis (HEEA) after ileocolic resection in patients with CD. Studies comparing SSSA with HEEA after ileocolic resection in patients with CD were identified in PubMed and EMBASE. Outcomes such as complication, recurrence, and re-operation were evaluated. Eight studies (three randomized controlled trials, one prospective non-randomized trial, and four non-randomized retrospective trials) comparing SSSA (396 cases) and HEEA (425 cases) were included. As compared with HEEA, SSSA was superior in terms of overall postoperative complications [odds ratio (OR), 0.54; 95 % confidence interval (CI) 0.32-0.93], anastomotic leak (OR 0.45; 95 % CI 0.20-1.00), recurrence (OR 0.20; 95 % CI 0.07-0.55), and re-operation for recurrence (OR 0.18; 95 % CI 0.07-0.45). Postoperative hospital stay, mortality, and complications other than anastomotic leak were comparable. Based on the results of our meta-analysis, SSSA would appear to be the preferred procedure after ileocolic resection for CD, with reduced overall postoperative complications, especially anastomotic leak, and a decreased recurrence and re-operation rate.
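    The effect measures pooled in the meta-analysis above are odds ratios from 2x2 event tables. A minimal sketch of the OR and its 95% confidence interval for a single table (the counts below are hypothetical, not the study's data):

```python
import math

# Sketch: odds ratio and 95% CI for one 2x2 table, the per-study effect
# measure pooled in a meta-analysis. Counts are hypothetical.

def odds_ratio_ci(a, b, c, d):
    """a/b: events/non-events in group 1; c/d: events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical: 10 leaks / 386 no-leak after SSSA vs. 40 / 385 after HEEA
or_, lo, hi = odds_ratio_ci(10, 386, 40, 385)
```

    An OR below 1 with a CI excluding 1 favors the first group, which is how results such as OR 0.20 (95% CI 0.07-0.55) for recurrence are read above.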

  16. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, a large scale, high speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.

  17. Healing of esophageal anastomoses performed with the biofragmentable anastomosis ring versus the end-to-end anastomosis stapler: comparative experimental study in dogs.

    Science.gov (United States)

    Kovács, Tibor; Köves, István; Orosz, Zsolt; Németh, Tibor; Pandi, Erzsébet; Kralovanszky, Judit

    2003-04-01

    The biofragmentable anastomosis ring (BAR) has been used successfully for anastomoses from the stomach to the upper rectum. The healing of intrathoracic esophageal anastomoses performed with the BAR or an end-to-end anastomosis (EEA) stapler on an experimental model was compared. Parameters of tissue repair were evaluated: macroscopic examination, bursting strength (BS), collagen (hydroxyproline, or HP), histology (H&E and Picrosirius red staining for collagen). A series of 48 mongrel dogs were randomly separated into two groups (30 BAR, 18 stapler) and subgroups according to the time of autopsy (days 4, 7, 14, 28). Mortality was 13.3% (4 BAR cases) with two deaths not related to surgery (excluded). There were four leaks in the BAR group (14.3%) and no leaks or deaths but two strictures in the stapler group. BS was significantly higher in the BAR group during the first week, and values were almost equal from the second week with both methods. The HP rate was significantly reduced on days 4 and 7 in both groups compared to the reference values; the values were close to reference values from the second week (lower in the BAR group). Stapled anastomoses caused less pronounced inflammation and were associated with an earlier start of regeneration, but the difference was not significant compared to that in the BAR group. Accumulation of new collagen (green polarization) started on day 7 in both groups, but maturation (orange-red polarization) was significantly more advanced in the BAR group after the second week. A strong linear correlation between the BS and HP rate was found with both methods. There was no significant difference in the complication rate or healing of intrathoracic BAR and stapled anastomoses. The BAR method is simple, quick, and safe; and it seems to be a feasible procedure for creating intrathoracic esophageal anastomoses in dogs.

  18. End-to-end process of hollow spacecraft structures with high frequency and low mass obtained with in-house structural optimization tool and additive manufacturing

    Directory of Open Access Journals (Sweden)

    Alexandru-Mihai CISMILIANU

    2017-09-01

    In the space sector the most decisive elements are mass reduction, cost saving, and minimum lead time; here, structural optimization and additive layer manufacturing (ALM) fit best. The design must be driven by stiffness, because an important requirement for spacecraft (S/C) structures is to reduce the dynamic coupling between the S/C and the launch vehicle. The objective is to create an end-to-end process, from the input given by the customer to the manufacturing of an aluminum part that is as light as possible yet considerably stiffer, while taking full advantage of the design flexibility given by ALM. To design and optimize the parts, a specialized in-house tool was used, guaranteeing a load-sufficient material distribution. Using topological optimization, the iterations between the design and stress departments were diminished, greatly reducing the lead time. In order to improve and lighten the obtained structure, a design with internal cavities and hollow beams was considered. This implied developing a procedure for powder evacuation through iterations with the manufacturer while optimizing the design for ALM. The resulting part can then be manufactured via ALM with no need for further design adjustments. To achieve a high-quality part with maximum efficiency, it is essential to have a loop between the design team and the manufacturer. Topological optimization and ALM work hand in hand if used properly. The team achieved a more efficient structure using topology optimization and ALM than using conventional design and manufacturing methods.

  19. Automated Detection of Clinically Significant Prostate Cancer in mp-MRI Images Based on an End-to-End Deep Neural Network.

    Science.gov (United States)

    Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting

    2018-05-01

    Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parameter magnetic resonance images (mp-MRI) are in high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of the other steps. As a result, they can either involve unnecessary computational cost or suffer from errors accumulated over the steps. In this paper, we present an automated CS PCa detection system in which all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed to optimize all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors on lesion locations. Compared with most existing systems, which require supervised labels such as manual delineation of PCa lesions, it is much more convenient for clinical usage. Comprehensive evaluation based on fivefold cross validation using data from 360 patients demonstrates that our system achieves high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.

  20. Combined fishing and climate forcing in the southern Benguela upwelling ecosystem: an end-to-end modelling approach reveals dampened effects.

    Directory of Open Access Journals (Sweden)

    Morgane Travers-Trolet

    The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem built by coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress were tested in combination with scenarios of fishing pressure on top-predator fish. Analyses of the isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain, whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top-predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears both structurally important in the trophic functioning of the ecosystem and very sensitive to climate and fishing pressures. In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects

  1. A novel PON based UMTS broadband wireless access network architecture with an algorithm to guarantee end to end QoS

    Science.gov (United States)

    Sana, Ajaz; Hussain, Shahab; Ali, Mohammed A.; Ahmed, Samir

    2007-09-01

In this paper we propose a novel Passive Optical Network (PON) based broadband wireless access network architecture to provide multimedia services (video telephony, video streaming, mobile TV, mobile e-mail, etc.) to mobile users. In conventional wireless access networks, the base stations (Node Bs) and Radio Network Controllers (RNCs) are connected by point-to-point T1/E1 lines (the Iub interface). The T1/E1 lines are expensive and add to operating costs. Moreover, the resources (transceivers and T1/E1 lines) are dimensioned for peak-hour traffic, so most of the time these dedicated resources sit idle and are wasted. Furthermore, T1/E1 lines cannot support the bandwidth (BW) required by the next-generation wireless multimedia services proposed in High Speed Packet Access (HSPA, Rel. 5) for the Universal Mobile Telecommunications System (UMTS) and Evolution-Data Only (EV-DO) for Code Division Multiple Access 2000 (CDMA2000). The proposed PON-based backhaul can provide gigabit data rates, and the Iub interface can be dynamically shared among Node Bs: BW is allocated dynamically, and unused BW from lightly loaded Node Bs is assigned to heavily loaded ones. We also propose a novel algorithm to provide end-to-end Quality of Service (QoS) between the RNC and the user equipment. The algorithm provides QoS bounds in the wired domain as well as in the wireless domain, with compensation for wireless link errors. Because of the air interface, there are times when the user equipment (UE) is unable to communicate with the Node B (usually referred to as a link error), and such link errors are bursty and location dependent. In the proposed approach, the scheduler at the Node B maps QoS priorities and weights into the wireless MAC. Compensation for errored links is provided by swapping service between the active users; the user data is divided into flows, with flows allowed to lag or lead. The algorithm guarantees (1) delay and throughput for error-free flows and (2) short-term fairness.
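The lag/lead compensation described above can be sketched as a toy scheduler (the `Flow` class, weights, and quantum below are illustrative assumptions, not the paper's implementation): each slot, every flow earns its weighted fair share of service, errored flows swap their slot to an error-free flow, and a recovered flow is repaid until its accumulated lag is spent.

```python
class Flow:
    def __init__(self, name, weight):
        self.name = name
        self.weight = weight
        self.lag = 0.0  # > 0: flow is owed service; < 0: flow ran ahead

def schedule(flows, link_ok, quantum=1.0):
    """Pick the flow to serve this slot. Flows whose wireless link is in
    error swap their slot to an error-free flow and accumulate lag."""
    # Prefer the error-free flow with the largest weight-normalized lag.
    eligible = [f for f in flows if link_ok[f.name]]
    if not eligible:
        return None
    chosen = max(eligible, key=lambda f: f.lag / f.weight)
    total_weight = sum(f.weight for f in flows)
    for f in flows:
        f.lag += quantum * f.weight / total_weight  # every flow earns its share
    chosen.lag -= quantum                           # the served flow spends it
    return chosen
```

With two equal-weight flows where B's link is in error for four slots, this sketch serves A throughout the outage and then serves B for four consecutive slots until B's lag is repaid.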

  2. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L [Clinica Luganese, Radiotherapy Center, Lugano (Switzerland)

    2015-06-15

Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from a butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty of the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm’s ability to find the absolute position of known translations/rotations of the pig head. The algorithm’s ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, with a maximum difference of 0.3 mm and 0.4 deg observed across the axes. E2E absolute position differences measured 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have sub-millimeter overall accuracy for intracranial radiosurgery.
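Values such as 0.03 ± 0.21 mm are mean ± sample SD over repeated offset measurements, with maxima reported separately; a minimal sketch of that per-axis reduction (illustrative only, not the authors' analysis code):

```python
from statistics import mean, stdev

def offset_stats(measured_mm, planned_mm):
    """Per-axis registration/E2E statistics: mean, sample SD, and maximum
    absolute difference between measured and planned positions."""
    diffs = [m - p for m, p in zip(measured_mm, planned_mm)]
    return mean(diffs), stdev(diffs), max(abs(d) for d in diffs)
```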

  3. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    International Nuclear Information System (INIS)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-01-01

Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from a butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty of the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm’s ability to find the absolute position of known translations/rotations of the pig head. The algorithm’s ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, with a maximum difference of 0.3 mm and 0.4 deg observed across the axes. E2E absolute position differences measured 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have sub-millimeter overall accuracy for intracranial radiosurgery.

  4. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S; Nguyen, N; Liu, F; Huang, Y [Rhode Island Hospital / Warren Alpert Medical, Providence, RI (United States); Sio, T [Mayo Clinic, Rochester, MN (United States); Jung, J [East Carolina University, Greenville, North Carolina (United States); Pyakuryal, A [UniversityIllinois at Chicago, Chicago, IL (United States); Jang, S [Princeton Radiation Oncology Ctr., Jamesburg, NJ (United States)

    2014-06-01

Purpose: To combine and integrate quality assurance (QA) of target localization with an end-to-end (E2E) radiation-isocenter test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation-isocenter congruence, which is one step ahead of the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for the CyberKnife E2E test was irradiated at a spherical target that was visible in CT-sim images. The CT-sim was performed with a 1-mm slice thickness using a helical scanning technique. The sphere was 3 cm in diameter and was contoured as the target volume using iPlan v4.5.2. An MLC-based conformal arc plan with 7 fields was generated, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were inserted perpendicularly into the cube that holds the sphere. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. To use ExacTrac, an infrared head array was used to correlate the orthogonal X-ray images. Results: The phantom was positioned using the orthogonal X-rays of ExacTrac. For each field, the phantom was checked again with X-rays and re-positioned if necessary. After each ExacTrac setup, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac tested image localization and radiation isocenter separately. This new method can be used for periodic QA procedures.
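The total combined error reported above folds the per-axis film deviations into a single vector magnitude; assuming the usual quadrature combination (the abstract does not state the exact formula), a sketch:

```python
import math

def combined_isocenter_error(dx_mm, dy_mm, dz_mm):
    """3D radiation-isocenter offset as the Euclidean norm of the per-axis
    deviations (superior-inferior, left-right, anterior-posterior)."""
    return math.sqrt(dx_mm ** 2 + dy_mm ** 2 + dz_mm ** 2)
```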

  5. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e., SensorML and SWE services) should be supported. Perhaps most importantly, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in any way that best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  6. WE-DE-BRA-11: A Study of Motion Tracking Accuracy of Robotic Radiosurgery Using a Novel CCD Camera Based End-To-End Test System

    Energy Technology Data Exchange (ETDEWEB)

    Wang, L; M Yang, Y [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States); Nelson, B [Logos Systems Intl, Scotts Valley, CA (United States)

    2016-06-15

Purpose: A novel end-to-end test system using a CCD camera and a scintillator-based phantom (XRV-124, Logos Systems Int’l), capable of measuring the beam-by-beam delivery accuracy of robotic radiosurgery (CyberKnife), was developed and reported in our previous work. This work investigates its application in assessing the motion tracking (Synchrony) accuracy of CyberKnife. Methods: A QA plan with anterior and lateral beams (with 4 different collimator sizes) was created (Multiplan v5.3) for the XRV-124 phantom. The phantom was placed on a motion platform (superior-inferior movement), and the plans were delivered on the CyberKnife M6 system using four motion patterns: static, sine wave, sine with a 15° phase shift, and a patient breathing pattern composed of 2 cm maximum motion with a 4-second breathing cycle. Under integral recording mode, the time-averaged beam vectors (X, Y, Z) were measured by the phantom and compared with static delivery. In dynamic recording mode, the beam spots were recorded at a rate of 10 frames/second, and the beam vector deviation from the average position was evaluated against the various breathing patterns. Results: The average beam positions of the six deliveries with no motion and three deliveries with Synchrony tracking on ideal motion (sine wave without phase shift) all agree within −0.03±0.00 mm, 0.10±0.04 mm, and 0.04±0.03 mm in the X, Y, and Z directions. Radiation beam width (FWHM) variations are within ±0.03 mm. Dynamic recording showed sub-millimeter tracking stability for both regular and irregular breathing patterns; however, tracking errors up to 3.5 mm were observed when a 15° phase shift was introduced. Conclusion: The XRV-124 system is able to provide 3D and 4D targeting accuracy for CyberKnife delivery with Synchrony. The experimental results showed sub-millimeter delivery in phantom with excellent correlation between target and breathing motion; the accuracy degraded when irregular motion and a phase shift were introduced.
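The degradation under a 15° phase shift is consistent with simple kinematics: a tracker following A·sin(ωt − φ) behind a target at A·sin(ωt) incurs a peak error of 2A·sin(φ/2). A sketch of this assumed model (not Logos' analysis) for the 2-cm (A = 10 mm) motion pattern:

```python
import math

def peak_tracking_error(amplitude_mm, phase_shift_deg, n=10000):
    """Max |target - tracker| over one period for sinusoidal motion tracked
    with a pure phase lag; analytically this is 2*A*sin(phi/2)."""
    phi = math.radians(phase_shift_deg)
    return max(
        abs(amplitude_mm * math.sin(t) - amplitude_mm * math.sin(t - phi))
        for t in (2 * math.pi * k / n for k in range(n))
    )
```

For A = 10 mm and φ = 15° this gives roughly 2.6 mm, the same order of magnitude as the up-to-3.5-mm errors observed with the phase-shifted pattern.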

  7. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...
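PAM-4 carries 2 bits per symbol, which is how 10-Gb/s-class (bandwidth-limited) components can reach 40 Gb/s at a 20-GBd symbol rate. A sketch of a common Gray-coded level mapping (the paper's exact mapping and level values are assumptions here):

```python
# Gray-coded PAM-4 mapping: adjacent levels differ in one bit, so a
# single-level decision error corrupts only one bit.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def pam4_modulate(bits):
    """Map a bit stream (even length) to PAM-4 amplitude levels."""
    assert len(bits) % 2 == 0
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_symbol_rate(bit_rate):
    """PAM-4 symbol rate: 2 bits per symbol."""
    return bit_rate / 2
```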

  8. Treatment of a partially thrombosed giant aneurysm of the vertebral artery by aneurysm trapping and direct vertebral artery-posterior inferior cerebellar artery end-to-end anastomosis: technical case report.

    Science.gov (United States)

    Benes, Ludwig; Kappus, Christoph; Sure, Ulrich; Bertalanffy, Helmut

    2006-07-01

The purpose of this article is to focus for the first time on the operative management of a direct vertebral artery (VA)-posterior inferior cerebellar artery (PICA) end-to-end anastomosis in a partially thrombosed giant VA-PICA-complex aneurysm and to underline its usefulness as an additional treatment option. The operative technique of a direct VA-PICA end-to-end anastomosis is described in detail. The VA entered the large aneurysm sac; distally, the PICA originated from the aneurysm sac-VA complex. The donor and recipient vessels were cut close to the aneurysm. Whereas the VA was cut in a straight manner, the PICA was cut at an oblique 45-degree angle to enlarge the vascular end diameter. The vessel ends were flushed with heparinized saline and sutured. The thrombotic material inside the aneurysm sac was removed and the distal VA clipped, leaving the anterior spinal artery and brainstem perforators free. The patient regained consciousness without additional morbidity. Magnetic resonance imaging scans revealed a completely decompressed brainstem without infarction, and the postoperative angiograms demonstrated good filling of the anastomosed PICA. Despite the caliber mismatch of these two vessels, the direct VA-PICA end-to-end anastomosis provides an accurate alternative in addition to other anastomoses and bypass techniques, when donor and recipient vessels are suitable and medullary perforators do not have to be disrupted.

  9. SIP end to end performance metrics

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2012-01-01

The paper deals with a SIP performance testing methodology. The main contribution to the field of performance testing of SIP infrastructure consists in the ability to perform standardized stress tests with the developed SIP TesterApp without deeper knowledge of SIP communication. The developed tool exploits several open-source applications, such as jQuery, Python, JSON and the cornerstone SIP generator SIPp; the result is highly modifiable and the ...
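A tester like this typically reduces raw SIP timestamps to standardized metrics; as an illustration (the event-tuple format is an assumption), RFC 6076 defines Session Request Delay (SRD) as the time from an INVITE to the first non-100 response for the same call:

```python
def session_request_delay(events):
    """RFC 6076-style SRD per call: time from INVITE to the first non-100
    response with the same Call-ID. `events` is a list of
    (timestamp_s, call_id, message) tuples in arrival order."""
    invite_ts, srd = {}, {}
    for ts, call_id, msg in events:
        if msg == "INVITE":
            invite_ts.setdefault(call_id, ts)  # keep the first INVITE time
        elif msg != "100 Trying" and call_id in invite_ts and call_id not in srd:
            srd[call_id] = ts - invite_ts[call_id]
    return srd
```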

  10. CASTOR end-to-end monitoring

    International Nuclear Information System (INIS)

    Rekatsinas, Theodoros; Duellmann, Dirk; Pokorski, Witold; Ponce, Sebastien; Rabacal, Bartolomeu; Waldron, Dennis; Wojcieszuk, Jacek

    2010-01-01

With the start of the Large Hadron Collider approaching, storage and management of raw event data, as well as reconstruction and analysis data, is of crucial importance for researchers. The CERN Advanced STORage system (CASTOR) is a hierarchical storage system developed at CERN, used to store physics production files and user files. CASTOR, as one of the essential software tools used by the LHC experiments, has to provide reliable services for storing and managing data. Monitoring of this complicated system is mandatory in order to assure its stable operation and improve its future performance. This paper presents the new monitoring system of CASTOR, which provides operation- and user-request-specific metrics. The system is built around a dedicated, optimized database schema. The schema is populated by PL/SQL procedures, which process a stream of incoming raw metadata from different CASTOR components, initially collected by the Distributed Logging Facility (DLF). A web interface has been developed for the visualization of the monitoring data; the different histograms and plots are created using PHP scripts which query the monitoring database.
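The pipeline described — raw per-request metadata streamed into a database, then aggregated into metrics by stored procedures — can be miniaturized with an in-memory SQLite database (the schema and names here are invented for illustration; CASTOR itself uses an Oracle schema populated by PL/SQL):

```python
import sqlite3

# Toy monitoring pipeline: raw per-request metadata streams in, and an
# aggregation query reduces it to per-component metrics.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_events (component TEXT, latency_ms REAL)")

def ingest(component, latency_ms):
    """Store one raw metadata record (stand-in for the DLF stream)."""
    con.execute("INSERT INTO raw_events VALUES (?, ?)", (component, latency_ms))

def component_metrics():
    """Aggregate raw events into (component, request count, mean latency)."""
    return con.execute(
        "SELECT component, COUNT(*), AVG(latency_ms) "
        "FROM raw_events GROUP BY component ORDER BY component"
    ).fetchall()
```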

  11. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum, as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significantly more electrical power. This paper stresses the importance...

  12. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  13. Safety and efficacy of the NiTi Shape Memory Compression Anastomosis Ring (CAR/ColonRing) for end-to-end compression anastomosis in anterior resection or low anterior resection.

    Science.gov (United States)

    Kang, Jeonghyun; Park, Min Geun; Hur, Hyuk; Min, Byung Soh; Lee, Kang Young; Kim, Nam Kyu

    2013-04-01

    Compression anastomoses may represent an improvement over traditional hand-sewn or stapled techniques. This prospective exploratory study aimed to assess the efficacy and complication rates in patients undergoing anterior resection (AR) or low anterior resection (LAR) anastomosed with a novel end-to-end compression anastomosis ring, the ColonRing. In all, 20 patients (13 male) undergoing AR or LAR were enrolled to be anastomosed using the NiTi Shape Memory End-to-End Compression Anastomosis Ring (NiTi Medical Technologies Ltd, Netanya, Israel). Demographic, intraoperative, and postoperative data were collected. Patients underwent AR (11/20) or LAR using laparoscopy (75%), robotic (10%) surgery, or an open laparotomy (15%) approach, with a median anastomotic level of 14.5 cm (range, 4-25 cm). Defunctioning loop ileostomies were formed in 6 patients for low anastomoses. Surgeons rated the ColonRing device as either easy or very easy to use. One patient developed an anastomotic leakage in the early postoperative period; there were no late postoperative complications. Mean time to passage of first flatus and commencement of oral fluids was 2.5 days and 3.2 days, respectively. Average hospital stay was 12.6 days (range, 8-23 days). Finally, the device was expelled on average 15.3 days postoperatively without difficulty. This is the first study reporting results in a significant number of LAR patients and the first reported experience from South Korea; it shows that the compression technique is surgically feasible, easy to use, and without significant complication rates. A large randomized controlled trial is warranted to investigate the benefits of the ColonRing over traditional stapling techniques.

  14. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    Science.gov (United States)

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staples. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture through an electronic colonoscopy after 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative in colorectal anastomosis in Chinese colorectal cancer patients. PMID:27276395

  15. Rapid Preliminary Design of Interplanetary Trajectories Using the Evolutionary Mission Trajectory Generator

    Science.gov (United States)

    Englander, Jacob

    2016-01-01

    Preliminary design of interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed, and in some cases the final destination. In addition, a time-history of control variables must be chosen that defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a hybrid optimal control problem. The method is demonstrated on notional high-thrust chemical and low-thrust electric propulsion missions. In the low-thrust case, the hybrid optimal control problem is augmented to include systems design optimization.
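The hybrid optimal control formulation pairs a discrete outer loop (the number and sequence of flybys) with a continuous inner loop (the control time-history). A toy sketch with an invented one-parameter cost model (EMTG's actual inner loop is a full trajectory optimization, not a grid search):

```python
import itertools

BODIES = ["Venus", "Earth", "Mars"]  # candidate flyby bodies (illustrative)

def inner_cost(sequence, n_grid=101):
    """Stand-in for the continuous inner problem: grid search over a single
    control parameter u in [0, 1]."""
    def cost(seq, u):
        # Invented smooth cost: favors short sequences and u near 0.3.
        return len(seq) + (u - 0.3) ** 2 + 0.01 * sum(len(b) for b in seq)
    return min(cost(sequence, k / (n_grid - 1)) for k in range(n_grid))

def outer_loop(max_flybys=2):
    """Discrete outer loop: enumerate candidate flyby sequences and keep the
    one whose inner-loop cost is lowest."""
    best_seq, best_cost = None, float("inf")
    for n in range(1, max_flybys + 1):
        for seq in itertools.product(BODIES, repeat=n):
            c = inner_cost(seq)
            if c < best_cost:
                best_seq, best_cost = seq, c
    return best_seq, best_cost
```

In a real tool the outer loop is evolutionary (a genetic or similar search) rather than exhaustive, since the discrete space grows combinatorially with the number of flybys.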

  16. Rearrangement of potassium ions and Kv1.1/Kv1.2 potassium channels in regenerating axons following end-to-end neurorrhaphy: ionic images from TOF-SIMS.

    Science.gov (United States)

    Liu, Chiung-Hui; Chang, Hung-Ming; Wu, Tsung-Huan; Chen, Li-You; Yang, Yin-Shuo; Tseng, To-Jung; Liao, Wen-Chieh

    2017-10-01

The voltage-gated potassium channels Kv1.1 and Kv1.2 that cluster at juxtaparanodal (JXP) regions are essential in the regulation of nerve excitability and play a critical role in axonal conduction. When demyelination occurs, Kv1.1/Kv1.2 activity increases, suppressing the membrane potential nearly to the equilibrium potential of K+, which results in an axonal conduction blockade. The recovery of K+-dependent communication signals and proper clustering of Kv1.1/Kv1.2 channels at JXP regions may directly reflect nerve regeneration following peripheral nerve injury. However, little is known about potassium channel expression and its relationship with the dynamic potassium ion distribution at the node of Ranvier during the regenerative process after peripheral nerve injury (PNI). In the present study, end-to-end neurorrhaphy (EEN) was performed using an in vivo model of PNI. The distribution of K+ in regenerating axons following EEN was detected by time-of-flight secondary-ion mass spectrometry. The specific localization and expression of Kv1.1/Kv1.2 channels were examined by confocal microscopy and western blotting. Our data showed that the re-establishment of K+ distribution and intensity was correlated with the functional recovery of compound muscle action potential morphology in EEN rats. Furthermore, the re-clustering of Kv1.1/Kv1.2 channels at 1 and 3 months after EEN at the nodal region of the regenerating nerve corresponded to changes in the K+ distribution. This study provided direct evidence of K+ distribution in regenerating axons for the first time. We propose that the Kv1.1/Kv1.2 channels re-clustered at the JXP regions of regenerating axons are essential for modulating the proper patterns of K+ distribution in axons and for maintaining membrane potential stability after EEN.

  17. SU-F-J-150: Development of An End-To-End Chain Test for the First-In-Man MR-Guided Treatments with the MRI Linear Accelerator by Using the Alderson Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Hoogcarspel, S; Kerkmeijer, L; Lagendijk, J; Van Vulpen, M; Raaymakers, B [University Medical Center Utrecht, Utrecht, Utrecht (Netherlands)

    2016-06-15

The Alderson phantom is a human-shaped quality assurance tool that has been used for over 30 years in radiotherapy. The phantom can provide integrated tests of the entire chain of treatment planning and delivery. The purpose of this research was to investigate whether this phantom can be used to chain-test a treatment on the MRI linear accelerator (MRL) currently being developed at the UMC Utrecht in collaboration with Elekta and Philips; this was demonstrated by chain-testing the future First-in-Man treatments with this system. An Alderson phantom was used to chain-test an entire treatment with the MRL. First, a CT of the phantom was acquired with additional markers that are visible on both MR and CT. A treatment plan for treating bone metastases in the sacrum was made. The phantom was consecutively placed in the MRL. For MRI imaging, a 3D volume was acquired. The initially developed treatment plan was then simulated on the new MRI dataset. For simulation, both the MR and CT data were used by registering them together. Before treatment delivery, an MV image was acquired and compared with a DRR calculated from the MR/CT registration data. Finally, the treatment was delivered. Figure 1 shows both the T1-weighted MR image of the phantom and the CT that was registered to the MR image. Figure 2 shows both the calculated and measured MV images acquired by the MV panel. Figure 3 shows the dose distribution that was simulated. The total elapsed time for the entire procedure, excluding irradiation, was 13:35 minutes. The Alderson phantom yields sufficient MR contrast and can be used for full MR-guided radiotherapy treatment chain testing. As a result, we are able to perform an end-to-end chain test of the future First-in-Man treatments.

  18. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as on single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
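One concrete element of the global-standards ecosystem is the mod-10 check digit every GS1 GTIN carries, which lets scanning software reject corrupted or mistyped codes. The standard GS1 algorithm:

```python
def gtin_check_digit(digits):
    """GS1 mod-10 check digit for the leading digits of a GTIN.
    Works for GTIN-8/12/13/14: weight 3 is applied starting from the
    rightmost digit, alternating 3, 1, 3, 1, ..."""
    total = sum(d * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(digits)))
    return (10 - total % 10) % 10

def gtin_is_valid(gtin):
    """True if the GTIN string's last digit matches its check digit."""
    digits = [int(c) for c in gtin]
    return digits[-1] == gtin_check_digit(digits[:-1])
```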

  19. Changes in maximum muscle strength and rapid muscle force characteristics after long-term special support and reconnaissance missions

    DEFF Research Database (Denmark)

    Christensen, Peter Astrup; Jacobsen, Jacob Ole; Thorlund, Jonas B

    2008-01-01

PURPOSE: The purpose of the present study was to examine the impact of 8 days of immobilization during a Special Support and Reconnaissance mission (SSR) on muscle mass, contraction dynamics, maximum jump height/power, and body composition. METHODS: Unilateral maximal voluntary contraction, rate of force development, and maximal jump height were tested to assess muscle strength/power, along with whole-body impedance analysis, before and after SSR. RESULTS: Body weight, fat-free mass, and total body water decreased (4-5%) after SSR, along with impairments in maximal jump height (-8%) and knee extensor maximal voluntary contraction (-10%). Furthermore, rate of force development was severely affected (-15 to -30%). CONCLUSIONS: Eight days of immobilization during a covert SSR mission by Special Forces soldiers led to substantial decrements in maximal muscle force and especially in rapid muscle force...
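Rate of force development is the slope of the force-time curve over an early window after contraction onset; a minimal sketch (the 0-200 ms window and endpoint-slope method are assumptions, not the study's exact protocol):

```python
def rate_of_force_development(t_s, force_n, window_s=0.2):
    """Average slope (N/s) of the force-time curve over the first
    `window_s` seconds after contraction onset (t = 0)."""
    pts = [(t, f) for t, f in zip(t_s, force_n) if 0 <= t <= window_s]
    (t0, f0), (t1, f1) = pts[0], pts[-1]
    return (f1 - f0) / (t1 - t0)
```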

  20. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNEs) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability that covers the range of wave propagation required for nuclear explosion monitoring (NEM), from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak
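The one-way coupling idea — handing near-source motions to an anelastic finite-difference code — can be caricatured in one dimension. This toy leapfrog scheme (not GEODYN or WPP) solves u_tt = c²·u_xx at Courant number C = c·dt/dx = 1, for which a right-going pulse translates exactly one cell per step:

```python
import math

def propagate_pulse(n=200, steps=80, src=50, width=4.0):
    """Second-order finite-difference solution of u_tt = c^2 u_xx on a 1D
    grid with fixed (zero) ends, initialized with a right-going Gaussian."""
    def pulse(x):
        return math.exp(-(x / width) ** 2)
    u = [pulse(i - src) for i in range(n)]           # u at t = 0
    u_prev = [pulse(i - src + 1) for i in range(n)]  # u at t = -dt (right-going)
    for _ in range(steps):
        u_next = [0.0] * n
        for i in range(1, n - 1):
            # With C = 1 the update reduces to: u[i+1] + u[i-1] - u_prev[i]
            u_next[i] = u[i + 1] + u[i - 1] - u_prev[i]
        u_prev, u = u, u_next
    return u
```

After 80 steps the pulse center has moved from cell 50 to cell 130, a minimal check that injected near-source motion propagates at the intended speed.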

  1. Mission operations management

    Science.gov (United States)

    Rocco, David A.

    1994-01-01

Redefining the approach and philosophy that operations management uses to define, develop, and implement space missions will be a central element in achieving high efficiency mission operations for the future. The goal of a cost-effective space operations program cannot be realized if the attitudes and methodologies we currently employ to plan, develop, and manage space missions do not change. A management philosophy that is in sync with the environment in terms of budget, technology, and science objectives must be developed. Changing our basic perception of mission operations will require a shift in the way we view the mission. This requires a transition from current practices of viewing the mission as a unique end product to a 'mission development concept' built on the visualization of the end-to-end mission. To achieve this change we must define realistic mission success criteria and develop pragmatic approaches to achieve our goals. Custom mission development for all but the largest and most unique programs is not practical in the current budget environment, and we simply do not have the resources to implement all of our planned science programs. We need to shift our management focus to allow us the opportunity to make use of methodologies and approaches which are based on common building blocks that can be utilized in the space, ground, and mission-unique segments of all missions.

  2. End-to-End Multi-View Lipreading

    NARCIS (Netherlands)

    Petridis, Stavros; Wang, Yujiang; Li, Zuwei; Pantic, Maja

    2017-01-01

    Non-frontal lip views contain useful information which can be used to enhance the performance of frontal view lipreading. However, the vast majority of recent lipreading works, including the deep learning approaches which significantly outperform traditional approaches, have focused on frontal mouth

  3. End-to-end visual speech recognition with LSTMS

    NARCIS (Netherlands)

    Petridis, Stavros; Li, Zuwei; Pantic, Maja

    2017-01-01

    Traditional visual speech recognition systems consist of two stages, feature extraction and classification. Recently, several deep learning approaches have been presented which automatically extract features from the mouth images and aim to replace the feature extraction stage. However, research on

  4. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

This thesis addresses selected topics of Quality of Service (QoS) provisioning in heterogeneous data networks that construct the communication environment of today's Internet. In the vast range of protocols available in different domains of network infrastructures, a few chosen ones are discussed......, the general UPnPQoS performance was assessed analytically and confirmed by simulation results. The results validate the usability of UPnP-QoS, but some open issues in the specification were identified. As a result of addressing mentioned shortcomings of UPnP-QoS, a few pre-emption algorithms for home gateway...... and discuss also access Passive Optical Network (PON) technologies, a GMPLS controlled Ten Gigabit Passive Optical Network (XGPON) was proposed. This part of the thesis introduces the possibility of managing the XG-PON by the GMPLS suite, showing again that this protocol suite is a good candidate...

  5. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  6. End-to-End Service Oriented Architectures (SOA) Security Project

    Science.gov (United States)

    2012-02-01

    Java 6.0 (javax.ws) platform and deployed on boston.cs.purdue.edu. TB stores all data regarding sessions and services in a MySQL database, setup on...pointcut designators. JBoss AOP [JBO2] and AspectJ [ASP1] are powerful frameworks that implement AOP for Java programs. Its pointcut designators... hibernate cglib enhanced proxies <attribute name="Ignore">*$$EnhancerByCGLIB$$*</attribute> --> <attribute name="Optimized">true</attribute

  7. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g., Condor, Moab, SGE) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
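The planning step described above (stating goals over independent variables rather than enumerating every task) can be sketched as a cartesian product over the variables. This is a minimal illustration of the idea only; the function and parameter names are assumptions, not LEM's actual API.

```python
from itertools import product

def expand_plan(goal_params):
    """Expand a high-level experiment plan (variable -> list of values)
    into the full set of concrete subexperiment configurations."""
    names = sorted(goal_params)
    return [dict(zip(names, combo))
            for combo in product(*(goal_params[n] for n in names))]

# Example: vary node count and I/O pattern without listing each task by hand.
plan = {"nodes": [64, 128], "io_pattern": ["n-to-1", "n-to-n"]}
subexperiments = expand_plan(plan)
# One subexperiment per (nodes, io_pattern) combination, four in total.
```

A dispatcher would then submit each configuration to the site's scheduler, which is exactly the bookkeeping such a framework hides from the user.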

  8. Using SIM for strong end-to-end Application Authentication

    OpenAIRE

    Lunde, Lars; Wangensteen, Audun

    2006-01-01

    Today the Internet is mostly used for services that require low or none security. The commercial and governmental applications have started to emerge but met problems since they require strong authentication, which is both difficult and costly to realize. The SIM card used in mobile phones is a tamper resistant device that contains strong authentication mechanisms. It would be very convenient and cost-efficient if Internet services could use authentication methods based on the SIM. This mast...

  9. End-to-end simulation: The front end

    International Nuclear Information System (INIS)

    Haber, I.; Bieniosek, F.M.; Celata, C.M.; Friedman, A.; Grote, D.P.; Henestroza, E.; Vay, J.-L.; Bernal, S.; Kishek, R.A.; O'Shea, P.G.; Reiser, M.; Herrmannsfeldt, W.B.

    2002-01-01

    For the intense beams in heavy ion fusion accelerators, details of the beam distribution as it emerges from the source region can determine the beam behavior well downstream. This occurs because collective space-charge modes excited as the beam is born remain undamped for many focusing periods. Traditional studies of the source region in particle beam systems have emphasized the behavior of averaged beam characteristics, such as total current, rms beam size, or emittance, rather than the details of the full beam distribution function that are necessary to predict the excitation of these modes. Simulations of the beam in the source region and comparisons to experimental measurements at LBNL and the University of Maryland are presented to illustrate some of the complexity in beam characteristics that has been uncovered as increased attention has been devoted to developing a detailed understanding of the source region. Also discussed are methods of using the simulations to infer characteristics of the beam distribution that can be difficult to measure directly

  10. Network analysis on skype end-to-end video quality

    NARCIS (Netherlands)

    Exarchakos, Georgios; Druda, Luca; Menkovski, Vlado; Liotta, Antonio

    2015-01-01

Purpose – This paper aims to argue on the efficiency of Quality of Service (QoS)-based adaptive streaming with regards to perceived quality, i.e., Quality of Experience (QoE). Although QoS parameters are extensively used even by high-end adaptive streaming algorithms, the achieved QoE fails to justify their use

  11. End to End Beam Dynamics of the ESS Linac

    DEFF Research Database (Denmark)

    Thomsen, Heine Dølrath

    2012-01-01

    The European Spallation Source, ESS, uses a linear accelerator to deliver a high intensity proton beam to the target station. The nominal beam power on target will be 5 MW at an energy of 2.5 GeV. We briefly describe the individual accelerating structures and transport lines through which we have...

  12. Crew Transportation System Design Reference Missions

    Science.gov (United States)

    Mango, Edward J.

    2015-01-01

Contains summaries of potential design reference mission goals for systems to transport humans to and from low Earth orbit (LEO) for the Commercial Crew Program. The purpose of this document is to describe Design Reference Missions (DRMs) representative of the end-to-end Crew Transportation System (CTS) framework envisioned to successfully execute commercial crew transportation to orbital destinations. The initial CTS architecture will likely be optimized to support NASA crew and NASA-sponsored crew rotation missions to the ISS, but consideration may be given in this design phase to allow for modifications in order to accomplish other commercial missions in the future. With the exception of NASA's mission to the ISS, the remaining commercial DRMs are notional. Any decision to design or scar the CTS for these additional non-NASA missions is completely up to the Commercial Provider. As NASA's mission needs evolve over time, this document will be periodically updated to reflect those needs.

  13. Smashing the Stovepipe: Leveraging the GMSEC Open Architecture and Advanced IT Automation to Rapidly Prototype, Develop and Deploy Next-Generation Multi-Mission Ground Systems

    Science.gov (United States)

    Swenson, Paul

    2017-01-01

Satellite/payload ground systems are typically highly customized to a specific mission's use cases and rely on hundreds (or thousands) of specialized point-to-point interfaces for data flows and file transfers. Documenting and tracking these complex interfaces requires extensive time and extremely high staffing costs; implementing and testing them is even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT security, information assurance, and operational security have become key ground system architecture drivers. New Federal security-related directives are generated on a daily basis, imposing new requirements on current and existing ground systems, and these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources, i.e., operational/sysadmin staff, IT security baselines, architecture decisions, or even networks and hosting infrastructure. Refactoring these existing ground systems into multi-mission assets proves extremely challenging due to what is typically very tight coupling between legacy components; as a result, many "multi-mission" operations environments end up simply sharing compute resources and networks due to the difficulty of refactoring into true multi-mission systems. Utilizing continuous integration and rapid system deployment technologies in conjunction with an open-architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and configuration of systems. GMSEC messaging is inherently designed to support multi-mission requirements, and

  14. An integrated radar model solution for mission level performance and cost trades

    Science.gov (United States)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR and D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using a commercial-off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.
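The digital interconnection described above, in which each component model's outputs feed the next through standard interfaces, can be illustrated with a minimal sketch. The models, parameters, and toy relations below are invented for illustration and bear no relation to NGMS's actual ModelCenter models.

```python
# Minimal sketch of chaining component models through a shared dict-based
# interface, so an optimizer or trade-study driver can treat the chain as
# one end-to-end system model. All relations here are hypothetical.

def rf_model(state):
    # Toy relation: aperture gain grows with element count.
    state["gain_db"] = 10 + 3 * state["n_elements"] // 256
    return state

def power_model(state):
    # Toy relation: power draw scales linearly with element count.
    state["power_w"] = 0.5 * state["n_elements"]
    return state

def cost_model(state):
    # Toy relation: cost combines hardware count and power provisioning.
    state["cost_k"] = 2.0 * state["n_elements"] + 0.1 * state["power_w"]
    return state

def run_pipeline(state, models):
    """Push one design point through every component model in order."""
    for model in models:
        state = model(state)
    return state

design = run_pipeline({"n_elements": 1024}, [rf_model, power_model, cost_model])
```

Because every model shares the same interface, swapping in a higher-fidelity component or sweeping `n_elements` across a trade space requires no change to the driver, which is the point of unifying previously separate models.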

  15. A Rapid Turn-around, Scalable Big Data Processing Capability for the JPL Airborne Snow Observatory (ASO) Mission

    Science.gov (United States)

    Mattmann, C. A.

    2014-12-01

The JPL Airborne Snow Observatory (ASO) is an integrated LIDAR and spectrometer measuring snow depth and rate of snow melt in the Sierra Nevada, specifically the Tuolumne River Basin, Sierra Nevada, California, above the O'Shaughnessy Dam of the Hetch Hetchy reservoir, and the Uncompahgre Basin, Colorado, amongst other sites. The ASO data was delivered to water resource managers from the California Department of Water Resources in under 24 hours from the time that the Twin Otter aircraft landed in Mammoth Lakes, CA to the time disks were plugged in to the ASO Mobile Compute System (MCS) deployed at the Sierra Nevada Aquatic Research Laboratory (SNARL) near the airport. ASO performed weekly flights, and each flight produced between 500 GB and 1 TB of raw data, which was then processed from level 0 data products all the way to full level 4 maps of Snow Water Equivalent, albedo mosaics, and snow depth from LIDAR. These data were produced by Interactive Data Language (IDL) algorithms which were then unobtrusively and automatically integrated into an Apache OODT and Apache Tika based Big Data processing system. Data movement was both electronic and physical, including novel uses of LaCie 1 and 2 terabyte (TB) data bricks and deployment in rugged terrain. The MCS was controlled remotely from the Jet Propulsion Laboratory, California Institute of Technology (JPL) in Pasadena, California on behalf of the National Aeronautics and Space Administration (NASA). Communication was aided through the use of novel Internet Relay Chat (IRC) command and control mechanisms and through the use of the Notifico open source communication tools. This talk will describe the high-powered, light-weight Big Data processing system that we developed for ASO and its implications more broadly for airborne missions at NASA and throughout the government.
The lessons learned from ASO show the potential to have a large impact in the development of Big Data processing systems in the years

  16. Comparação entre dois fios de sutura não absorvíveis na anastomose traqueal término-terminal em cães Comparison of two nonabsorbable suture materials in the end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

Twelve mongrel dogs, aged between 1 and 6 years and weighing 6 to 20 kg, were submitted to tracheal resection and end-to-end anastomosis, in which braided noncapillary polyester and monofilament nylon suture materials were tested. Six animals, three for each suture material, underwent excision of the equivalent of three tracheal rings. After 15 days a second intervention was performed, in which the equivalent of six more rings was resected, for a total of nine; after another 15 days the animals were sacrificed. The other six animals, three for each suture material, underwent excision of the equivalent of three tracheal rings and were maintained for 43 days. The tracheal anastomoses were evaluated by clinical, radiographic, macroscopic, and histopathologic examination. The monofilament nylon material exhibited less tissue reaction than the braided noncapillary polyester and promoted a secure anastomosis with a lower risk of granuloma formation.

  17. Internet Technology for Future Space Missions

    Science.gov (United States)

    Hennessy, Joseph F. (Technical Monitor); Rash, James; Casasanta, Ralph; Hogie, Keith

    2002-01-01

Ongoing work at the National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC) seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth-orbiting spacecraft with downlink data rates from 300 Kbps to 4 Mbps. Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.

  18. The Nasa-Isro SAR Mission Science Data Products and Processing Workflows

    Science.gov (United States)

    Rosen, P. A.; Agram, P. S.; Lavalle, M.; Cohen, J.; Buckley, S.; Kumar, R.; Misra-Ray, A.; Ramanujam, V.; Agarwal, K. M.

    2017-12-01

The NASA-ISRO SAR (NISAR) Mission is currently in the development phase and in the process of specifying its suite of data products and algorithmic workflows, responding to inputs from the NISAR Science and Applications Team. NISAR will provide raw data (Level 0), full-resolution complex imagery (Level 1), and interferometric and polarimetric image products (Level 2) for the entire data set, in both natural radar and geocoded coordinates. NASA and ISRO are coordinating the formats, meta-data layers, and algorithms for these products, for both the NASA-provided L-band radar and the ISRO-provided S-band radar. Higher level products will also be generated for the purpose of calibration and validation, over large areas of Earth, including tectonic plate boundaries, ice sheets and sea-ice, and areas of ecosystem disturbance and change. This level of comprehensive product generation has been unprecedented for SAR missions in the past, and leads to storage and processing challenges for the production system and the archive center. Further, recognizing the potential to support applications that require low latency product generation and delivery, the NISAR team is optimizing the entire end-to-end ground data system for such response, including exploring the advantages of cloud-based processing, algorithmic acceleration using GPUs, and on-demand processing schemes that minimize computational and transport costs, but allow rapid delivery to science and applications users. This paper will review the current products and workflows, and discuss the scientific and operational trade-space of mission capabilities.

  19. A Lean, Fast Mars Round-trip Mission Architecture: Using Current Technologies for a Human Mission in the 2030s

    Science.gov (United States)

    Bailey, Lora; Folta, David; Barbee, Brent W.; Vaughn, Frank; Kirchman, Frank; Englander, Jacob; Campbell, Bruce; Thronson, Harley; Lin, Tzu Yu

    2013-01-01

We present a lean, fast-transfer architecture concept for a first human mission to Mars that utilizes current technologies and two pivotal parameters: an end-to-end Mars mission duration of approximately one year, and a deep space habitat of approximately 50 metric tons. These parameters were formulated by a 2012 deep space habitat study conducted at the NASA Johnson Space Center (JSC) that focused on a subset of recognized high-engineering-risk factors that may otherwise limit space travel to destinations such as Mars or near-Earth asteroids (NEAs). With these constraints, we model and promote Mars mission opportunities in the 2030s enabled by a combination of on-orbit staging, mission element pre-positioning, and unique round-trip trajectories identified by state-of-the-art astrodynamics algorithms.

  20. Processus ultra-rapides associés à la dynamique d'émission de la protéine GFP (Ultrafast processes associated with the emission dynamics of the GFP protein)

    Science.gov (United States)

    Didier, P.; Guidoni, L.; Schwalbach, G.; Bigot, J.-Y.

    2002-06-01

The GFP (Green Fluorescent Protein) is a very efficient marker that can be used in living systems. Femtosecond spectroscopy is particularly well suited to understanding the emission mechanisms of this protein, given the speed of the transfer processes involved. We present results on the spectro-temporal emission dynamics of the GFPuv mutant, resolved on the scale of a hundred femtoseconds. A Raman transition at 3300 cm^{-1}, as well as the build-up dynamics of the gain with a characteristic time of about 1.5 ps, were identified.

  1. Interplanetary Trajectory Design for the Asteroid Robotic Redirect Mission Alternate Approach Trade Study

    Science.gov (United States)

    Merrill, Raymond Gabriel; Qu, Min; Vavrina, Matthew A.; Englander, Jacob A.; Jones, Christopher A.

    2014-01-01

This paper presents mission performance analysis methods and results for the Asteroid Robotic Redirect Mission (ARRM) option to capture a free-standing boulder on the surface of a 100 m or larger NEA. It details the optimization and design of heliocentric low-thrust trajectories to asteroid targets for the ARRM solar electric propulsion spacecraft. Extensive searches were conducted to determine asteroid targets with large pick-up mass potential and potential observation opportunities. Interplanetary trajectory approximations were developed in method-based tools for Itokawa, Bennu, 1999 JU3, and 2008 EV5 and were validated by end-to-end integrated trajectories.

  2. A Review of New and Developing Technology to Significantly Improve Mars Sample-Return Missions

    Science.gov (United States)

    Carsey, F.; Brophy, J.; Gilmore, M.; Rodgers, D.; Wilcox, B.

    2000-07-01

    A JPL development activity was initiated in FY 1999 for the purpose of examining and evaluating technologies that could materially improve future (i.e., beyond the 2005 launch) Mars sample return missions. The scope of the technology review was comprehensive and end-to-end; the goal was to improve mass, cost, risk, and scientific return. A specific objective was to assess approaches to sample return with only one Earth launch. While the objective of the study was specifically for sample-return, in-situ missions can also benefit from using many of the technologies examined.

  3. Space Network IP Services (SNIS): An Architecture for Supporting Low Earth Orbiting IP Satellite Missions

    Science.gov (United States)

    Israel, David J.

    2005-01-01

    The NASA Space Network (SN) supports a variety of missions using the Tracking and Data Relay Satellite System (TDRSS), which includes ground stations in White Sands, New Mexico and Guam. A Space Network IP Services (SNIS) architecture is being developed to support future users with requirements for end-to-end Internet Protocol (IP) communications. This architecture will support all IP protocols, including Mobile IP, over TDRSS Single Access, Multiple Access, and Demand Access Radio Frequency (RF) links. This paper will describe this architecture and how it can enable Low Earth Orbiting IP satellite missions.

  4. Gas mission; Mission gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

This preliminary report analyses the desirable evolutions of gas transport tariffing and examines some questions relative to the opening of competition on the French gas market. The report is made of two documents: a synthesis of the previous report with some recommendations about the tariffing of gas transport, about the modalities of network access for third parties, and about the dissociation between transport and trade book-keeping activities. The second document is the progress report about the opening of the French gas market. The first part presents the European problem of competition in the gas supply and its consequences on the opening and operation of the French gas market. The second part presents some partial syntheses about each topic of the mission letter of the Ministry of the Economy, Finance and Industry: future evolution of network access tariffs, critical analysis of contractual documents for gas transport and delivery, examination of auxiliary services linked with access to the network (modulation, balancing, conversion), considerations about the handling of network congestion and denied access, and analysis of the metering dissociation between the integrated activities of gas operators. Some documents are attached as appendixes: the mission letter of July 9, 2001, the detailed analysis of the new temporary tariffs of GdF and CFM, the offer of third-party access to methane terminals, the compatibility of a nodal tariffing with the presence of three transport operators (GdF, CFM and GSO), the standard contract for GdF supply, and the standard contract for GdF connection. (J.S.)

  5. NASA Planetary Science Summer School: Preparing the Next Generation of Planetary Mission Leaders

    Science.gov (United States)

    Lowes, L. L.; Budney, C. J.; Sohus, A.; Wheeler, T.; Urban, A.; NASA Planetary Science Summer School Team

    2011-12-01

, during which their mentors aid them in finalizing their mission design and instrument suite, and in making the necessary trade-offs to stay within the cost cap. Tours of JPL facilities highlight the end-to-end life cycle of a mission. At week's end, students present their Concept Study to a "proposal review board" of JPL scientists and engineers and NASA Headquarters executives, who provide feedback on the strengths and weaknesses of their proposal and mission design. A survey of Planetary Science Summer School alumni administered in the summer of 2011 provides information on the program's impact on students' career choices and leadership roles as they pursue employment in planetary science and related fields. Preliminary results will be discussed during the session. Almost a third of the approximately 450 Planetary Science Summer School alumni from the last 10 years of the program are currently employed by NASA or JPL. The Planetary Science Summer School is implemented by the JPL Education Office in partnership with JPL's Team X Project Design Center.

  6. [The mission].

    Science.gov (United States)

    Ruiz Moreno, J; Blanch Mon, A

    2000-01-01

After making a historical review of the concept of the mission statement, evaluating its importance (see Part I), describing the bases for creating a mission statement from a strategic perspective, and analyzing the advantages of this concept, probably more important as a business policy (see Parts I and II), the authors proceed to analyze the mission statement in health organizations. Because a mission statement is lacking in the majority of health organizations, their strategies are not exactly favored; as a consequence, neither are their competitive advantages nor the development of their essential competencies. After presenting a series of mission statements from Anglo-Saxon health organizations, the authors highlight two mission statements from our own social context. The article finishes by suggesting an adequate sequence for developing a mission statement in health organizations with a strategic orientation.

  7. MDP: Reliable File Transfer for Space Missions

    Science.gov (United States)

    Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.
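The NACK-based repair idea behind MDP and NORM, in which the receiver requests only the blocks it is missing, can be sketched at the bookkeeping level. The toy function below illustrates that concept only; it is not the actual MDP/NORM wire protocol, and the block-range encoding is an assumption.

```python
def nack_ranges(total_blocks, received):
    """Collapse the set of missing block ids into (start, end) repair
    requests, as a NACK-based receiver might send back to the sender."""
    missing = sorted(set(range(total_blocks)) - set(received))
    ranges, i = [], 0
    while i < len(missing):
        j = i
        # Extend the run while block ids are consecutive.
        while j + 1 < len(missing) and missing[j + 1] == missing[j] + 1:
            j += 1
        ranges.append((missing[i], missing[j]))
        i = j + 1
    return ranges

# Blocks 3-5 and 9 were lost in transit; only those are re-requested.
print(nack_ranges(10, [0, 1, 2, 6, 7, 8]))  # [(3, 5), (9, 9)]
```

On a noisy, high-delay space link this is attractive because a loss-free transfer generates no feedback at all, and the same received-set bookkeeping naturally supports a checkpoint/resume capability.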

  8. [Myanmar mission].

    Science.gov (United States)

    Alfandari, B; Persichetti, P; Pelissier, P; Martin, D; Baudet, J

    2004-06-01

    The authors report on humanitarian plastic surgery missions performed by a small private-practice team in Yangon, drawing on their three years of experience in Myanmar with 300 consultations and 120 surgical cases. They underline the value of this type of mission and share their reflections on team training, the relationship with the host country, and the composition of the right team.

  9. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are more strict when compared to the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.
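
    The comparison highlighted at the end of the abstract can be made concrete with a small sketch (illustrative numbers; the paper's actual latency conditions involve more than this single comparison):

```python
# The abstract contrasts the direct source-destination SNR with the harmonic
# mean of the source-relay and relay-destination SNRs. This only computes the
# two quantities being compared, on a linear scale.

def harmonic_mean(a, b):
    return 2 * a * b / (a + b)

snr_sd = 4.0        # direct source-destination SNR
snr_sr = 100.0      # source-relay SNR
snr_rd = 100.0      # relay-destination SNR

relay_metric = harmonic_mean(snr_sr, snr_rd)   # 100.0
print(relay_metric > snr_sd)                   # True: relaying looks favourable here
```

    The harmonic mean is dominated by the weaker of the two relay hops, which matches the intuition that a multi-hop route is only as good as its bottleneck link.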

  10. Adaptive end-to-end optimization of mobile video streaming using QoS negotiation

    NARCIS (Netherlands)

    Taal, Jacco R.; Langendoen, Koen; van der Schaaf, Arjen; van Dijk, H.W.; Lagendijk, R. (Inald) L.

    Video streaming over wireless links is a non-trivial problem due to the large and frequent changes in the quality of the underlying radio channel combined with latency constraints. We believe that every layer in a mobile system must be prepared to adapt its behavior to its environment. Thus layers

  11. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    seL4 security verification [18] avoids this issue in the same way. In that work, the authors frame their solution as a restriction that disallows... identical: (σ, σ′1) ∈ TM ∧ (σ, σ′2) ∈ TM ⟹ Ol(σ′1) = Ol(σ′2). The successful security verifications of both seL4 and mCertiKOS provide reasonable... evidence that this restriction on specifications is not a major hindrance for usability. Unlike the seL4 verification, however, our framework runs into a

  12. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    Science.gov (United States)

    2012-08-01

    and verification, from PSOS [NF03] to the recent seL4 [KEH+09]. While they make considerable progress toward high-assurance OS, these works are not... of the specification itself. Examples include the seL4 microkernel work by Klein et al. [KEH+09], which presents the experience of formally proving... David Cock, Philip Derrin, Dhammika Elkaduwe, Kai Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4

  13. Future Wireless Network: MyNET Platform and End-to-End Network Slicing

    OpenAIRE

    Zhang, Hang

    2016-01-01

    Future wireless networks are facing new challenges. These new challenges require new solutions and strategies for network deployment, management, and operation. Many driving factors are decisive in the re-definition and re-design of the future wireless network architecture. In the previously published paper "5G Wireless Network - MyNET and SONAC", MyNET and SONAC, a future network architecture, are described. This paper elaborates on the MyNET platform in more detail. The design principles of ...

  14. The Knowledge Graph for End-to-End Learning on Heterogeneous Knowledge

    NARCIS (Netherlands)

    Wilcke, W.X.; Bloem, P.; de Boer, Viktor

    2018-01-01

    In modern machine learning, raw data is the preferred input for our models. Where a decade ago data scientists were still engineering features, manually picking out the details we thought salient, they now prefer the data in their raw form. As long as we can assume that all relevant and irrelevant

  15. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2018-01-01

    Industry 4.0 refers to the fourth industrial revolution, and introduces modern communication and computation technologies such as 5G, cloud computing and Internet of Things to industrial manufacturing systems. As a result, many devices, machines and applications will rely on connectivity, while...... having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing...

  16. End-to-End Deep Learning Model For Automatic Sleep Staging Using Raw PSG Waveforms

    DEFF Research Database (Denmark)

    Olesen, Alexander Neergaard; Peppard, P. E.; Sorensen, H. B.

    2018-01-01

    Deep learning has seen significant progress over the last few years, especially in computer vision, where competitions such as the ImageNet challenge have been the driving factor behind many new model architectures far superior to humans in image recognition. We propose a novel method for automatic...... accuracy, precision and recall were 84.93%, 97.42% and 97.02%, respectively. Evaluating on the validation set yielded an overall accuracy of 85.07% and overall precision/recall of 98.54% and 95.72%, respectively. Conclusion: Preliminary results indicate that state of the art deep learning models can...... sleep staging, which relies on current advances in computer vision models eliminating the need for feature engineering or other transformations of input data. By exploiting the high capacity for complex learning in a state of the art object recognition model, we can effectively use raw PSG signals...

  17. End-to-End Mechanisms for Rate-Adaptive Multicast Streaming over the Internet

    OpenAIRE

    Rimac, Ivica

    2005-01-01

    Continuous media applications over packet-switched networks are becoming more and more popular. Radio stations, for example, already use streaming technology to disseminate their content to users on the Internet, and video streaming services are expected to experience similar popularity. In contrast to traditional television and radio broadcast systems, however, prevalent Internet streaming solutions are based on unicast communication and raise scalability and efficiency issues. Multicast com...

  18. An end-to-end security auditing approach for service oriented architectures

    NARCIS (Netherlands)

    Azarmi, M.; Bhargava, B.; Angin, P.; Ranchal, R.; Ahmed, N.; Sinclair, A.; Linderman, M.; Ben Othmane, L.

    2012-01-01

    Service-Oriented Architecture (SOA) is becoming a major paradigm for distributed application development in the recent explosion of Internet services and cloud computing. However, SOA introduces new security challenges not present in the single-hop client-server architectures due to the involvement

  19. Enhancing end-to-end QoS for multimedia streaming in IMS-based networks

    NARCIS (Netherlands)

    Ozcelebi, T.; Radovanovic, I.; Chaudron, M.R.V.

    2007-01-01

    Convergence of the emerging IP Multimedia Subsystem (IMS) includes unlicensed, nondedicated and nondeterministic, hence uncontrollable, computer access networks for IP multimedia services. It enables provision of resource-demanding real-time services and multimedia communication, raising new

  20. An end-to-end computing model for the Square Kilometre Array

    NARCIS (Netherlands)

    Jongerius, R.; Wijnholds, S.; Nijboer, R.; Corporaal, H.

    2014-01-01

    For next-generation radio telescopes such as the Square Kilometre Array, seemingly minor changes in scientific constraints can easily push computing requirements into the exascale domain. The authors propose a model for engineers and astronomers to understand these relations and make tradeoffs in

  1. AAL Security and Privacy: transferring XACML policies for end-to-end access and usage control

    NARCIS (Netherlands)

    Vlamings, H.G.M.; Koster, R.P.

    2010-01-01

    Ambient Assisted Living (AAL) systems and services aim to provide a solution for growing healthcare expenses and degradation of life quality of the elderly using information and communication technology. In particular, AAL solutions are being created that are heavily based on web services and sensor

  2. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluates the resilience of a reference IMS-based network topology in operation through key reliability parameters via OPNET. The reliability behaviors of communication within similar and across registered home IMS domains were simulated and compared. Besides, the reliability effects...

  3. Topological Constraints on Identifying Additive Link Metrics via End-to-end Paths Measurements

    Science.gov (United States)

    2012-09-20

    identifiable if and only if R in (1) has full column rank, i.e., rank(R) = n. In other words, to uniquely determine w, there must be n linearly... be identified from paths traversing l1; a similar argument applies to l2. Moreover, similar analysis as in the proof of this lemma shows that none of
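
    The identifiability condition quoted in this excerpt (rank(R) = n, where rows of R are end-to-end paths and columns are links) can be checked directly; a minimal sketch with a hand-rolled Gaussian-elimination rank and hypothetical routing matrices:

```python
# Additive link metrics w are identifiable from path measurements y = R w
# iff the routing matrix R has full column rank. Rank is computed here by
# plain Gaussian elimination over the rows.

def rank(matrix):
    m = [row[:] for row in matrix]
    rows, cols = len(m), len(m[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(m[i][c]) > 1e-9), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(rows):
            if i != r and abs(m[i][c]) > 1e-9:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Three paths over three links l1, l2, l3:
R_good = [[1, 1, 0],   # path traversing l1, l2
          [0, 1, 1],   # path traversing l2, l3
          [1, 0, 1]]   # path traversing l1, l3
R_bad  = [[1, 1, 0],   # l1 and l2 only ever appear together, so their
          [1, 1, 1]]   # individual metrics cannot be separated

print(rank(R_good))  # 3 -> identifiable
print(rank(R_bad))   # 2 < 3 -> not identifiable
```

    The second matrix shows the topological constraint in the title: no choice of measurements along those two paths can split the shared l1+l2 sum into per-link values.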

  4. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  5. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

    Performance Concrete and insulation materials – is used. By means of action research and interviews of case project staff it has become evident that many elements of formalized requirements management are missing in the case project. To fill those gaps and be able to manage requirements end...... with regards to requirements management. As the literature study gives little new information, a series of interviews are initiated with experts from industry and universities. Those interviews reveal major shortcomings in the way requirements are handled in Danish construction companies today. In order...... to give managers of construction projects a useful and guiding tool for formally managing requirements that is rooted in practice, the “Conceptual requirements management framework”, is created. The framework builds upon the gathered empirical data, obtained by action research, interviews, and available...

  6. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Full Text Available Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny and resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connection, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge and new architectures are required due to the heterogeneity of these devices. Taking into account that people are using smartphones with Internet connection, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. Then, this paper proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter overcomes a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
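
    The threshold-based alerting rule described above can be sketched as follows (names and thresholds are illustrative, not the paper's API; a real system would hand the alerts to a push-notification service rather than return them):

```python
# Minimal sketch of the alerting rule: when a sensed value crosses its
# configured threshold, a notification is queued for the mobile application.

THRESHOLDS = {"temperature": 30.0, "humidity": 80.0}

def check_readings(readings, thresholds=THRESHOLDS):
    """Return the notifications a push service would deliver."""
    alerts = []
    for sensor, value in readings.items():
        limit = thresholds.get(sensor)
        if limit is not None and value > limit:
            alerts.append(f"ALERT: {sensor} = {value} exceeds {limit}")
    return alerts

print(check_readings({"temperature": 31.5, "humidity": 55.0}))
# -> ['ALERT: temperature = 31.5 exceeds 30.0']
```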

  7. Modeling and Simulation of Satellite Subsystems for End-to-End Spacecraft Modeling

    National Research Council Canada - National Science Library

    Schum, William K; Doolittle, Christina M; Boyarko, George A

    2006-01-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems...

  8. Designing a holistic end-to-end intelligent network analysis and security platform

    Science.gov (United States)

    Alzahrani, M.

    2018-03-01

    A firewall protects a network from outside attacks; however, once an attack enters the network, it is difficult to detect. Significant incidents have happened recently: millions of Yahoo email accounts were stolen, and crucial data from institutions are held for ransom. For two years, Yahoo’s system administrators were not aware that there were intruders inside the network. This happened due to the lack of intelligent tools to monitor user behaviour in the internal network. This paper discusses the design of an intelligent anomaly/malware detection system with proper proactive actions. The aim is to equip the system administrator with a proper tool to battle insider attackers. The proposed system adopts machine learning to analyse user behaviour through the runtime behaviour of each node in the network. The machine learning techniques include deep learning, an evolving machine learning perceptron, a hybrid of neural networks and fuzzy logic, as well as predictive memory techniques. The proposed system can be expanded to deal with larger networks using agent techniques.

  9. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01

    in 2014, up from 455 calls in 2013 (Chamber of Shipping, 2014). Even the more traditional forms of marine tourism such as sports fishing have been... some of the most noteworthy areas of new economic activity to emerge have been aquaculture, recreation and tourism, research and oil, gas and other... Risk Reduction on Canada’s West Coast (CSSP-2013-TI-1033) annual value of output over $590 million (Fisheries and Oceans Canada, 2013). Tourism

  10. Research on the Establishment and Evaluation of End-to-End Service Quality Index System

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    From the perspective of power data networks, this paper puts forward an index system model to measure the quality of service, covering user experience, business performance, network capacity support, etc., and describes the establishment and use of the indices at each layer of the model.

  11. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

    Current metrics (delivery schedule, delinquent contracts, PQDR/SDRs, forecasting accuracy, reliability, demand management, asset management strategies, pipeline)... are identified and characterized by statistical analysis. The study proposed a framework and tool for inventory management based on factors such as

  12. End-to-end unsupervised deformable image registration with a convolutional neural network

    NARCIS (Netherlands)

    de Vos, Bob D.; Berendsen, Floris; Viergever, Max A.; Staring, Marius; Išgum, Ivana

    2017-01-01

    In this work we propose a deep learning network for deformable image registration (DIRNet). The DIRNet consists of a convolutional neural network (ConvNet) regressor, a spatial transformer, and a resampler. The ConvNet analyzes a pair of fixed and moving images and outputs parameters for the spatial

  13. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an `end-to-end’ methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.

  14. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing.

    Science.gov (United States)

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-08-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient's genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient's genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work.
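
    The privacy/utility tension the study quantifies comes from how DP noise scales with the budget. A minimal sketch of the Laplace mechanism, the textbook DP primitive (not the paper's dosing models; names and numbers are illustrative), shows a stricter budget inflating error:

```python
# Laplace mechanism: add noise with scale sensitivity/epsilon to a query
# answer. Smaller epsilon (stricter privacy) means proportionally more noise.
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Return the query answer plus Laplace(sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5
    # Inverse-transform sampling of the Laplace distribution.
    return true_value - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

rng = random.Random(42)

def mean_abs_error(epsilon, n=2000):
    """Average absolute noise added to a unit-sensitivity query."""
    return sum(abs(laplace_mechanism(0.0, 1.0, epsilon, rng)) for _ in range(n)) / n

# A strict budget (small epsilon) buys privacy at a steep cost in accuracy:
print(mean_abs_error(1.0) < mean_abs_error(0.01))   # True
```

    The expected absolute error is exactly the scale sensitivity/epsilon, which is why budgets tight enough to stop model inversion can push a dosing model outside clinically acceptable error.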

  15. End-to-End Key Exchange through Disjoint Paths in P2P Networks

    Directory of Open Access Journals (Sweden)

    Daouda Ahmat

    2015-01-01

    Full Text Available Due to their inherent features, P2P networks have proven to be effective in the exchange of data between autonomous peers. Unfortunately, these networks are subject to various security threats that cannot be addressed readily since traditional security infrastructures, which are centralized, cannot be applied to them. Furthermore, communication reliability across the Internet is threatened by various attacks, including usurpation of identity, eavesdropping or traffic modification. Thus, in order to overcome these security issues and allow peers to securely exchange data, we propose a new key management scheme over P2P networks. Our approach introduces a new method that enables a secret key exchange through disjoint paths in the absence of a trusted central coordination point which would be required in traditional centralized security systems.
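
    The share-per-path idea can be sketched with XOR secret splitting (one common realization, assumed here for illustration; the paper defines its own key-management protocol): an eavesdropper on any single path sees only a uniformly random share.

```python
# Split a key into n shares, one sent per disjoint path; all shares XOR
# back to the key, but any proper subset is statistically independent of it.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n_paths: int):
    """Return n_paths shares whose XOR equals key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n_paths - 1)]
    last = key
    for s in shares:
        last = xor(last, s)
    return shares + [last]

def recombine(shares):
    out = shares[0]
    for s in shares[1:]:
        out = xor(out, s)
    return out

session_key = secrets.token_bytes(16)
shares = split_key(session_key, n_paths=3)   # one share per disjoint path
print(recombine(shares) == session_key)      # True at the receiving peer
```

    The security of the exchange then rests on path disjointness: an attacker must tap every path simultaneously to learn anything about the key.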

  16. Effect of 3 Key Factors on Average End to End Delay and Jitter in MANET

    Directory of Open Access Journals (Sweden)

    Saqib Hakak

    2015-01-01

    Full Text Available A mobile ad-hoc network (MANET is a self-configuring infrastructure-less network of mobile devices connected by wireless links where each node or mobile device is independent to move in any desired direction and thus the links keep moving from one node to another. In such a network, the mobile nodes are equipped with CSMA/CA (carrier sense multiple access with collision avoidance transceivers and communicate with each other via radio. In MANETs, routing is considered one of the most difficult and challenging tasks. Because of this, most studies on MANETs have focused on comparing protocols under varying network conditions. But to the best of our knowledge no one has studied the effect of other factors on network performance indicators like throughput, jitter and so on, revealing how much influence a particular factor or group of factors has on each network performance indicator. Thus, in this study the effects of three key factors, i.e. routing protocol, packet size and DSSS rate, were evaluated on key network performance metrics, i.e. average delay and average jitter, as these parameters are crucial for network performance and directly affect the buffering requirements for all video devices and downstream networks.
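
    The two performance metrics studied, average end-to-end delay and jitter, can be computed from per-packet timestamps; a minimal sketch (jitter is taken here as the mean absolute difference of consecutive delays, one common definition; RFC 3550 specifies a smoothed variant):

```python
# Average end-to-end delay is the mean of per-packet (receive - send) times;
# jitter measures how much those delays vary packet to packet.

def average_delay(send_times, recv_times):
    delays = [r - s for s, r in zip(send_times, recv_times)]
    return sum(delays) / len(delays)

def average_jitter(send_times, recv_times):
    delays = [r - s for s, r in zip(send_times, recv_times)]
    diffs = [abs(b - a) for a, b in zip(delays, delays[1:])]
    return sum(diffs) / len(diffs)

send_ms = [0, 20, 40, 60]      # send timestamps (ms)
recv_ms = [50, 80, 90, 140]    # receive timestamps; per-packet delays: 50, 60, 50, 80
print(average_delay(send_ms, recv_ms))             # 60.0
print(round(average_jitter(send_ms, recv_ms), 1))  # 16.7
```

    Jitter rather than raw delay is what drives the buffering requirements for video devices mentioned above: a playout buffer must absorb the delay variation, not the delay itself.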

  17. Intelligent End-To-End Resource Virtualization Using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, Georgios; Kontos, T.; Niemegeers, I.G.M.M.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.M.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered as a philosophy or paradigm in organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower level functionalities, enabling portability of

  18. End-to-end simulation of a visible 1 kW FEL

    International Nuclear Information System (INIS)

    Parazzoli, Claudio G.; Koltenbah, Benjamin E.C.

    2000-01-01

    In this paper we present the complete numerical simulation of the 1 kW visible Free Electron Laser under construction in Seattle. We show that the goal of producing 1.0 kW at 0.7 μm is well within the hardware capabilities. We simulate in detail the evolution of the electron bunch phase space in the entire e-beam line. The e-beam line includes the photo-injector cavities, the 433.33 MHz accelerator, the magnetic buncher, the 1300 MHz accelerator, the 180 deg. bend and the matching optics into the wiggler. The computed phase space is input for a three-dimensional time-dependent code that predicts the FEL performance. All the computations are based on state of the art software, and the limitations of the current software are discussed. We believe that this is the first time that such a thorough numerical simulation has been carried out and that such a realistic electron phase space has been used in FEL performance calculations

  19. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. A marginal but measurable relative difference between the electric field maps of models with and without manual correction was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.
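
    The Dice coefficient used above to compare segmentations is simple to state; a minimal sketch for binary masks stored as flat 0/1 lists:

```python
# Dice coefficient for binary masks: Dice = 2|A ∩ B| / (|A| + |B|).
# 1.0 means identical masks; 0.0 means no overlap.

def dice(mask_a, mask_b):
    intersection = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * intersection / size if size else 1.0

auto      = [1, 1, 0, 0, 1, 0]   # automated segmentation (illustrative voxels)
corrected = [1, 1, 1, 0, 1, 0]   # manually corrected segmentation
print(dice(auto, corrected))     # 2*3/(3+4) ≈ 0.857, i.e. high overlap
```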

  20. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01

    this problem is described in section 3.1.5. 3.1.3. SOFTWARE ARCHITECTURE Starting from the Chromebook hardware platform, this project removed the... personalities (KVM Virtual Machines) of Android, while including our overall integrity architecture with integrity measurement, appraisal, and... attestation, both for the native Linux, and for the Android guests. The overall architecture developed in this project is shown in Figure 1. 3.1.4

  1. CLOUD SECURITY AND COMPLIANCE - A SEMANTIC APPROACH IN END TO END SECURITY

    OpenAIRE

    Kalaiprasath, R.; Elankavi, R.; Udayakumar, R.

    2017-01-01

    The Cloud services are becoming an essential part of many organizations. Cloud providers have to adhere to security and privacy policies to ensure their users' data remains confidential and secure. Though there are some ongoing efforts on developing cloud security standards, most cloud providers are implementing a mish-mash of security and privacy controls. This has led to confusion among cloud consumers as to what security measures they should expect from the cloud services, and whether thes...

  2. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  3. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    Full Text Available This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on the measures of packet delivery latencies related to voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method, which is able to auto-adapt the quality margin on the basis of network load and specific service level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations to guarantee Quality of Service to telephony applications, with the goal of evaluating the weight of edge router queue configuration in a complex and realistic Telephony over IP scenario. We want to compare many well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to locate queue schedulers in a more general control scheme context where different elements such as DiffServ marking and admission control algorithms contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we want to compare this solution with other call admission methods already described in scientific literature in order to locate this proposed method in a more general control scheme context. On the basis of the results we try to show the possible advantages of this QoS-Weighted solution in comparison with other similar CAC solutions (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) in terms of complexity, stability, management, tunability to service level requirements, and compatibility with actual network implementations.
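
    Of the baseline methods named above, Measured Sum is the simplest to sketch: admit a new call only if the measured load plus the call's rate stays under a utilization target (numbers illustrative; the QoS-Weighted variant described in the abstract additionally adapts its margin from edge-to-edge latency measurements):

```python
# Measured Sum admission control: accept a call iff measured load plus the
# new call's rate stays within a fraction (target) of link capacity.

def admit_call(measured_load_kbps, new_call_kbps, capacity_kbps, target=0.9):
    return measured_load_kbps + new_call_kbps <= target * capacity_kbps

print(admit_call(8500, 80, 10000))   # True: 8580 kbps fits under the 9000 kbps target
print(admit_call(8950, 80, 10000))   # False: 9030 kbps would exceed the target
```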

  4. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas; Sezgin, Aydin

    2015-01-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode

  5. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system, from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered data. SEEK processes the time-ordered data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels, including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published under the GPLv3 license on GitHub.
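    RFI removal on time-ordered data can be illustrated with a much simpler outlier flagger than SEEK's actual mitigation algorithm. The sketch below flags samples deviating from the median by more than k robust standard deviations (MAD-based); the function name and threshold are assumptions, not SEEK's API.

```python
import statistics

def flag_rfi(samples, k=5.0):
    """Flag samples deviating more than k robust sigmas from the median.

    Illustrative only: real pipelines such as SEEK use more sophisticated
    masking, but the idea of thresholding outliers against a robust noise
    estimate is the same.
    """
    med = statistics.median(samples)
    mad = statistics.median(abs(s - med) for s in samples)
    # 1.4826 * MAD approximates the standard deviation for Gaussian noise
    sigma = 1.4826 * mad if mad > 0 else 1e-12
    return [abs(s - med) > k * sigma for s in samples]
```

    A strong RFI spike stands out against the quiet baseline and gets masked, while ordinary noise fluctuations survive.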

  6. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas.

    Science.gov (United States)

    Timmons, Joshua J; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T

    2017-10-12

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While automated segmentation tools speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures, and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high-throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated, in addition to observed differences in electric field-volume histograms. Furthermore, the MPHTXT and NASTRAN mesh file formats were compared using differences in the electric field-volume histogram. The Dice coefficient was lower for auto-segmentation without post-processing than with it, indicating convergence toward a manually corrected model. A small but measurable relative difference between electric field maps from models with and without manual correction was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.
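    The Dice coefficient used to compare the segmentations is a standard overlap measure, 2|A∩B| / (|A| + |B|). A minimal sketch on flattened binary masks (the function name is an assumption):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|).

    Masks are same-length sequences of 0/1 (e.g. flattened voxel labels).
    Returns 1.0 for identical masks, 0.0 for disjoint ones.
    """
    if len(mask_a) != len(mask_b):
        raise ValueError("masks must have the same shape")
    intersection = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 1.0 if size == 0 else 2.0 * intersection / size
```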

  7. Towards End-to-End Lane Detection: an Instance Segmentation Approach

    OpenAIRE

    Neven, Davy; De Brabandere, Bert; Georgoulis, Stamatios; Proesmans, Marc; Van Gool, Luc

    2018-01-01

    Modern cars are incorporating an increasing number of driver assist features, among which automatic lane keeping. The latter allows the car to properly position itself within the road lanes, which is also crucial for any subsequent lane departure or trajectory planning decision in fully autonomous cars. Traditional lane detection methods rely on a combination of highly-specialized, hand-crafted features and heuristics, usually followed by post-processing techniques, that are computationally e...

  8. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible....

  9. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    3.2.2 Outsourcing middleboxes Jingling [86] is a prototype outsourcing architecture where the network forwards data out to external “Feature...The relation to our problem is that Jingling could help proactively address broken and inadvertent middlebox behaviors, depending on the administrative

  10. Mining Fashion Outfit Composition Using An End-to-End Deep Learning Approach on Set Data

    OpenAIRE

    Li, Yuncheng; Cao, LiangLiang; Zhu, Jiang; Luo, Jiebo

    2016-01-01

    Composing fashion outfits involves deep understanding of fashion standards while incorporating creativity for choosing multiple fashion items (e.g., Jewelry, Bag, Pants, Dress). In fashion websites, popular or high-quality fashion outfits are usually designed by fashion experts and followed by large audiences. In this paper, we propose a machine learning system to compose fashion outfits automatically. The core of the proposed automatic composition system is to score fashion outfit candidates...

  11. Building an End-to-end System for Long Term Soil Monitoring

    Science.gov (United States)

    Szlavecz, K.; Terzis, A.; Musaloiu-E., R.; Cogan, J.; Szalay, A.; Gray, J.

    2006-05-01

    We have developed and deployed an experimental soil monitoring system in an urban forest. Wireless sensor nodes collect data on soil temperature, soil moisture, air temperature, and light. Data are uploaded into a SQL Server database, where they are calibrated and reorganized into an OLAP data cube. The data are accessible on-line using a web services interface with various visual tools. Our prototype system of ten nodes has been live since Sep 2005, and in 5 months of operation over 6 million measurements have been collected. At a high level, our experiment was a success: we detected variations in soil condition corresponding to topography and external environmental parameters as expected. However, we encountered a number of challenging technical problems: the need for low-level programming at multiple levels, calibration across space and time, and cross-referencing of measurements with external sources. Based upon the experience with this system we are now deploying 200 more nodes with close to a thousand sensors spread over multiple sites in the context of the Baltimore Ecosystem Study LTER.

  12. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    We present a software tool platform which facilitates security and performance analysis of systems which starts and ends with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed with the...

  13. How can end-to-end processes be safeguarded in the organization?

    NARCIS (Netherlands)

    Strikwerda, H.

    2017-01-01

    Processes in which knowledge, information, and material are transformed into goods and services form the core of organizing. That is one of the oldest principles in business administration. In scientific management, and hence in lean six sigma, processes are the object of analysis and improvement

  14. SecMon: End-to-End Quality and Security Monitoring System

    OpenAIRE

    Ciszkowski, Tomasz; Eliasson, Charlott; Fiedler, Markus; Kotulski, Zbigniew; Lupu, Radu; Mazurczyk, Wojciech

    2008-01-01

    The Voice over Internet Protocol (VoIP) is becoming a more available and popular way of communicating for Internet users. This also applies to Peer-to-Peer (P2P) systems, and merging these two has already proven to be successful (e.g. Skype). Even the existing standards of VoIP provide an assurance of security and Quality of Service (QoS); however, these features are usually optional and supported by a limited number of implementations. As a result, the lack of mandatory and widely applicable Q...

  15. New vision solar system exploration missions study: Analysis of the use of bimodal space nuclear power systems to support outer solar system exploration missions. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-08

    This report presents the results of an analysis of the capability of nuclear bimodal systems to perform outer solar system exploration missions. Missions of interest include orbiter missions to Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto. An initial technology baseline consisting of a NEBA 10 kWe, 1000 N thrust, 850 s, 1500 kg bimodal system was selected, and its performance examined against a data base for trajectories to outer solar system planetary destinations to select optimal direct and gravity-assisted trajectories for study. A conceptual design for a common bimodal spacecraft capable of performing missions to all the planetary destinations was developed and made the basis of end-to-end mission designs for orbiter missions to Jupiter, Saturn, and Neptune. Concepts for microspacecraft capable of probing Jupiter's atmosphere and exploring Titan were also developed. All mission designs considered use the Atlas 2AS for launch. It is shown that the bimodal nuclear power and propulsion system offers many attractive options for planetary missions, including both conventional planetary missions in which all instruments are carried by a single primary orbiting spacecraft, and unconventional missions in which the primary spacecraft acts as a carrier, relay, and mother ship for a fleet of microspacecraft deployed at the planetary destination.
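    Given the 850 s specific impulse quoted for the bimodal system, the ideal rocket equation gives a feel for the attainable delta-v. The masses in this sketch are hypothetical, not values from the study:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s, m0_kg, mf_kg):
    """Ideal (Tsiolkovsky) rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * G0 * math.log(m0_kg / mf_kg)

# 850 s Isp from the abstract; initial/final masses are illustrative assumptions.
dv = delta_v(850.0, 5000.0, 2500.0)  # m/s, for a mass ratio of 2
```

    A mass ratio of 2 at 850 s Isp yields roughly 5.8 km/s, which is the kind of margin that makes direct and gravity-assisted outer-planet trajectories tractable.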

  16. Systems Engineering and Application of System Performance Modeling in SIM Lite Mission

    Science.gov (United States)

    Moshir, Mehrdad; Murphy, David W.; Milman, Mark H.; Meier, David L.

    2010-01-01

    The SIM Lite Astrometric Observatory will be the first space-based Michelson interferometer operating in the visible wavelength, with the ability to perform ultra-high precision astrometric measurements on distant celestial objects. SIM Lite data will address in a fundamental way questions such as characterization of Earth-mass planets around nearby stars. To accomplish these goals it is necessary to rely on a model-based systems engineering approach - much more so than most other space missions. This paper will describe in further detail the components of this end-to-end performance model, called "SIM-sim", and show how it has helped the systems engineering process.

  17. Rapid Mission Design for Dynamically Complex Environments

    Data.gov (United States)

    National Aeronautics and Space Administration — Designing trajectories in dynamically complex environments is very challenging and easily becomes an intractable problem. More complex planning implies potentially...

  18. Landsat Data Continuity Mission (LDCM) space to ground mission data architecture

    Science.gov (United States)

    Nelson, Jack L.; Ames, J.A.; Williams, J.; Patschke, R.; Mott, C.; Joseph, J.; Garon, H.; Mah, G.

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is a scientific endeavor to extend the longest continuous multi-spectral imaging record of Earth's land surface. The observatory consists of a spacecraft bus integrated with two imaging instruments: the Operational Land Imager (OLI), built by Ball Aerospace & Technologies Corporation in Boulder, Colorado, and the Thermal Infrared Sensor (TIRS), an in-house instrument built at the Goddard Space Flight Center (GSFC). Both instruments are integrated aboard a fine-pointing, fully redundant spacecraft bus built by Orbital Sciences Corporation, Gilbert, Arizona. The mission is scheduled for launch in January 2013. This paper will describe the innovative end-to-end approach for efficiently managing high volumes of simultaneous real-time and playback image and ancillary data, from the instruments to reception at the United States Geological Survey's (USGS) Landsat Ground Network (LGN) and International Cooperator (IC) ground stations. The core enabling capability lies within the spacecraft Command and Data Handling (C&DH) system and Radio Frequency (RF) communications system implementation. Each of these systems uniquely contributes to the efficient processing of high-speed image data (up to 265 Mbps) from each instrument, and provides virtually error-free data delivery to the ground. Onboard methods include a combination of lossless data compression, Consultative Committee for Space Data Systems (CCSDS) data formatting, a file-based/managed Solid State Recorder (SSR), and Low Density Parity Check (LDPC) forward error correction. The 440 Mbps wideband X-band downlink uses Class 1 CCSDS File Delivery Protocol (CFDP) and an earth-coverage antenna to deliver an average of 400 scenes per day to a combination of LGN and IC ground stations. This paper will also describe the integrated capabilities and processes at the LGN ground stations for data reception using adaptive filtering, and the mission operations approach for the LDCM
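    A rough downlink budget can be derived from the figures quoted above (440 Mbps downlink, 400 scenes per day). The scene size below is an assumption for illustration, not an LDCM specification:

```python
# Figures from the abstract:
DOWNLINK_RATE_BPS = 440e6     # X-band downlink rate, bits per second
SCENES_PER_DAY = 400          # average scenes delivered per day

# Hypothetical assumption for illustration only:
SCENE_SIZE_BITS = 1.0e9 * 8   # ~1 GB of compressed data per scene

daily_volume_bits = SCENES_PER_DAY * SCENE_SIZE_BITS
contact_seconds = daily_volume_bits / DOWNLINK_RATE_BPS
contact_minutes = contact_seconds / 60.0  # ground-contact time needed per day
```

    Under that assumed scene size, about two hours of cumulative ground-station contact per day would carry the full volume, which illustrates why an earth-coverage antenna and multiple LGN/IC stations are used.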

  19. The THEMIS Mission

    CERN Document Server

    Burch, J. L

    2009-01-01

    The THEMIS mission aims to determine the trigger and large-scale evolution of substorms by employing five identical micro-satellites which line up along the Earth's magnetotail to track the motion of particles, plasma, and waves from one point to another and for the first time, resolve space-time ambiguities in key regions of the magnetosphere on a global scale. The primary goal of THEMIS is to elucidate which magnetotail process is responsible for substorm onset at the region where substorm auroras map: (i) local disruption of the plasma sheet current (current disruption) or (ii) the interaction of the current sheet with the rapid influx of plasma emanating from reconnection. The probes also traverse the radiation belts and the dayside magnetosphere, allowing THEMIS to address additional baseline objectives. This volume describes the mission, the instrumentation, and the data derived from them.

  20. Euso-Balloon: A pathfinder mission for the JEM-EUSO experiment

    Energy Technology Data Exchange (ETDEWEB)

    Osteria, Giuseppe, E-mail: osteria@na.infn.it [Istituto Nazionale di Fisica Nucleare Sezione di Napoli, Naples (Italy); Scotti, Valentina [Istituto Nazionale di Fisica Nucleare Sezione di Napoli, Naples (Italy); Università di Napoli Federico II, Dipartimento di Fisica, Naples (Italy)

    2013-12-21

    EUSO-Balloon is a pathfinder mission for JEM-EUSO, the near-UV telescope proposed to be installed on board the ISS in 2017. The main objective of this pathfinder mission is to perform a full scale end-to-end test of all the key technologies and instrumentation of JEM-EUSO detectors and to prove the entire detection chain. EUSO-Balloon will measure the atmospheric and terrestrial UV background components, in different observational modes, fundamental for the development of the simulations. Through a series of flights performed by the French Space Agency CNES, EUSO-Balloon also has the potential to detect Extensive Air Showers (EAS) from above. EUSO-Balloon will be mounted in an unpressurized gondola of a stratospheric balloon. We will describe the instrument and the electronic system which performs instrument control and data management in such a critical environment.

  1. MISSION PROFILE AND DESIGN CHALLENGES FOR MARS LANDING EXPLORATION

    Directory of Open Access Journals (Sweden)

    J. Dong

    2017-07-01

    Full Text Available An orbiter and a descent module will be delivered to Mars in the Chinese first Mars exploration mission. The descent module is composed of a landing platform and a rover. The module will be released into the atmosphere by the orbiter and make a controlled landing on the Martian surface. After landing, the rover will egress from the platform to start its science mission. The rover payloads mainly include the subsurface radar, terrain camera, multispectral camera, magnetometer, and anemometer to achieve the scientific investigation of the terrain, soil characteristics, material composition, magnetic field, atmosphere, etc. The landing process is divided into three phases (entry phase, parachute descent phase and powered descent phase), which are full of risks. There exist many uncertain parameters and design constraints that affect the selection of the landing sites and of the phase-switch events (mortar-deploying the parachute, separating the heat shield and cutting off the parachute). A number of new technologies (disk-gap-band parachute, guidance and navigation, etc.) need to be developed. Mars and Earth have gravity and atmosphere conditions that are significantly different from one another, and meaningful Martian environmental conditions cannot be recreated on Earth, so a full-scale flight validation on Earth is difficult. Therefore end-to-end simulation and tests of some critical subsystems must be considered instead. The challenges above and the corresponding design solutions are introduced in this paper, which can provide reference for the Mars exploration mission.

  2. Ground Contact Model for Mars Science Laboratory Mission Simulations

    Science.gov (United States)

    Raiszadeh, Behzad; Way, David

    2012-01-01

    The Program to Optimize Simulated Trajectories II (POST 2) has been successful in simulating the flight of launch vehicles and entry bodies on earth and other planets. POST 2 has been the primary simulation tool for the Entry, Descent, and Landing (EDL) phase of numerous Mars lander missions such as Mars Pathfinder in 1997, the twin Mars Exploration Rovers (MER-A and MER-B) in 2004, Mars Phoenix lander in 2007, and it is now the main trajectory simulation tool for Mars Science Laboratory (MSL) in 2012. In all previous missions, the POST 2 simulation ended before ground impact, and a tool other than POST 2 simulated landing dynamics. It would be ideal for one tool to simulate the entire EDL sequence, thus avoiding errors that could be introduced by handing off position, velocity, or other flight parameters from one simulation to the other. The desire to have one continuous end-to-end simulation was the motivation for developing the ground interaction model in POST 2. Rover landing, including the detection of the post-landing state, is a very critical part of the MSL mission, as the EDL landing sequence continues for a few seconds after landing. The method explained in this paper illustrates how a simple ground force interaction model has been added to POST 2, which allows simulation of the entire EDL from atmospheric entry through touchdown.
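    A minimal example of the kind of simple ground force interaction model described above is a one-dimensional spring-damper penalty force. This is an illustrative sketch only; the function name and the stiffness and damping values are assumptions, not POST 2's actual model:

```python
def ground_contact_force(height_m, vertical_velocity_mps, k=5.0e4, c=2.0e3):
    """Penalty-based ground force: zero above ground, spring-damper below.

    height_m < 0 means penetration below the surface; the returned force
    pushes the lander back up. k (N/m) and c (N·s/m) are illustrative.
    """
    if height_m >= 0.0:
        return 0.0
    penetration = -height_m
    # Spring resists penetration; damper resists downward velocity.
    force = k * penetration - c * vertical_velocity_mps
    return max(force, 0.0)  # the ground can only push, never pull
```

    Adding such a force term to the equations of motion lets a single trajectory integration run continuously through touchdown instead of handing state off to a separate landing-dynamics tool.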

  3. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    Science.gov (United States)

    Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

  4. Active Debris Removal mission design in Low Earth Orbit

    Science.gov (United States)

    Martin, Th.; Pérot, E.; Desjean, M.-Ch.; Bitetti, L.

    2013-03-01

    Active Debris Removal (ADR) aims at removing large intact objects ― defunct satellites, rocket upper-stages ― from space crowded regions. Why? Because they constitute the main source of the long-term debris environment deterioration caused by possible future collisions with fragments and, worse still, with other intact but uncontrolled objects. In order to limit the growth of the orbital debris population in the future (referred to as the Kessler syndrome), it is now highly recommended to carry out such ADR missions, together with the mitigation measures already adopted by national agencies (such as post-mission disposal). At the French Space Agency, CNES, and in the frame of advanced studies, the design of such an ADR mission in Low Earth Orbit (LEO) is under evaluation. A two-step preliminary approach has been envisaged. First, a reconnaissance mission based on a small demonstrator (˜500 kg) rendezvousing with several targets (observation and in-flight qualification testing). Secondly, an ADR mission based on a larger vehicle (inherited from the Orbital Transfer Vehicle (OTV) concept) able to capture and deorbit several preselected targets by attaching a propulsive kit to each of them. This paper presents a flight-dynamics-level tradeoff analysis between different vehicle and mission concepts as well as target disposal options. The delta-velocity, times, and masses required to transfer, rendezvous with targets and deorbit are assessed for several propelled systems and propellantless options. Total mass budgets are then derived for two end-to-end study cases corresponding to the reconnaissance and ADR missions mentioned above.
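    For a sense of the deorbit delta-v budgets such a tradeoff involves, the vis-viva equation gives the single-burn cost of lowering the perigee of a circular orbit. The altitudes below are illustrative assumptions, not values from the CNES study:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6378.137e3       # Earth equatorial radius, m

def deorbit_delta_v(alt_circular_m, alt_perigee_m):
    """Single retrograde burn lowering the perigee of a circular orbit.

    The burn happens at the apoapsis of the resulting transfer ellipse:
    dv = v_circular - v_transfer_at_apoapsis (vis-viva).
    """
    r1 = R_EARTH + alt_circular_m
    r2 = R_EARTH + alt_perigee_m
    v_circ = math.sqrt(MU_EARTH / r1)
    a_transfer = 0.5 * (r1 + r2)
    v_apo = math.sqrt(MU_EARTH * (2.0 / r1 - 1.0 / a_transfer))
    return v_circ - v_apo

# E.g. bringing a debris object from an 800 km circular orbit down to a
# 60 km reentry perigee (altitudes are hypothetical):
dv = deorbit_delta_v(800e3, 60e3)  # ~200 m/s
```

    Roughly 200 m/s per target is why attaching small propulsive deorbit kits to each object, rather than dragging every target down with the chaser itself, can pay off in the total mass budget.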

  5. Changes in occlusal relationships in mixed dentition patients treated with rapid maxillary expansion. A prospective clinical study.

    Science.gov (United States)

    McNamara, James A; Sigler, Lauren M; Franchi, Lorenzo; Guest, Susan S; Baccetti, Tiziano

    2010-03-01

    To prospectively measure occlusal changes in mixed dentition patients who underwent a standardized early expansion protocol. The treatment sample consisted of 500 patients who were assigned to three groups according to molar relationship: Class I (n = 204), end-to-end (n = 166), and Class II (n = 130). All patients were treated with a bonded rapid maxillary expander (RME) followed by a removable maintenance plate and a transpalatal arch. Mean age at the start of treatment was 8.8 years (T1), with a pre-phase 2 treatment cephalogram (T2) taken 3.7 years later. The control sample consisted of the cephalometric records of 188 untreated subjects (Class I, n = 79; end-to-end, n = 51; Class II, n = 58). The largest change in molar relationship was noted when the Class II treatment group (1.8 mm) was compared with the matched control group (0.3 mm). A positive change was seen in 81% of the Class II treatment group, with almost half of the group improving by ≥ 2.0 mm. The end-to-end treatment group had a positive change of 1.4 mm, compared with a control value of 0.6 mm, and the Class I group of about 1 mm compared with controls, who remained unchanged (0.1 mm). Skeletal changes were not significant when any of the groups were compared with controls. The expansion protocol had a significantly favorable effect on the sagittal occlusal relationships of Class II, end-to-end, and Class I patients treated in the early mixed dentition.

  6. Social Tagging of Mission Data

    Science.gov (United States)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.

    2010-01-01

    Mars missions will generate a large amount of data in various forms, such as daily plans, images, and scientific information. Often, there is a semantic linkage between images that cannot be captured automatically. Software is needed that will provide a method for creating arbitrary tags for this mission data so that items with a similar tag can be related to each other. The tags should be visible and searchable for all users. A new routine was written to offer a new and more flexible search option over previous applications. This software allows users of the MSLICE program to apply any number of arbitrary tags to a piece of mission data through a MSLICE search interface. The application of tags creates relationships between data that did not previously exist. These tags can be easily removed and changed, and contain enough flexibility to be specifically configured for any mission. This gives users the ability to quickly recall or draw attention to particular pieces of mission data, for example: give a semantic and meaningful description to mission data (e.g., tag all images with a rock in them with the tag "rock"); rapidly recall specific and useful pieces of data (e.g., tag a plan as "driving template"); or call specific data to a user's attention (e.g., tag a plan as "for:User"). This software is part of the MSLICE release, which was written in Java. It will run on any current Windows, Macintosh, or Linux system.
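    A tagging store of the kind described, relating items that share a tag, can be sketched as a pair of inverted indexes. This is an illustrative sketch only (MSLICE itself is written in Java; the class and method names here are assumptions):

```python
from collections import defaultdict

class TagIndex:
    """Minimal many-to-many tag store: arbitrary tags on arbitrary items."""

    def __init__(self):
        self._by_tag = defaultdict(set)   # tag  -> items carrying it
        self._by_item = defaultdict(set)  # item -> tags applied to it

    def tag(self, item, tag):
        self._by_tag[tag].add(item)
        self._by_item[item].add(tag)

    def untag(self, item, tag):
        """Tags are easy to remove, mirroring the behavior described."""
        self._by_tag[tag].discard(item)
        self._by_item[item].discard(tag)

    def items_with(self, tag):
        """All items sharing a tag, e.g. every image tagged 'rock'."""
        return set(self._by_tag[tag])

    def tags_of(self, item):
        return set(self._by_item[item])
```

    The two mirrored dictionaries make both directions of the query (find items by tag, list an item's tags) constant-time lookups.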

  7. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    Science.gov (United States)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. 
The workflow components can be hosted on any Internet-accessible machine
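    The chaining of wrapped modeling components can be sketched as a fold over a list of stages. In the OSSE each stage is a Web Service endpoint; here plain functions stand in for service calls, and all names in this sketch are assumptions:

```python
def run_workflow(stages, initial_input):
    """Chain modeling components: each stage's output feeds the next.

    Because every component exposes the same uniform interface (data in,
    data out), the engine needs no knowledge of what each stage computes.
    """
    data = initial_input
    for stage in stages:
        data = stage(data)
    return data

# Hypothetical stand-ins for a radiative-transfer -> sensor -> retrieval chain:
pipeline = [
    lambda reflectance: [r * 0.9 for r in reflectance],  # atmospheric attenuation
    lambda radiance: [round(r, 3) for r in radiance],    # detection / quantization
    lambda measured: [m / 0.9 for m in measured],        # retrieval (inverts stage 1)
]
```

    Swapping a stage for a remote Web Service call, or appending a new component, leaves the engine unchanged, which is the extensibility the architecture is after.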

  8. OMV mission simulator

    Science.gov (United States)

    Cok, Keith E.

    1989-01-01

    The Orbital Maneuvering Vehicle (OMV) will be remotely piloted during rendezvous, docking, or proximity operations with target spacecraft from a ground control console (GCC). The real-time mission simulator and graphics being used to design a console pilot-machine interface are discussed. A real-time orbital dynamics simulator drives the visual displays. The dynamics simulator includes a J2 oblate earth gravity model and a generalized 1962 rotating atmospheric and drag model. The simulator also provides a variable-length communication delay to represent use of the Tracking and Data Relay Satellite System (TDRSS) and NASA Communications (NASCOM). Input parameter files determine the graphics display. This feature allows rapid prototyping since displays can be easily modified from pilot recommendations. A series of pilot reviews are being held to determine an effective pilot-machine interface. Pilots fly missions with nominal to 3-sigma dispersions in translational or rotational axes. Console dimensions, switch type and layout, hand controllers, and graphic interfaces are evaluated by the pilots and the GCC simulator is modified for subsequent runs. Initial results indicate a pilot preference for analog versus digital displays and for two 3-degree-of-freedom hand controllers.
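    The J2 oblate-earth gravity term mentioned above has a standard closed form. A sketch of the perturbing acceleration, added on top of the central -μ/r² term, with the function name and constants as illustrative assumptions:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378.137e3       # Earth equatorial radius, m
J2 = 1.08262668e-3    # Earth's second zonal harmonic

def j2_acceleration(x, y, z):
    """J2 oblateness perturbation in an Earth-centered inertial frame.

    Returns (ax, ay, az) in m/s^2, to be added to the two-body term.
    """
    r2 = x * x + y * y + z * z
    r = math.sqrt(r2)
    factor = -1.5 * J2 * MU * RE * RE / r**5
    zr2 = 5.0 * z * z / r2
    ax = factor * x * (1.0 - zr2)
    ay = factor * y * (1.0 - zr2)
    az = factor * z * (3.0 - zr2)
    return ax, ay, az
```

    At LEO altitudes the J2 term is on the order of 10^-2 m/s², large enough to matter over the rendezvous timescales the simulator models.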

  9. Proba-V Mission Exploitation Platform

    Science.gov (United States)

    Goor, E.

    2017-12-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (an EC Copernicus contributing mission) EO-data archive, the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers (e.g. the EC Copernicus Global Land Service) and end-users. The analysis of time series of data (PB range) is addressed, as well as the large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. New features are still being developed, but the platform has been fully operational since November 2016 and offers: a time series viewer (browser web client and API), showing the evolution of Proba-V bands and derived vegetation parameters for any country, region, pixel or polygon defined by the user; full-resolution viewing services for the complete data archive; on-demand processing chains on a powerful Hadoop/Spark backend; and Virtual Machines, which users can request with access to the complete data archive and pre-configured tools to work with this data, e.g. various toolboxes and support for R and Python. This allows users to work with the data immediately, without having to install tools or download data, and also to design, debug and test applications on the platform. Jupyter Notebooks are available, with some example Python and R projects worked out to show the potential of the data. Today the platform is already used by several international third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. From the Proba-V MEP, access to other data sources such as Sentinel-2 and Landsat data is also addressed. Selected components of the MEP are also deployed on public cloud infrastructures in various R&D projects.
    Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to
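
    As a rough illustration of the kind of time-series analysis the viewer exposes (the array shapes and nodata convention here are hypothetical, not the MEP's actual interface), a regional mean over a user-defined polygon could look like:

```python
import numpy as np

def regional_mean_timeseries(stack, mask, nodata=-1.0):
    """Mean of a vegetation-parameter time series over a region.

    stack : float array of shape (T, H, W), one image per date
    mask  : boolean array (H, W), True inside the user's polygon
    Returns a length-T array with the regional mean per date,
    ignoring nodata pixels.
    """
    stack = np.asarray(stack, dtype=float)
    region = stack[:, mask]                       # (T, n_pixels_in_region)
    region = np.where(region == nodata, np.nan, region)
    return np.nanmean(region, axis=1)
```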

  10. 77 FR 21748 - Oil and Gas Trade Mission to Israel

    Science.gov (United States)

    2012-04-11

    ... DEPARTMENT OF COMMERCE International Trade Administration Oil and Gas Trade Mission to Israel... Foreign Commercial Service (CS), is organizing an Executive-led Oil and Gas Trade Mission to Israel.... The purpose of the mission is to introduce U.S. firms to Israel's rapidly expanding oil and gas market...

  11. Towards a Multi-Mission, Airborne Science Data System Environment

    Science.gov (United States)

    Crichton, D. J.; Hardman, S.; Law, E.; Freeborn, D.; Kay-Im, E.; Lau, G.; Oswald, J.

    2011-12-01

    NASA earth science instruments are increasingly relying on airborne missions. However, traditionally there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructure for the science data system. Typically there is little software reuse, and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is providing an agile infrastructure architected to allow for a variety of configurations, from locally installed compute and storage services to provisioning those services via the "cloud" from cloud computing vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on Apache's Object Oriented Data Technology (OODT) suite of components, which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems.
In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support the data processing and data management needs

  12. End-To-End Solution for Integrated Workload and Data Management using glideinWMS and Globus Online

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the glideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of Center for Enabling Distributed Petascale Scienc...

  13. End-to-End Deep Neural Networks and Transfer Learning for Automatic Analysis of Nation-State Malware

    Directory of Open Access Journals (Sweden)

    Ishai Rosenberg

    2018-05-01

    Full Text Available Malware allegedly developed by nation-states, also known as advanced persistent threats (APT), are becoming more common. The task of attributing an APT to a specific nation-state, or of classifying it to the correct APT family, is challenging for several reasons. First, each nation-state has more than a single cyber unit that develops such malware, rendering traditional authorship attribution algorithms useless. Furthermore, the dataset of available APTs is still extremely small. Finally, such APTs use state-of-the-art evasion techniques, making feature extraction challenging. In this paper, we use a deep neural network (DNN) as a classifier for nation-state APT attribution. We record the dynamic behavior of the APT when run in a sandbox and use it as raw input for the neural network, allowing the DNN to learn high-level feature abstractions of the APTs themselves. We also use the same raw features for APT family classification. Finally, we use the feature abstractions learned by the APT family classifier to solve the attribution problem. Using a test set of 1,000 Chinese- and Russian-developed APTs, we achieved an accuracy rate of 98.6%
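
    The raw-input approach described above starts by turning a sandbox API-call trace into a fixed-length integer sequence for a network's embedding layer; a minimal sketch of that preprocessing step (the vocabulary scheme and padding convention are assumptions, not the paper's exact pipeline):

```python
import numpy as np

def encode_trace(api_calls, vocab, max_len=200):
    """Map a sandbox API-call trace to a fixed-length integer sequence.

    api_calls : list of API-call names recorded at runtime
    vocab     : dict mapping known call names to indices >= 2
                (0 = padding, 1 = out-of-vocabulary)
    The result can be fed to the embedding layer of a DNN classifier.
    """
    ids = [vocab.get(c, 1) for c in api_calls[:max_len]]
    ids += [0] * (max_len - len(ids))       # right-pad to max_len
    return np.array(ids, dtype=np.int64)
```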

  14. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry

  15. Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs

    Science.gov (United States)

    Gratadour, Damien

    2011-09-01

    Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system, and in quasi-real-time (up to 70 Hz) for ELT-sized systems, on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of a SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on the fly, giving the ability to simulate hours of observations without the need to load extremely large phase screens into global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
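
    Centroiding on sub-aperture images is one of the steps listed above; its simplest variant is a center-of-gravity estimate, sketched here in NumPy for clarity (the code described in the abstract runs such kernels on the GPU via CUDA):

```python
import numpy as np

def cog_centroids(subaps):
    """Center-of-gravity centroids for a batch of Shack-Hartmann
    sub-aperture images of shape (n_subaps, ny, nx).

    Returns an (n_subaps, 2) array of (x, y) spot positions in pixels,
    measured relative to the center of each sub-aperture image."""
    subaps = np.asarray(subaps, dtype=float)
    n, ny, nx = subaps.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    flux = subaps.sum(axis=(1, 2))
    cx = (subaps * xs).sum(axis=(1, 2)) / flux - (nx - 1) / 2.0
    cy = (subaps * ys).sum(axis=(1, 2)) / flux - (ny - 1) / 2.0
    return np.stack([cx, cy], axis=1)
```

    Real systems add thresholding or weighted variants to suppress noise, but the measurement model is the same.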

  16. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    Science.gov (United States)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information over the Internet. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. The project aims, through the creation of citizens' observatories, to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and an accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports a wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server, using any of the available data connections. Besides the gateway functionality, the smartphone enriches the data coming from the personal sensor unit with the GPS location, timestamps and user-defined context.
    This, together with the accelerometer, enables users to better estimate their exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements the custom-developed Lightweight Client Server Protocol (LCSP), which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
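
    The enrich-then-forward step, where the smartphone wraps raw sensor readings with GPS location, timestamp and user context before the HTTP POST, might be sketched as follows (the field names are illustrative, not the project's actual JSON schema):

```python
import json
import time

def build_payload(readings, lat, lon, user_context=None):
    """Wrap raw VESNA-AQ sensor readings in a JSON structure of the kind
    forwarded to the remote server over HTTP POST.

    readings : dict of sensor values, e.g. {"NO2": 18.2, "O3": 41.0}
    lat, lon : GPS coordinates added by the smartphone
    """
    return json.dumps({
        "timestamp": int(time.time()),          # added by the smartphone
        "location": {"lat": lat, "lon": lon},   # from the phone's GPS
        "context": user_context or "",          # user-defined activity tag
        "readings": readings,
    })
```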

  17. End-To-End Solution for Integrated Workload and Data Management using GlideinWMS and Globus Online

    International Nuclear Information System (INIS)

    Mhashilkar, Parag; Miller, Zachary; Weiss, Cathrin; Kettimuthu, Rajkumar; Garzoglio, Gabriele; Holzman, Burt; Duan, Xi; Lacinski, Lukasz

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the GlideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of Center for Enabling Distributed Peta-scale Science (CEDPS) by integrating GlideinWMS with Globus Online (GO). Globus Online is a fast, reliable file transfer service that makes it easy for any user to move data. The solution eliminates the need for the users to provide custom data transfer solutions in the application by making this functionality part of the GlideinWMS infrastructure. To achieve this, GlideinWMS uses the file transfer plug-in architecture of Condor. The paper describes the system architecture and how this solution can be extended to support data transfer services other than Globus Online when used with Condor or GlideinWMS.
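
    Condor's file-transfer plug-in architecture lets an external executable claim a URL scheme and perform the transfer on the pilot's behalf. A skeleton of such a plugin is sketched below; the two-argument invocation and the -classad discovery call follow the classic Condor plugin convention, but the details (version string, method name, the Globus Online hand-off) are illustrative, not the CEDPS implementation:

```python
#!/usr/bin/env python
"""Sketch of a Condor file-transfer plugin that would hand a URL off to an
external transfer service (here a placeholder for Globus Online)."""
import sys

def plugin_classad():
    # Condor invokes the plugin with -classad to discover what it handles.
    return ('PluginVersion = "0.1"\n'
            'PluginType = "FileTransfer"\n'
            'SupportedMethods = "globusonline"\n')

def transfer(src_url, dest_path):
    # Placeholder: a real plugin would submit the transfer to the external
    # service and block until it completes, returning 0 on success.
    print("transferring %s -> %s" % (src_url, dest_path))
    return 0

if __name__ == "__main__":
    if len(sys.argv) == 2 and sys.argv[1] == "-classad":
        sys.stdout.write(plugin_classad())
    elif len(sys.argv) == 3:
        sys.exit(transfer(sys.argv[1], sys.argv[2]))
```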

  18. Supporting end-to-end resource virtualization for Web 2.0 applications using Service Oriented Architecture

    NARCIS (Netherlands)

    Papagianni, C.; Karagiannis, Georgios; Tselikas, N. D.; Sfakianakis, E.; Chochliouros, I. P.; Kabilafkas, D.; Cinkler, T.; Westberg, L.; Sjödin, P.; Hidell, M.; Heemstra de Groot, S.M.; Kontos, T.; Katsigiannis, C.; Pappas, C.; Antonakopoulou, A.; Venieris, I.S.

    2008-01-01

    In recent years, technologies have been introduced offering a large amount of computing and networking resources. New applications such as Google AdSense and BitTorrent can profit from the use of these resources. An efficient way of discovering and reserving these resources is by using the Service

  19. The End-to-end Demonstrator for improved decision making in the water sector in Europe (EDgE)

    Science.gov (United States)

    Wood, Eric; Wanders, Niko; Pan, Ming; Sheffield, Justin; Samaniego, Luis; Thober, Stephan; Kumar, Rohinni; Prudhomme, Christel; Houghton-Carr, Helen

    2017-04-01

    High-resolution simulations of water resources from hydrological models are vital to supporting important climate services. Apart from a high level of detail, both spatially and temporally, it is important to provide simulations that consistently cover a range of timescales, from historical reanalysis to seasonal forecast and future projections. In the new EDgE project commissioned by the ECMWF (C3S) we try to fulfill these requirements. EDgE is a proof-of-concept project which combines climate data and state-of-the-art hydrological modelling to demonstrate a water-oriented information system implemented through a web application. EDgE is working with key European stakeholders representative of private and public sectors to jointly develop and tailor approaches and techniques. With these tools, stakeholders are assisted in using improved climate information in decision-making, and supported in the development of climate change adaptation and mitigation policies. Here, we present the first results of the EDgE modelling chain, which is divided into three main processes: 1) pre-processing and downscaling; 2) hydrological modelling; 3) post-processing. Consistent downscaling and bias corrections for historical simulations, seasonal forecasts and climate projections ensure that the results across scales are robust. The daily temporal resolution and 5km spatial resolution ensure locally relevant simulations. With the use of four hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), uncertainty between models is properly addressed, while consistency is guaranteed by using identical input data for static land surface parameterizations. The forecast results are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs) that have been created in collaboration with the end-user community of the EDgE project. 
The final product of this project is composed of 15 years of seasonal forecast and 10 climate change projections, all combined with four hydrological models. These unique high-resolution climate information simulations in the EDgE project provide an unprecedented information system for decision-making over Europe.
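
    A common choice for the bias-correction step in modelling chains like this is empirical quantile mapping; a minimal sketch (the EDgE project's exact correction method is not specified in the abstract):

```python
import numpy as np

def quantile_map(model, obs, values):
    """Empirical quantile-mapping bias correction.

    model  : historical model series (defines the model CDF)
    obs    : observed series over the same period (defines the target CDF)
    values : new model values to correct (e.g. a seasonal forecast)
    Each value is mapped to the observed quantile matching its model quantile.
    """
    model, obs = np.sort(model), np.sort(obs)
    # Probability of each new value under the model climatology...
    p = np.searchsorted(model, values) / float(len(model))
    # ...mapped to the same quantile of the observed climatology.
    idx = np.clip((p * len(obs)).astype(int), 0, len(obs) - 1)
    return obs[idx]
```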

  20. Probability distribution function of the polymer end-to-end molecule vector after retraction and its application to step deformation

    Czech Academy of Sciences Publication Activity Database

    Kharlamov, Alexander; Rolón-Garrido, V. H.; Filip, Petr

    2010-01-01

    Vol. 19, No. 4 (2010), pp. 190-194. ISSN 1022-1344. R&D Projects: GA ČR GA103/09/2066. Institutional research plan: CEZ:AV0Z20600510. Keywords: polymer chains * molecular modeling * shear * stress. Subject RIV: BK - Fluid Dynamics. Impact factor: 1.440, year: 2010

  1. End-to-End Privacy Protection for Facebook Mobile Chat based on AES with Multi-Layered MD5

    Directory of Open Access Journals (Sweden)

    Wibisono Sukmo Wardhono

    2018-01-01

    Full Text Available As social media environments become more interactive and the number of users grows tremendously, privacy is a matter of increasing concern. When personal data become a commodity, a social media company can share users' data with another party, such as a government. Facebook, Inc. is one of the social media companies most frequently asked for users' data. Although this private data request mechanism goes through a formal and valid legal process, it still undermines the fundamental right to information privacy. In this case, social media users need protection against privacy violations by the social media platform provider itself. Private chat is the most popular feature of social media. Inside a chat room, users can share private information. Cryptography is one of the data protection methods that can be used to hide private communication data from unauthorized parties. In our study, we proposed a system that encrypts chat content based on AES and multi-layered MD5 to ensure social media users have privacy protection against a social media company that uses user information as a commodity. In addition, this system makes it convenient for users to share their private information through a social media platform.
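
    The "multi-layered MD5" idea can be sketched as an iterated key derivation feeding an AES-128 key (the exact layering scheme is an assumption; note that MD5-based derivation is weak by modern standards, and a production system should prefer a real KDF such as PBKDF2 or scrypt):

```python
import hashlib

def layered_md5_key(passphrase, layers=3):
    """Derive a 16-byte key by iterating MD5 over the passphrase.

    This illustrates the paper's multi-layered MD5 concept only; MD5 is
    cryptographically broken, so treat this as a sketch, not a recommendation.
    The 16-byte digest happens to match the AES-128 key size.
    """
    digest = passphrase.encode("utf-8")
    for _ in range(layers):
        digest = hashlib.md5(digest).digest()
    return digest
```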

  2. Verification of the active deformation compensation system of the LMT/GTM by end-to-end simulations

    Science.gov (United States)

    Eisentraeger, Peter; Suess, Martin

    2000-07-01

    The 50 m LMT/GTM is exposed to the climatic conditions at 4,600 m altitude on Cerro La Negra, Mexico. To operate the telescope within the challenging requirements of its millimeter-wavelength objectives, an active approach to monitoring and compensating the structural deformations (Flexible Body Compensation, FBC) is necessary. This system includes temperature sensors and strain gages for identifying large-scale deformations of the reflector backup structure, a laser system for measuring the subreflector position, and an inclinometer system for measuring the deformations of the alidade. To compensate for the monitored deformations, the telescope is equipped with additional actuators for active control of the main reflector surface and the subreflector position. The paper describes the verification of the active deformation compensation system by finite element calculations and MATLAB simulations of the surface accuracy and the pointing, including the servo, under operational wind and thermal conditions.

  3. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    Science.gov (United States)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from their current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low-security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover, the high-speed communications that are meant to enable autonomous vehicles require ultra-reliable, low-latency paths. This research explores security within the proposed new architectures and the cross interconnection of highly protected assets with low-cost/low-security components forming the overarching 5th generation wireless infrastructure.

  4. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available MontyLingua, an integral part of ConceptNet, which is currently the largest commonsense knowledge base, is an English text processor developed using the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing, from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper aims to review the use of and roles played by MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed a diversified use of MontyLingua in many different areas, both generic and domain-specific. Although use of the text summarization component was not observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.

  5. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    Science.gov (United States)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While many data logging options currently exist for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share their data. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols.
(2) information standardization: our system supports several popular communication protocols and data formats, and (3) extensible data support: our system works with several existing data storage systems, data models and web based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google's Application Engine that allows scientists to use Google's cloud computing cyber-infrastructure. We provide a simple, yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future and results from current deployments that continue to drive the design of our system.

  6. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao; Yang, Yuli; Aissa, Sonia

    2012-01-01

    the optimal boundary points in closed form to choose the AMC transmission modes by taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results

  7. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza; Aissa, Sonia

    2011-01-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality

  8. Defense Computers: DOD Y2K Functional End-to-End Testing Progress and Test Event Management

    National Research Council Canada - National Science Library

    1999-01-01

    ... (DOD) which relies on a complex and broad array of interconnected computer systems-including weapons, command and control, satellite, inventory management, transportation management, health, financial...

  9. An End-to-End Modeling and Simulation Testbed (EMAST) to Support Detailed Quantitative Evaluations of GIG Transport Services

    National Research Council Canada - National Science Library

    Comparetto, G; Schult, N; Mirhakkak, M; Chen, L; Wade, R; Duffalo, S

    2005-01-01

    .... A variety of services must be provided to the users including management of resources to support QoS, a transition path from IPv4 to IPv6, and efficient networking across heterogeneous networks (i.e...

  10. Generic Black-Box End-to-End Attack Against State of the Art API Call Based Malware Classifiers

    OpenAIRE

    Rosenberg, Ishai; Shabtai, Asaf; Rokach, Lior; Elovici, Yuval

    2017-01-01

    In this paper, we present a black-box attack against API call based machine learning malware classifiers, focusing on generating adversarial sequences combining API calls and static features (e.g., printable strings) that will be misclassified by the classifier without affecting the malware functionality. We show that this attack is effective against many classifiers due to the transferability principle between RNN variants, feed forward DNNs, and traditional machine learning classifiers such...

  11. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.; Read, Daniel J.; Kouloumasis, Dimitris; Kocen, Rok; Zhuge, Flanco; Bailly, Christian; Hadjichristidis, Nikolaos; Likhtman, Alexei E.

    2017-01-01

    of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We

  12. Future Mission Trends and their Implications for the Deep Space Network

    Science.gov (United States)

    Abraham, Douglas S.

    2006-01-01

    Planning for the upgrade and/or replacement of Deep Space Network (DSN) assets that typically operate for forty or more years necessitates understanding potential customer needs as far into the future as possible. This paper describes the methodology DSN planners use to develop this understanding, some key future mission trends that have emerged from application of this methodology, and the implications of these trends for the DSN's future evolution. For NASA's current plans out to 2030, these trends suggest the need to accommodate: three times as many communication links, downlink rates two orders of magnitude greater than today's, uplink rates some four orders of magnitude greater, and end-to-end link difficulties two to three orders of magnitude greater. To meet these challenges, both DSN capacity and capability will need to increase.

  13. Results from the NASA Spacecraft Fault Management Workshop: Cost Drivers for Deep Space Missions

    Science.gov (United States)

    Newhouse, Marilyn E.; McDougal, John; Barley, Bryan; Stephens, Karen; Fesq, Lorraine M.

    2010-01-01

    Fault Management, the detection of and response to in-flight anomalies, is a critical aspect of deep-space missions. Fault management capabilities are commonly distributed across flight and ground subsystems, impacting hardware, software, and mission operations designs. The National Aeronautics and Space Administration (NASA) Discovery & New Frontiers (D&NF) Program Office at Marshall Space Flight Center (MSFC) recently studied cost overruns and schedule delays for five missions. The goal was to identify the underlying causes for the overruns and delays, and to develop practical mitigations to assist the D&NF projects in identifying potential risks and controlling the associated impacts to proposed mission costs and schedules. The study found that four out of the five missions studied had significant overruns due to underestimating the complexity and support requirements for fault management. As a result of this and other recent experiences, the NASA Science Mission Directorate (SMD) Planetary Science Division (PSD) commissioned a workshop to bring together invited participants across government, industry, and academia to assess the state of the art in fault management practice and research, identify current and potential issues, and make recommendations for addressing these issues. The workshop was held in New Orleans in April of 2008. The workshop concluded that fault management is not being limited by technology, but rather by a lack of emphasis and discipline in both the engineering and programmatic dimensions. Some of the areas cited in the findings include different, conflicting, and changing institutional goals and risk postures; unclear ownership of end-to-end fault management engineering; inadequate understanding of the impact of mission-level requirements on fault management complexity; and practices, processes, and tools that have not kept pace with the increasing complexity of mission requirements and spacecraft systems. This paper summarizes the

  14. Mission Exploitation Platform PROBA-V

    Science.gov (United States)

    Goor, Erwin

    2016-04-01

    VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near-real-time data. From November 2015 an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g.:
    - A time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest.
    - Full-resolution viewing services for the complete data archive.
    - On-demand processing chains, e.g. for the calculation of N-daily composites.
    - A Virtual Machine with access to the data archive and tools to work with this data, e.g. various toolboxes and support for R and Python.
    After an initial release in January 2016, a research platform will gradually be deployed, allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data and hence open the large time series to a larger
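
    The N-daily composites mentioned above are typically maximum-value composites, which reduce cloud contamination by keeping the best observation per window; a simplified sketch:

```python
import numpy as np

def ndaily_composite(stack, n=10):
    """Maximum-value composite over n-day windows, the classic way to build
    cloud-reduced N-daily products from daily vegetation imagery (a
    simplified illustration of this kind of on-demand processing chain).

    stack : array (T, H, W) of daily images (NaN where no valid pixel)
    Returns an array (ceil(T/n), H, W) of per-window maxima.
    """
    stack = np.asarray(stack, dtype=float)
    t = stack.shape[0]
    out = [np.nanmax(stack[i:i + n], axis=0) for i in range(0, t, n)]
    return np.stack(out)
```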

  15. Science Parametrics for Missions to Search for Earth-like Exoplanets by Direct Imaging

    Science.gov (United States)

    Brown, Robert A.

    2015-01-01

We use Nt, the number of exoplanets observed in time t, as a science metric to study direct-search missions like Terrestrial Planet Finder. In our model, N has 27 parameters, divided into three categories: 2 astronomical, 7 instrumental, and 18 science-operational. For various "27-vectors" of those parameters chosen to explore parameter space, we compute design reference missions to estimate Nt. Our treatment includes the recovery of completeness c after a search observation, for revisits, solar and antisolar avoidance, observational overhead, and follow-on spectroscopy. Our baseline 27-vector has aperture D = 16 m, inner working angle IWA = 0.039'', mission time t = 0-5 yr, occurrence probability for Earth-like exoplanets η = 0.2, and typical values for the remaining 23 parameters. For the baseline case, a typical five-year design reference mission has an input catalog of ~4700 stars with nonzero completeness, ~1300 unique stars observed in ~2600 observations, of which ~1300 are revisits, and it produces N1 ~ 50 exoplanets after one year and N5 ~ 130 after five years. We explore offsets from the baseline for 10 parameters. We find that N depends strongly on IWA and only weakly on D. It also depends only weakly on zodiacal light for Z < 50 zodis, end-to-end efficiency for h > 0.2, and scattered starlight for ζ < 10⁻¹⁰. We find that observational overheads, completeness recovery and revisits, solar and antisolar avoidance, and follow-on spectroscopy are all important factors in estimating N.
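The science metric described above lends itself to a toy illustration: given per-star completeness values c_i and an occurrence probability η, a single-visit yield estimate is N ≈ η Σ c_i. The catalog values below are invented for illustration; the paper's full 27-parameter model additionally handles revisits, overheads, avoidance zones, and follow-on spectroscopy.

```python
# Toy single-visit yield estimate: N ~ eta * sum(c_i), where c_i is the
# completeness of each star's search observation and eta is the occurrence
# probability of Earth-like exoplanets. Illustrative only -- not the
# paper's full design-reference-mission computation.

def expected_yield(completeness, eta):
    """Expected number of planet detections for one search visit per star."""
    return eta * sum(completeness)

# Hypothetical 5-star catalog with assumed completeness values.
catalog = [0.8, 0.6, 0.5, 0.3, 0.1]
n_expected = expected_yield(catalog, eta=0.2)  # eta = 0.2 as in the baseline
```

Scaling this to the baseline's ~4700-star input catalog, with revisit credit for completeness recovery, is what drives the quoted N5 ~ 130.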

  16. Insight into the Physical and Dynamical Processes that Control Rapid Increases in Total Flash Rate

    Science.gov (United States)

    Schultz, Christopher J.; Carey, Lawrence D.; Schultz, Elise V.; Blakeslee, Richard J.; Goodman, Steven J.

    2015-01-01

    Rapid increases in total lightning (also termed "lightning jumps") have been observed for many decades. Lightning jumps have been well correlated to severe and hazardous weather occurrence. The main focus of lightning jump work has been on the development of lightning algorithms to be used in real-time assessment of storm intensity. However, in these studies it is typically assumed that the updraft "increases" without direct measurements of the vertical motion, or specification of which updraft characteristic actually increases (e.g., average speed, maximum speed, or convective updraft volume). Therefore, an end-to-end physical and dynamical basis for coupling rapid increases in total flash rate to increases in updraft speed and volume must be understood in order to ultimately relate lightning occurrence to severe storm metrics. Herein, we use polarimetric, multi-Doppler, and lightning mapping array measurements to provide physical context as to why rapid increases in total lightning are closely tied to severe and hazardous weather.

  17. NASA's Preparations for ESA's L3 Gravitational Wave Mission

    Science.gov (United States)

    Stebbins, Robin

    2016-01-01

    Telescope Subsystem - Jeff Livas (GSFC): Demonstrate pathlength stability, straylight and manufacturability. Phase Measurement System - Bill Klipstein (JPL): Key measurement functions demonstrated. Incorporate full flight functionality. Laser Subsystem - Jordan Camp (GSFC): ECL master oscillator, phase noise of fiber power amplifier, demonstrate end-to-end performance in integrated system, lifetime. Micronewton Thrusters - John Ziemer (JPL): Propellant storage and distribution, system robustness, manufacturing yield, lifetime. Arm-locking Demonstration - Kirk McKenzie (JPL): Studying a demonstration of laser frequency stabilization with GRACE Follow-On. Torsion Pendulum - John Conklin (UF): Develop U.S. capability with GRS and torsion pendulum test bed. Multi-Axis Heterodyne Interferometry - Ira Thorpe (GSFC): Investigate test mass/optical bench interface. UV LEDs - John Conklin+ (UF): Flight qualify UV LEDs to replace mercury lamps in discharging system. Optical Bench - Guido Mueller (UF): Investigate alternate designs and fabrication processes to ease manufacturability. LISA researchers at JPL are leading the Laser Ranging Interferometer instrument on the GRACE Follow-On mission.

  18. GOCE gravity field simulation based on actual mission scenario

    Science.gov (United States)

    Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.

    2009-04-01

In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the processing of a spherical harmonic model of the Earth's gravity field and the corresponding full variance-covariance matrix from the precise GOCE orbit and calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach a combination of several processing strategies for the optimum exploitation of the information content of the GOCE data has been set up: The Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data. In the Core Solver processing a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before real GOCE data become available, the expected GOCE gravity field performance is evaluated by means of a realistic numerical case study based on the actual GOCE orbit and mission scenario and on simulation data stemming from the most recent ESA end-to-end simulation. Results from this simulation as well as recently developed features of the software system are presented. Additionally, some aspects of data combination with complementary data sources are addressed.
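At its algebraic core, solving the normal equation systems mentioned above is a least-squares solve of N x = b with N = AᵀA and b = Aᵀy, whose inverse N⁻¹ is the full variance-covariance matrix. A minimal dense NumPy sketch of that algebra follows; it is a toy 3-parameter, noise-free problem, with none of the Core Solver's parallelism or real gradiometry data:

```python
import numpy as np

# Least-squares via normal equations: N x = b, N = A^T A, b = A^T y.
# The real Core Solver assembles N for thousands of spherical-harmonic
# coefficients on a PC cluster; this toy problem has 3 parameters.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))      # design matrix (observations x params)
x_true = np.array([1.0, -2.0, 0.5])    # "true" coefficients (made up)
y = A @ x_true                         # noise-free synthetic observations

N = A.T @ A                            # normal matrix (symmetric pos. def.)
b = A.T @ y                            # right-hand side
L = np.linalg.cholesky(N)              # Cholesky factor, N = L L^T
x_hat = np.linalg.solve(L.T, np.linalg.solve(L, b))  # forward/back substitution

cov = np.linalg.inv(N)                 # full variance-covariance matrix
```

The Cholesky route is the standard choice here because the normal matrix is symmetric positive definite.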

  19. Intelligent Mission Controller Node

    National Research Council Canada - National Science Library

    Perme, David

    2002-01-01

The goal of the Intelligent Mission Controller Node (IMCN) project was to improve the process of translating mission taskings between real-world Command, Control, Communications, Computers, and Intelligence (C4I...

  20. Critical Robotic Lunar Missions

    Science.gov (United States)

    Plescia, J. B.

    2018-04-01

    Perhaps the most critical missions to understanding lunar history are in situ dating and network missions. These would constrain the volcanic and thermal history and interior structure. These data would better constrain lunar evolution models.

  1. Dukovany ASSET mission preparation

    Energy Technology Data Exchange (ETDEWEB)

Kouklik, I [NPP Dukovany (Czech Republic)]

    1997-12-31

We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they would be helpful to other plants that host ASSET missions in the future.

  2. Dukovany ASSET mission preparation

    International Nuclear Information System (INIS)

    Kouklik, I.

    1996-01-01

We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they would be helpful to other plants that host ASSET missions in the future.

  3. Computer graphics aid mission operations. [NASA missions

    Science.gov (United States)

    Jeletic, James F.

    1990-01-01

The application of computer graphics techniques in NASA space missions is reviewed. Telemetric monitoring of the Space Shuttle and its components is discussed, noting the use of computer graphics for real-time visualization problems in the retrieval and repair of the Solar Maximum Mission. The use of the world map display for determining a spacecraft's location above the earth and the problem of verifying the relative position and orientation of spacecraft to celestial bodies are examined. The Flight Dynamics/STS Three-dimensional Monitoring System and the Trajectory Computations and Orbital Products System world map display are described, emphasizing Space Shuttle applications. Also, consideration is given to the development of monitoring systems such as the Shuttle Payloads Mission Monitoring System and the Attitude Heads-Up Display and the use of the NASA-Goddard Two-dimensional Graphics Monitoring System during Shuttle missions and to support the Hubble Space Telescope.

  4. Mobile Ad Hoc Networks in Bandwidth-Demanding Mission-Critical Applications: Practical Implementation Insights

    KAUST Repository

    Bader, Ahmed

    2016-09-28

There has recently been a growing trend of using live video feeds in mission-critical applications. Real-time video streaming from front-end personnel or mobile agents is believed to substantially improve situational awareness in mission-critical operations such as disaster relief, law enforcement, and emergency response. Mobile Ad Hoc Networks (MANETs) are a natural contender in such contexts. However, classical MANET routing schemes fall short in terms of scalability, bandwidth and latency, all three metrics being quite essential for mission-critical applications. As such, autonomous cooperative routing (ACR) has gained traction as the most viable MANET proposition. Nonetheless, ACR is also associated with a few implementation challenges which, if left unaddressed, would render ACR practically useless. In this paper, efficient and low-complexity remedies to those issues are presented, analyzed, and validated. The validation is based on field experiments carried out using software-defined radio (SDR) platforms. Compared to classical MANET routing schemes, ACR was shown to offer up to 2X better throughput and more than 4X reduction in end-to-end latency, while observing a given target of transport rate normalized to energy consumption.

  5. The STEREO Mission

    CERN Document Server

    2008-01-01

The STEREO mission uses twin heliospheric orbiters to track solar disturbances from their initiation to 1 AU. This book documents the mission, its objectives, the spacecraft that execute it and the instruments that provide the measurements, both remote sensing and in situ. This mission promises to unlock many of the mysteries of how the Sun produces what has come to be known as space weather.

  6. VEGA Space Mission

    Science.gov (United States)

    Moroz, V.; Murdin, P.

    2000-11-01

    VEGA (mission) is a combined spacecraft mission to VENUS and COMET HALLEY. It was launched in the USSR at the end of 1984. The mission consisted of two identical spacecraft VEGA 1 and VEGA 2. VEGA is an acronym built from the words `Venus' and `Halley' (`Galley' in Russian spelling). The basic design of the spacecraft was the same as has been used many times to deliver Soviet landers and orbiter...

  7. The EXIST Mission Concept Study

    Science.gov (United States)

    Fishman, Gerald J.; Grindlay, J.; Hong, J.

    2008-01-01

EXIST is a mission designed to find and study black holes (BHs) over a wide range of environments and masses, including: 1) BHs accreting from binary companions or dense molecular clouds throughout our Galaxy and the Local Group, 2) supermassive black holes (SMBHs) lying dormant in galaxies that reveal their existence by disrupting passing stars, 3) SMBHs that are hidden from our view at lower energies due to obscuration by the gas that they accrete, and 4) the birth of stellar-mass BHs, which is accompanied by long cosmic gamma-ray bursts (GRBs), which are seen several times a day and may be associated with the earliest stars to form in the Universe. EXIST will provide an order of magnitude increase in sensitivity and angular resolution as well as greater spectral resolution and bandwidth compared with earlier hard X-ray survey telescopes. With an onboard optical-infrared (IR) telescope, EXIST will measure the spectra and redshifts of GRBs and assess their utility as cosmological probes of the highest-z universe and the epoch of reionization. The mission would retain its primary goal of being the Black Hole Finder Probe in the Beyond Einstein Program. However, the new design for EXIST proposed to be studied here represents a significant advance from its previous incarnation as presented to BEPAC. The mission is now less than half the total mass, would be launched on the smallest EELV available (Atlas V-401) for a Medium Class mission, and most importantly includes a two-telescope complement that is ideally suited for the study of both obscured and very distant BHs. EXIST retains its very wide field hard X-ray imaging High Energy Telescope (HET) as the primary instrument, now with improved angular and spectral resolution, and in a more compact payload that allows occasional rapid slews for immediate optical/IR imaging and spectra of GRBs and AGN as well as enhanced hard X-ray spectra and timing with pointed observations. The mission would conduct a 2 year full sky survey in

  8. Proba-V Mission Exploitation Platform

    Science.gov (United States)

    Goor, Erwin; Dries, Jeroen

    2017-04-01

VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015 an operational Proba-V MEP environment, as an ESA operations service, has been gradually deployed at the VITO data center with direct access to the complete data archive. The platform has been operational since autumn 2016, and several applications have been released to the users, e.g. - A time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest. - Full-resolution viewing services for the complete data archive. - On-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites. - Virtual Machines can be provided with access to the data archive and tools to work with these data, e.g. various toolboxes (GDAL, QGIS, GrassGIS, SNAP toolbox, …) and support for R and Python. This allows users to immediately work with the data without having to install tools or download data, as well as to design, debug and test applications on the platform. - A prototype of Jupyter Notebooks is available with some examples worked out to show the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data, and to develop/host data analysis toolboxes. In parallel the platform is further improved and extended. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will
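An N-daily composite reduces a stack of daily images to one image per N-day window. The sketch below assumes a simple per-pixel maximum-value rule, which is a common choice for vegetation products but not necessarily the MEP's actual compositing chain; the array shapes and names are illustrative.

```python
import numpy as np

# Maximum-value compositing over N-day windows: one (assumed) rule for
# building N-daily composites from a stack of daily images.
def n_daily_composite(stack, n):
    """stack: (days, rows, cols) array; returns (ceil(days/n), rows, cols)."""
    days = stack.shape[0]
    out = []
    for start in range(0, days, n):
        window = stack[start:start + n]
        out.append(np.nanmax(window, axis=0))  # per-pixel max over the window
    return np.stack(out)

# 10 fake daily 2x2 images; values grow with the day index.
daily = np.arange(10 * 2 * 2, dtype=float).reshape(10, 2, 2)
composite = n_daily_composite(daily, n=5)  # two 5-daily composites
```

On the platform the same reduction would run distributed over the archive (e.g. as a Spark job) rather than in-memory as here.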

  9. Mission of Mercy.

    Science.gov (United States)

    Humenik, Mark

    2014-01-01

    Some dentists prefer solo charity work, but there is much to be said for collaboration within the profession in reaching out to those who are dentally underserved. Mission of Mercy (MOM) programs are regularly organized across the country for this purpose. This article describes the structure, reach, and personal satisfaction to be gained from such missions.

  10. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    Science.gov (United States)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  11. The Science and Technology of Future Space Missions

    Science.gov (United States)

    Bonati, A.; Fusi, R.; Longoni, F.

    1999-12-01

processing. Powerful computers with customized architectures are designed and developed. High-speed intercommunication networks are studied and tested. In parallel to the hardware research activities, software development is undertaken for several purposes: digital and video compression algorithms, payload and spacecraft control and diagnostics, scientific processing algorithms, etc. In addition, embedded Java virtual machines are studied for tele-science applications (direct link between the scientist console and the scientific payload). At the system engineering level, the demand for spacecraft autonomy has increased for planetology missions: reliable intelligent systems that can operate for long periods of time without human intervention from the ground are requested and investigated. A technologically challenging but less glamorous area of development is the laboratory equipment for end-to-end testing (on ground) of payload instruments. The main fields are cryogenics, laser and X-ray optics, microwave radiometry, and UV and infrared testing systems.

  12. GPS Navigation for the Magnetospheric Multi-Scale Mission

    Science.gov (United States)

    Bamford, William; Mitchell, Jason; Southward, Michael; Baldwin, Philip; Winternitz, Luke; Heckler, Gregory; Kurichh, Rishi; Sirotzky, Steve

    2009-01-01

utilizing a TDMA schedule to distribute a science-quality message to all constellation members every ten seconds. Additionally, the system generates one-way range measurements between formation members, which are used as input to the Kalman filter. In preparation for the MMS Preliminary Design Review (PDR), the Navigator was required to pass a series of Technology Readiness Level (TRL) tests to earn the necessary TRL-6 classification. TRL-6 is achieved by demonstrating a prototype unit in a relevant end-to-end environment. The IRAS unit was able to meet all requirements during the testing phase, and has thus been TRL-6 qualified.
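The role of one-way range measurements as Kalman filter input can be illustrated with a minimal scalar filter that estimates a fixed inter-spacecraft separation from noisy ranges. This is a generic textbook update, not the Navigator's actual filter (which estimates full orbital states); every number below is invented.

```python
# Minimal scalar Kalman filter fusing noisy one-way range measurements of
# a static inter-spacecraft separation. Generic illustration only.
def kalman_range_filter(ranges, r_var, x0, p0):
    """ranges: noisy range measurements; r_var: measurement variance;
    x0, p0: initial state estimate and its variance. Returns (x, p)."""
    x, p = x0, p0
    for z in ranges:
        k = p / (p + r_var)    # Kalman gain (static state, H = 1)
        x = x + k * (z - x)    # measurement update
        p = (1.0 - k) * p      # variance update
    return x, p

# Hypothetical ranges (km) around a true separation of ~100 km.
x, p = kalman_range_filter([101.0, 99.0, 100.5, 99.5],
                           r_var=1.0, x0=90.0, p0=100.0)
```

Each measurement pulls the estimate toward the data and shrinks the state variance, which is why crosslink ranging tightens the relative navigation solution.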

  13. EUCLID mission design

    Science.gov (United States)

    Wallner, Oswald; Ergenzinger, Klaus; Tuttle, Sean; Vaillon, L.; Johann, Ulrich

    2017-11-01

EUCLID, a medium-class mission candidate of ESA's Cosmic Vision 2015-2025 Program, currently in Definition Phase (Phase A/B1), shall map the geometry of the Dark Universe by investigating dark matter distributions, the distance-redshift relationship, and the evolution of cosmic structures. EUCLID consists of a 1.2 m telescope and two scientific instruments for ellipticity and redshift measurements in the visible and near-infrared wavelength regime. We present a design concept of the EUCLID mission which is fully compliant with the mission requirements. Preliminary concepts of the spacecraft and of the payload including the scientific instruments are discussed.

  14. Rapid Mission Assurance Assessment via Sociotechnical Modeling and Simulation

    Science.gov (United States)

    2015-05-01


  15. PLA Missions Beyond Taiwan

    National Research Council Canada - National Science Library

    Miller, Marc

    2008-01-01

    KEY INSIGHTS: *The PLA is being assigned and training for an increasing variety of missions, including nontraditional battlefields such as outer space and cyber space, as well as nontraditional functions...

  16. Human exploration mission studies

    Science.gov (United States)

    Cataldo, Robert L.

    1989-01-01

The Office of Exploration has established a process whereby all NASA field centers and other NASA Headquarters offices participate in the formulation and analysis of a wide range of mission strategies. These strategies were manifested into specific scenarios or candidate case studies. The case studies provided a systematic approach to analyzing each mission element. First, each case study must address several major themes and rationale including: national pride and international prestige, advancement of scientific knowledge, a catalyst for technology, economic benefits, space enterprise, international cooperation, and education and excellence. Second, the set of candidate case studies are formulated to encompass the technology requirement limits in the life sciences, launch capabilities, space transfer, automation, and robotics in space operations, power, and propulsion. The first set of reference case studies identify three major strategies: human expeditions, science outposts, and evolutionary expansion. During the past year, four case studies were examined to explore these strategies. The expeditionary missions include the Human Expedition to Phobos and Human Expedition to Mars case studies. The Lunar Observatory and Lunar Outpost to Early Mars Evolution case studies examined the latter two strategies. This set of case studies established the framework to perform detailed mission analysis and system engineering to define a host of concepts and requirements for various space systems and advanced technologies. The details of each mission are described and, specifically, the results affecting the advanced technologies required to accomplish each mission scenario are presented.

  17. Missions to Venus

    Science.gov (United States)

    Titov, D. V.; Baines, K. H.; Basilevsky, A. T.; Chassefiere, E.; Chin, G.; Crisp, D.; Esposito, L. W.; Lebreton, J.-P.; Lellouch, E.; Moroz, V. I.; Nagy, A. F.; Owen, T. C.; Oyama, K.-I.; Russell, C. T.; Taylor, F. W.; Young, R. E.

    2002-10-01

    Venus has always been a fascinating objective for planetary studies. At the beginning of the space era Venus became one of the first targets for spacecraft missions. Our neighbour in the solar system and, in size, the twin sister of Earth, Venus was expected to be very similar to our planet. However, the first phase of Venus spacecraft exploration in 1962-1992 by the family of Soviet Venera and Vega spacecraft and US Mariner, Pioneer Venus, and Magellan missions discovered an entirely different, exotic world hidden behind a curtain of dense clouds. These studies gave us a basic knowledge of the conditions on the planet, but generated many more questions concerning the atmospheric composition, chemistry, structure, dynamics, surface-atmosphere interactions, atmospheric and geological evolution, and the plasma environment. Despite all of this exploration by more than 20 spacecraft, the "morning star" still remains a mysterious world. But for more than a decade Venus has been a "forgotten" planet with no new missions featuring in the plans of the world space agencies. Now we are witnessing the revival of interest in this planet: the Venus Orbiter mission is approved in Japan, Venus Express - a European orbiter mission - has successfully passed the selection procedure in ESA, and several Venus Discovery proposals are knocking at the doors of NASA. The paper presents an exciting story of Venus spacecraft exploration, summarizes open scientific problems, and builds a bridge to the future missions.

  18. A pilot biomedical engineering course in rapid prototyping for mobile health.

    Science.gov (United States)

    Stokes, Todd H; Venugopalan, Janani; Hubbard, Elena N; Wang, May D

    2013-01-01

    Rapid prototyping of medically assistive mobile devices promises to fuel innovation and provides opportunity for hands-on engineering training in biomedical engineering curricula. This paper presents the design and outcomes of a course offered during a 16-week semester in Fall 2011 with 11 students enrolled. The syllabus covered a mobile health design process from end-to-end, including storyboarding, non-functional prototypes, integrated circuit programming, 3D modeling, 3D printing, cloud computing database programming, and developing patient engagement through animated videos describing the benefits of a new device. Most technologies presented in this class are open source and thus provide unlimited "hackability". They are also cost-effective and easily transferrable to other departments.

  19. SCIENCE PARAMETRICS FOR MISSIONS TO SEARCH FOR EARTH-LIKE EXOPLANETS BY DIRECT IMAGING

    International Nuclear Information System (INIS)

    Brown, Robert A.

    2015-01-01

We use Nt, the number of exoplanets observed in time t, as a science metric to study direct-search missions like Terrestrial Planet Finder. In our model, N has 27 parameters, divided into three categories: 2 astronomical, 7 instrumental, and 18 science-operational. For various "27-vectors" of those parameters chosen to explore parameter space, we compute design reference missions to estimate Nt. Our treatment includes the recovery of completeness c after a search observation, for revisits, solar and antisolar avoidance, observational overhead, and follow-on spectroscopy. Our baseline 27-vector has aperture D = 16 m, inner working angle IWA = 0.039'', mission time t = 0-5 yr, occurrence probability for Earth-like exoplanets η = 0.2, and typical values for the remaining 23 parameters. For the baseline case, a typical five-year design reference mission has an input catalog of ∼4700 stars with nonzero completeness, ∼1300 unique stars observed in ∼2600 observations, of which ∼1300 are revisits, and it produces N1 ∼ 50 exoplanets after one year and N5 ∼ 130 after five years. We explore offsets from the baseline for 10 parameters. We find that N depends strongly on IWA and only weakly on D. It also depends only weakly on zodiacal light for Z < 50 zodis, end-to-end efficiency for h > 0.2, and scattered starlight for ζ < 10⁻¹⁰. We find that observational overheads, completeness recovery and revisits, solar and antisolar avoidance, and follow-on spectroscopy are all important factors in estimating N.

  20. Packaging a successful NASA mission to reach a large audience within a small budget. Earth's Dynamic Space: Solar-Terrestrial Physics & NASA's Polar Mission

    Science.gov (United States)

    Fox, N. J.; Goldberg, R.; Barnes, R. J.; Sigwarth, J. B.; Beisser, K. B.; Moore, T. E.; Hoffman, R. A.; Russell, C. T.; Scudder, J.; Spann, J. F.; Newell, P. T.; Hobson, L. J.; Gribben, S. P.; Obrien, J. E.; Menietti, J. D.; Germany, G. G.; Mobilia, J.; Schulz, M.

    2004-12-01

To showcase the on-going and wide-ranging scope of the Polar science discoveries, the Polar science team has created a one-stop shop for a thorough introduction to geospace physics, in the form of a DVD with supporting website. The DVD, Earth's Dynamic Space: Solar-Terrestrial Physics & NASA's Polar Mission, can be viewed as an end-to-end product or split into individual segments and tailored to lesson plans. Capitalizing on the Polar mission and its amazing science return, the Polar team created an exciting multi-use DVD intended for audiences ranging from a traditional classroom and after-school clubs, to museums and science centers. The DVD tackles subjects such as the aurora, the magnetosphere and space weather, whilst highlighting the science discoveries of the Polar mission. This platform introduces the learner to key team members as well as the science principles. Dramatic visualizations are used to illustrate the complex principles that describe Earth's dynamic space. In order to produce such a wide-ranging product on a shoe-string budget, the team pored through existing NASA resources to package them into the Polar story, and visualizations were created using Polar data to complement the NASA stock footage. Scientists donated their time to create and review scripts in order to make this a real team effort, working closely with the award-winning audio-visual group at JHU/Applied Physics Laboratory. The team was excited to be invited to join NASA's Sun-Earth Day 2005 E/PO program and the DVD will be distributed as part of the supporting educational packages.

  1. Mission operations technology

    Science.gov (United States)

    Varsi, Giulio

    In the last decade, the operation of a spacecraft after launch has emerged as a major component of the total cost of the mission. This trend is sustained by the increasing complexity, flexibility, and data gathering capability of the space assets and by their greater reliability and consequent longevity. The trend can, however, be moderated by the progressive transfer of selected functions from the ground to the spacecraft and by application, on the ground, of new technology. Advances in ground operations derive from the introduction in the mission operations environment of advanced microprocessor-based workstations in the class of a few million instructions per second and from the selective application of artificial intelligence technology. In the last few years a number of these applications have been developed, tested in operational settings and successfully demonstrated to users. Some are now being integrated in mission operations facilities. An analysis of mission operations indicates that the key areas are: concurrent control of multiple missions; automated/interactive production of command sequences of high integrity at low cost; automated monitoring of spacecraft health and automated aides for fault diagnosis; automated allocation of resources; automated processing of science data; and high-fidelity, high-speed spacecraft simulation. Examples of major advances in selected areas are described.

  2. NASA CYGNSS Tropical Cyclone Mission

    Science.gov (United States)

    Ruf, Chris; Atlas, Robert; Majumdar, Sharan; Ettammal, Suhas; Waliser, Duane

    2017-04-01

    The NASA Cyclone Global Navigation Satellite System (CYGNSS) mission consists of a constellation of eight microsatellites that were launched into low-Earth orbit on 15 December 2016. Each observatory carries a four-channel bistatic scatterometer receiver to measure near surface wind speed over the ocean. The transmitter half of the scatterometer is the constellation of GPS satellites. CYGNSS is designed to address the inadequacy in observations of the inner core of tropical cyclones (TCs) that result from two causes: 1) much of the TC inner core is obscured from conventional remote sensing instruments by intense precipitation in the eye wall and inner rain bands; and 2) the rapidly evolving (genesis and intensification) stages of the TC life cycle are poorly sampled in time by conventional polar-orbiting, wide-swath surface wind imagers. The retrieval of wind speed by CYGNSS in the presence of heavy precipitation is possible due to the long operating wavelength used by GPS (19 cm), at which scattering and attenuation by rain are negligible. Improved temporal sampling by CYGNSS is possible due to the use of eight spacecraft with 4 scatterometer channels on each one. Median and mean revisit times everywhere in the tropics are 3 and 7 hours, respectively. Wind speed referenced to 10m height above the ocean surface is retrieved from CYGNSS measurements of bistatic radar cross section in a manner roughly analogous to that of conventional ocean wind scatterometers. The technique has been demonstrated previously from space by the UK-DMC and UK-TDS missions. Wind speed is retrieved with 25 km spatial resolution and an uncertainty of 2 m/s at low wind speeds and 10% at wind speeds above 20 m/s. Extensive simulation studies conducted prior to launch indicate that there will be a significant positive impact on TC forecast skill for both track and intensity with CYGNSS measurements assimilated into HWRF numerical forecasts. 
Simulations of CYGNSS spatial and temporal sampling
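The stated retrieval requirement (2 m/s at low wind speeds, 10% at and above 20 m/s) amounts to a simple piecewise uncertainty model; this sketch just encodes those two numbers and is illustrative, not part of the mission's retrieval software:

```python
def cygnss_wind_uncertainty(wind_speed_ms: float) -> float:
    """Piecewise retrieval-uncertainty model from the stated requirement:
    2 m/s below 20 m/s, 10% of wind speed at and above 20 m/s."""
    if wind_speed_ms < 20.0:
        return 2.0
    return 0.10 * wind_speed_ms
```

Note that the two branches meet continuously at 20 m/s, where 10% of the wind speed is exactly 2 m/s.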

  3. Teamwork Reasoning and Multi-Satellite Missions

    Science.gov (United States)

    Marsella, Stacy C.; Plaunt, Christian (Technical Monitor)

    2002-01-01

    NASA is rapidly moving towards the use of spatially distributed multiple satellites operating in near Earth orbit and Deep Space. Effective operation of such multi-satellite constellations raises many key research issues. In particular, the satellites will be required to cooperate with each other as a team that must achieve common objectives with a high degree of autonomy from ground based operations. The multi-agent research community has made considerable progress in investigating the challenges of realizing such teamwork. In this report, we discuss some of the teamwork issues that will be faced by multi-satellite operations. The basis of the discussion is a particular proposed mission, the Magnetospheric MultiScale mission to explore Earth's magnetosphere. We describe this mission and then consider how multi-agent technologies might be applied in the design and operation of these missions. We consider the potential benefits of these technologies as well as the research challenges that will be raised in applying them to NASA multi-satellite missions. We conclude with some recommendations for future work.
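One teamwork issue raised above, maintaining joint commitments across a satellite team, can be sketched in miniature. This is a simplified illustration of joint-intention-style protocols (an agent that discovers the team goal is finished must notify teammates before dropping its own commitment), not the MMS design; all names are hypothetical:

```python
class TeamAgent:
    """Toy agent following a joint-intention-style commitment rule."""

    def __init__(self, name: str):
        self.name = name
        self.committed = True
        self.teammates: list["TeamAgent"] = []

    def observe_goal_achieved(self) -> None:
        # Joint-intention rule: communicate before abandoning the team goal,
        # so no teammate keeps working toward an already-achieved objective.
        for mate in self.teammates:
            mate.receive_notification(self.name)
        self.committed = False

    def receive_notification(self, sender: str) -> None:
        self.committed = False


# Wire up a three-satellite team; one satellite observes goal completion.
sats = [TeamAgent(f"sat{i}") for i in range(3)]
for s in sats:
    s.teammates = [m for m in sats if m is not s]
sats[0].observe_goal_achieved()
```

After the notification round, every agent has consistently dropped the commitment, which is the core guarantee such protocols provide.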

  4. Mission to the comets

    International Nuclear Information System (INIS)

    Hughes, D.

    1980-01-01

    The plans of space agencies in the United States and Europe for an exploratory comet mission including a one year rendezvous with comet Tempel-2 and a fast fly-by of comet Halley are discussed. The mission provides an opportunity to make comparative measurements on the two different types of comets and also satisfies the three major scientific objectives of cometary missions, namely: (1) To determine the chemical nature and the physical structure of cometary nuclei, and the changes that occur with time and orbital position. (2) To study the chemical and physical nature of the atmospheres and ionospheres of comets, the processes that occur in them, and their development with time and orbital position. (3) To determine the nature of the tails of comets and the processes by which they are formed, and to characterise the interaction of comets with the solar wind. (UK)

  5. Country programming mission. Namibia

    International Nuclear Information System (INIS)

    1991-01-01

    In response to a request from the Government of Namibia conveyed in a letter dated 29 November 1990, the IAEA provided a multi-disciplinary Programming Mission which visited Namibia from 15-19 July 1991. The terms of reference of the Mission were: 1. To assess the possibilities and benefits of nuclear energy applications in Namibia's development; 2. To advise on the infrastructure required for nuclear energy projects; 3. To assist in the formulation of project proposals which could be submitted for Agency assistance. This report is based on the findings of the Mission and falls into three sections with eight appendices. The first section is a country profile providing background information; the second section deals with sectorial needs and an institutional review of the sectors of agriculture including animal production, life sciences (nuclear medicine and radiotherapy) and radiation protection. The third section includes possible future technical co-operation activities.

  6. TRISTAN - mission complete

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The high energy physics mission of the TRISTAN electron-positron collider at the Japanese KEK Laboratory ended in May. TRISTAN was the first accelerator in Japan at the high energy frontier, and its success owes a great deal to help and encouragement from the world high energy physics community. Its success also marks the first step toward the KEKB project now underway and the subsequent Linear Collider scheme. TRISTAN began operation in November 1986 with a collision energy of 50 GeV, the world's highest electron-positron collision energy at that time. With the addition of superconducting radiofrequency cavities, the energy was continuously increased, reaching a maximum of 64 GeV in 1989. In this exploratory era, the three large detectors - AMY, TOPAZ and VENUS - together with the smaller SHIP group made a rapid survey of particle phenomena in this new energy range. The sixth ('top') quark was first on the list of wanted particles, but the three large groups concluded that there were no new quarks below 32 GeV. The CDF and D0 Collaborations at Fermilab's Tevatron recently reported the top quark as being six times as heavy as TRISTAN's physics reach. Although initial experimental results suggested that the event-shape distributions of multi-hadron events were broadly consistent with the production of the five known quarks, the production rate of hadrons, compared to muons, was seen to rise with energy. The increased energy reach of TRISTAN increased the visibility of the subtle virtual effects of the Z (the electrically neutral carrier of the weak force) produced through the interference of weak and electromagnetic interactions. The rise was found to be slightly larger than expected from five quarks and a Z mass of 92 or 93 GeV, the accepted value at that time. This hinted that the Z mass had to be smaller, as later verified when the SLC and LEP electron-positron colliders at SLAC (Stanford) and CERN respectively came into operation in 1989.

  7. MIV Project: Mission scenario

    DEFF Research Database (Denmark)

    Ravazzotti, Mariolina T.; Jørgensen, John Leif; Thuesen, Gøsta

    1997-01-01

    Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of rendez-vous and docking of unmanned spacecraft, a mission scenario was defined. This report describes the sequence of manoeuvres and task allocations for such missions.

  8. Mars Stratigraphy Mission

    Science.gov (United States)

    Budney, C. J.; Miller, S. L.; Cutts, J. A.

    2000-01-01

    The Mars Stratigraphy Mission lands a rover on the surface of Mars that descends a cliff in Valles Marineris to study the stratigraphy. The rover carries a unique complement of instruments to analyze and age-date materials encountered during descent past 2 km of strata. The science objective for the Mars Stratigraphy Mission is to identify the geologic history of the layered deposits in the Valles Marineris region of Mars. This includes constraining the time interval for formation of these deposits by measuring the ages of various layers and determining the origin of the deposits (volcanic or sedimentary) by measuring their composition and imaging their morphology.

  9. The OICETS mission

    Science.gov (United States)

    Jono, Takashi; Arai, Katsuyoshi

    2017-11-01

    The Optical Inter-orbit Communications Engineering Test Satellite (OICETS) was successfully launched on 23 August 2005 and placed into a circular orbit at an altitude of 610 km. The main mission is to demonstrate free-space inter-satellite laser communications in cooperation with the Advanced Relay and Technology Mission (ARTEMIS) geostationary satellite developed by the European Space Agency. This paper presents an overview of OICETS and its laser terminal, a history of the international cooperation between the Japan Aerospace Exploration Agency (JAXA) and ESA, and typical results of the inter-orbit laser communication experiment carried out with ARTEMIS.
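The scale of a LEO-to-GEO optical link like OICETS-ARTEMIS can be illustrated with a free-space path-loss estimate. The 850 nm wavelength below is an assumption for illustration only (not the documented terminal wavelength), and the range is the simplest nadir-aligned geometry:

```python
import math

def free_space_loss_db(distance_m: float, wavelength_m: float) -> float:
    """Free-space path loss, 20*log10(4*pi*d/lambda), in dB."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength_m)

# Nadir-aligned LEO-to-GEO separation: GEO orbit radius minus LEO orbit radius.
leo_radius_km = 6371.0 + 610.0   # OICETS altitude from the abstract
geo_radius_km = 42164.0          # geostationary orbit radius
range_m = (geo_radius_km - leo_radius_km) * 1e3

loss_db = free_space_loss_db(range_m, 850e-9)  # roughly 294 dB
```

Losses near 300 dB are why such links need narrow laser beams and precise pointing rather than RF-style broad antenna patterns.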

  10. Robust UAV Mission Planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T.; Barros, A.I.; Monsuur, H.

    2014-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  11. Robust UAV mission planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T.; Barros, A.I.; Monsuur, H.

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a reconnaissance

  12. Robust UAV Mission Planning

    NARCIS (Netherlands)

    Evers, L.; Dollevoet, T; Barros, A.I.; Monsuur, H.

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  13. Robust UAV Mission Planning

    NARCIS (Netherlands)

    L. Evers (Lanah); T.A.B. Dollevoet (Twan); A.I. Barros (Ana); H. Monsuur (Herman)

    2011-01-01

    Unmanned Aerial Vehicles (UAVs) can provide significant contributions to information gathering in military missions. UAVs can be used to capture both full motion video and still imagery of specific target locations within the area of interest. In order to improve the effectiveness of a

  14. The Lobster Mission

    Science.gov (United States)

    Barthelmy, Scott

    2011-01-01

    I will give an overview of the Goddard Lobster mission: the science goals, the two instruments and their overall designs, with particular attention to the wide-field X-ray instrument (WFI), which uses lobster-eye-like micro-channel optics.

  15. Towards A Shared Mission

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen; Orth Gaarn-Larsen, Carsten

    A mission shared by stakeholders, management and employees is a prerequisite for an engaging dialog about the many and substantial changes and challenges currently facing universities. Too often this essential dialog reveals mistrust and misunderstandings about the role and outcome of the univer… on a shared mission aiming at value creation (in the broadest interpretation). One important aspect of choosing value as the cornerstone of the mission of universities is to stress that the outcome is measured by external stakeholders and by their standards. Most of the paper is devoted to discussing value… it possible to lead through processes that engage and excite while creating transparency and accountability. The paper will be illustrated with examples from Denmark and the Helios initiative taken by the Danish Academy of Technical Sciences (ATV) under the headline "The value creating university – courage…

  16. Titan Orbiter Aerorover Mission

    Science.gov (United States)

    Sittler Jr., E. C.; Acuna, M.; Burchell, M. J.; Coates, A.; Farrell, W.; Flasar, M.; Goldstein, B. E.; Gorevan, S.; Hartle, R. E.; Johnson, W. T. K.

    2001-01-01

    We propose a combined Titan orbiter and Titan Aerorover mission with an emphasis on both in situ and remote sensing measurements of Titan's surface, atmosphere, ionosphere, and magnetospheric interaction. The biological aspect of the Titan environment will be emphasized by the mission (i.e., a search for organic materials, which may range from simple organics to 'ammono' analogues of amino acids and possibly more complex species; lightning detection; and infrared, ultraviolet, and charged-particle interactions with Titan's surface and atmosphere). An international mission is assumed to control costs. NASA will provide the orbiter, launch vehicle, DSN coverage and operations, while international partners will provide the Aerorover and up to 30% of the cost for the scientific instruments through collaborative efforts. To further reduce costs we propose a single PI for orbiter science instruments and a single PI for Aerorover science instruments. This approach will provide a single command/data and power interface between the spacecraft and orbiter instruments, which will have a redundant central DPU and power converter for their instruments. A similar approach could be used for the Aerorover. The mission profile will be constructed to minimize conflicts between Aerorover science, orbiter radar science, orbiter radio science, orbiter imaging science, and orbiter fields and particles (FP) science. Additional information is contained in the original extended abstract.

  17. The LISA Pathfinder Mission

    International Nuclear Information System (INIS)

    Armano, M; Audley, H; Born, M; Danzmann, K; Diepholz, I; Auger, G; Binetruy, P; Baird, J; Bortoluzzi, D; Brandt, N; Fitzsimons, E; Bursi, A; Caleno, M; Cavalleri, A; Cesarini, A; Dolesi, R; Ferroni, V; Cruise, M; Dunbar, N; Ferraioli, L

    2015-01-01

    LISA Pathfinder (LPF), the second of the European Space Agency's Small Missions for Advanced Research in Technology (SMART), is a dedicated technology validation mission for future spaceborne gravitational wave detectors, such as the proposed eLISA mission. LISA Pathfinder, and its scientific payload - the LISA Technology Package - will test, in flight, the critical technologies required for low frequency gravitational wave detection: it will put two test masses in a near-perfect gravitational free-fall and control and measure their motion with unprecedented accuracy. This is achieved through technology comprising inertial sensors, high precision laser metrology, drag-free control and an ultra-precise micro-Newton propulsion system. LISA Pathfinder is due to be launched in mid-2015, with first results on the performance of the system being available 6 months thereafter. The paper introduces the LISA Pathfinder mission, followed by an explanation of the physical principles of the measurement concept and associated hardware. We then provide a detailed discussion of the LISA Technology Package, including both the inertial sensor and interferometric readout. As we approach the launch of the LISA Pathfinder, the focus of the development is shifting towards the science operations and data analysis - this is described in the final section of the paper.
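The drag-free concept described above can be illustrated with a toy one-dimensional control loop, in which the spacecraft thrusts to follow a free-falling test mass while rejecting a constant external disturbance such as solar radiation pressure. Gains, time step and disturbance magnitude are illustrative, not flight parameters:

```python
# 1-D drag-free sketch: spacecraft chases the test mass with a PD law.
dt = 0.1                   # s, integration step
kp, kd = 1.0, 2.0          # PD gains, chosen for a stable toy loop
disturbance = 1e-6         # m/s^2, constant push on the spacecraft only
x_sc = v_sc = 0.0          # spacecraft position and velocity
x_tm = v_tm = 0.0          # test mass in perfect free fall (no forces here)

for _ in range(20_000):    # 2000 s of simulated time
    # Thruster command from relative position and velocity errors.
    a_cmd = -kp * (x_sc - x_tm) - kd * (v_sc - v_tm)
    v_sc += (a_cmd + disturbance) * dt
    x_sc += v_sc * dt
    x_tm += v_tm * dt      # remains at rest in this toy model

gap = x_sc - x_tm          # settles near disturbance / kp
```

The loop settles to a small constant offset of disturbance/kp, showing how the controller converts an external force into a bounded tracking error rather than a growing drift.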

  18. The Gaia mission

    NARCIS (Netherlands)

    Collaboration, Gaia; Prusti, T.; de Bruijne, J. H. J.; Brown, A. G. A.; Vallenari, A.; Babusiaux, C.; Bailer-Jones, C. A. L.; Bastian, U.; Biermann, M.; Evans, D. W.; Eyer, L.; Jansen, F.; Jordi, C.; Klioner, S. A.; Lammers, U.; Lindegren, L.; Luri, X.; Mignard, F.; Milligan, D. J.; Panem, C.; Poinsignon, V.; Pourbaix, D.; Randich, S.; Sarri, G.; Sartoretti, P.; Siddiqui, H. I.; Soubiran, C.; Valette, V.; van Leeuwen, F.; Walton, N. A.; Aerts, C.; Arenou, F.; Cropper, M.; Drimmel, R.; Høg, E.; Katz, D.; Lattanzi, M. G.; O'Mullane, W.; Grebel, E. K.; Holland, A. D.; Huc, C.; Passot, X.; Bramante, L.; Cacciari, C.; Castañeda, J.; Chaoul, L.; Cheek, N.; De Angeli, F.; Fabricius, C.; Guerra, R.; Hernández, J.; Jean-Antoine-Piccolo, A.; Masana, E.; Messineo, R.; Mowlavi, N.; Nienartowicz, K.; Ordóñez-Blanco, D.; Panuzzo, P.; Portell, J.; Richards, P. J.; Riello, M.; Seabroke, G. M.; Tanga, P.; Thévenin, F.; Torra, J.; Els, S. G.; Gracia-Abril, G.; Comoretto, G.; Garcia-Reinaldos, M.; Lock, T.; Mercier, E.; Altmann, M.; Andrae, R.; Astraatmadja, T. L.; Bellas-Velidis, I.; Benson, K.; Berthier, J.; Blomme, R.; Busso, G.; Carry, B.; Cellino, A.; Clementini, G.; Cowell, S.; Creevey, O.; Cuypers, J.; Davidson, M.; De Ridder, J.; de Torres, A.; Delchambre, L.; Dell'Oro, A.; Ducourant, C.; Frémat, Y.; García-Torres, M.; Gosset, E.; Halbwachs, J. -L; Hambly, N. C.; Harrison, D. L.; Hauser, M.; Hestroffer, D.; Hodgkin, S. T.; Huckle, H. E.; Hutton, A.; Jasniewicz, G.; Jordan, S.; Kontizas, M.; Korn, A. J.; Lanzafame, A. C.; Manteiga, M.; Moitinho, A.; Muinonen, K.; Osinde, J.; Pancino, E.; Pauwels, T.; Petit, J. -M; Recio-Blanco, A.; Robin, A. C.; Sarro, L. M.; Siopis, C.; Smith, M.; Smith, K. W.; Sozzetti, A.; Thuillot, W.; van Reeven, W.; Viala, Y.; Abbas, U.; Abreu Aramburu, A.; Accart, S.; Aguado, J. J.; Allan, P. M.; Allasia, W.; Altavilla, G.; Álvarez, M. A.; Alves, J.; Anderson, R. I.; Andrei, A. 
H.; Anglada Varela, E.; Antiche, E.; Antoja, T.; Antón, S.; Arcay, B.; Atzei, A.; Ayache, L.; Bach, N.; Baker, S. G.; Balaguer-Núñez, L.; Barache, C.; Barata, C.; Barbier, A.; Barblan, F.; Baroni, M.; Barrado y Navascués, D.; Barros, M.; Barstow, M. A.; Becciani, U.; Bellazzini, M.; Bellei, G.; Bello García, A.; Belokurov, V.; Bendjoya, P.; Berihuete, A.; Bianchi, L.; Bienaymé, O.; Billebaud, F.; Blagorodnova, N.; Blanco-Cuaresma, S.; Boch, T.; Bombrun, A.; Borrachero, R.; Bouquillon, S.; Bourda, G.; Bouy, H.; Bragaglia, A.; Breddels, M. A.; Brouillet, N.; Brüsemeister, T.; Bucciarelli, B.; Budnik, F.; Burgess, P.; Burgon, R.; Burlacu, A.; Busonero, D.; Buzzi, R.; Caffau, E.; Cambras, J.; Campbell, H.; Cancelliere, R.; Cantat-Gaudin, T.; Carlucci, T.; Carrasco, J. M.; Castellani, M.; Charlot, P.; Charnas, J.; Charvet, P.; Chassat, F.; Chiavassa, A.; Clotet, M.; Cocozza, G.; Collins, R. S.; Collins, P.; Costigan, G.; Crifo, F.; Cross, N. J. G.; Crosta, M.; Crowley, C.; Dafonte, C.; Damerdji, Y.; Dapergolas, A.; David, P.; David, M.; De Cat, P.; de Felice, F.; de Laverny, P.; De Luise, F.; De March, R.; de Martino, D.; de Souza, R.; Debosscher, J.; del Pozo, E.; Delbo, M.; Delgado, A.; Delgado, H. E.; di Marco, F.; Di Matteo, P.; Diakite, S.; Distefano, E.; Dolding, C.; Dos Anjos, S.; Drazinos, P.; Durán, J.; Dzigan, Y.; Ecale, E.; Edvardsson, B.; Enke, H.; Erdmann, M.; Escolar, D.; Espina, M.; Evans, N. W.; Eynard Bontemps, G.; Fabre, C.; Fabrizio, M.; Faigler, S.; Falcão, A. J.; Farràs Casas, M.; Faye, F.; Federici, L.; Fedorets, G.; Fernández-Hernández, J.; Fernique, P.; Fienga, A.; Figueras, F.; Filippi, F.; Findeisen, K.; Fonti, A.; Fouesneau, M.; Fraile, E.; Fraser, M.; Fuchs, J.; Furnell, R.; Gai, M.; Galleti, S.; Galluccio, L.; Garabato, D.; García-Sedano, F.; Garé, P.; Garofalo, A.; Garralda, N.; Gavras, P.; Gerssen, J.; Geyer, R.; Gilmore, G.; Girona, S.; Giuffrida, G.; Gomes, M.; González-Marcos, A.; González-Núñez, J.; González-Vidal, J. 
J.; Granvik, M.; Guerrier, A.; Guillout, P.; Guiraud, J.; Gúrpide, A.; Gutiérrez-Sánchez, R.; Guy, L. P.; Haigron, R.; Hatzidimitriou, D.; Haywood, M.; Heiter, U.; Helmi, A.; Hobbs, D.; Hofmann, W.; Holl, B.; Holland, G.; Hunt, J. A. S.; Hypki, A.; Icardi, V.; Irwin, M.; Jevardat de Fombelle, G.; Jofré, P.; Jonker, P. G.; Jorissen, A.; Julbe, F.; Karampelas, A.; Kochoska, A.; Kohley, R.; Kolenberg, K.; Kontizas, E.; Koposov, S. E.; Kordopatis, G.; Koubsky, P.; Kowalczyk, A.; Krone-Martins, A.; Kudryashova, M.; Kull, I.; Bachchan, R. K.; Lacoste-Seris, F.; Lanza, A. F.; Lavigne, J. -B; Le Poncin-Lafitte, C.; Lebreton, Y.; Lebzelter, T.; Leccia, S.; Leclerc, N.; Lecoeur-Taibi, I.; Lemaitre, V.; Lenhardt, H.; Leroux, F.; Liao, S.; Licata, E.; Lindstrøm, H. E. P.; Lister, T. A.; Livanou, E.; Lobel, A.; Löffler, W.; López, M.; Lopez-Lozano, A.; Lorenz, D.; Loureiro, T.; MacDonald, I.; Magalhães Fernandes, T.; Managau, S.; Mann, R. G.; Mantelet, G.; Marchal, O.; Marchant, J. M.; Marconi, M.; Marie, J.; Marinoni, S.; Marrese, P. M.; Marschalkó, G.; Marshall, D. J.; Martín-Fleitas, J. M.; Martino, M.; Mary, N.; Matijevič, G.; Mazeh, T.; McMillan, P. J.; Messina, S.; Mestre, A.; Michalik, D.; Millar, N. R.; Miranda, B. M. H.; Molina, D.; Molinaro, R.; Molinaro, M.; Molnár, L.; Moniez, M.; Montegriffo, P.; Monteiro, D.; Mor, R.; Mora, A.; Morbidelli, R.; Morel, T.; Morgenthaler, S.; Morley, T.; Morris, D.; Mulone, A. F.; Muraveva, T.; Musella, I.; Narbonne, J.; Nelemans, G.; Nicastro, L.; Noval, L.; Ordénovic, C.; Ordieres-Meré, J.; Osborne, P.; Pagani, C.; Pagano, I.; Pailler, F.; Palacin, H.; Palaversa, L.; Parsons, P.; Paulsen, T.; Pecoraro, M.; Pedrosa, R.; Pentikäinen, H.; Pereira, J.; Pichon, B.; Piersimoni, A. M.; Pineau, F. -X; Plachy, E.; Plum, G.; Poujoulet, E.; Prša, A.; Pulone, L.; Ragaini, S.; Rago, S.; Rambaux, N.; Ramos-Lerate, M.; Ranalli, P.; Rauw, G.; Read, A.; Regibo, S.; Renk, F.; Reylé, C.; Ribeiro, R. 
A.; Rimoldini, L.; Ripepi, V.; Riva, A.; Rixon, G.; Roelens, M.; Romero-Gómez, M.; Rowell, N.; Royer, F.; Rudolph, A.; Ruiz-Dern, L.; Sadowski, G.; Sagristà Sellés, T.; Sahlmann, J.; Salgado, J.; Salguero, E.; Sarasso, M.; Savietto, H.; Schnorhk, A.; Schultheis, M.; Sciacca, E.; Segol, M.; Segovia, J. C.; Segransan, D.; Serpell, E.; Shih, I. -C; Smareglia, R.; Smart, R. L.; Smith, C.; Solano, E.; Solitro, F.; Sordo, R.; Soria Nieto, S.; Souchay, J.; Spagna, A.; Spoto, F.; Stampa, U.; Steele, I. A.; Steidelmüller, H.; Stephenson, C. A.; Stoev, H.; Suess, F. F.; Süveges, M.; Surdej, J.; Szabados, L.; Szegedi-Elek, E.; Tapiador, D.; Taris, F.; Tauran, G.; Taylor, M. B.; Teixeira, R.; Terrett, D.; Tingley, B.; Trager, S. C.; Turon, C.; Ulla, A.; Utrilla, E.; Valentini, G.; van Elteren, A.; Van Hemelryck, E.; van Leeuwen, M.; Varadi, M.; Vecchiato, A.; Veljanoski, J.; Via, T.; Vicente, D.; Vogt, S.; Voss, H.; Votruba, V.; Voutsinas, S.; Walmsley, G.; Weiler, M.; Weingrill, K.; Werner, D.; Wevers, T.; Whitehead, G.; Wyrzykowski, Ł.; Yoldas, A.; Žerjal, M.; Zucker, S.; Zurbach, C.; Zwitter, T.; Alecu, A.; Allen, M.; Allende Prieto, C.; Amorim, A.; Anglada-Escudé, G.; Arsenijevic, V.; Azaz, S.; Balm, P.; Beck, M.; Bernstein, H. -H; Bigot, L.; Bijaoui, A.; Blasco, C.; Bonfigli, M.; Bono, G.; Boudreault, S.; Bressan, A.; Brown, S.; Brunet, P. -M; Bunclark, P.; Buonanno, R.; Butkevich, A. G.; Carret, C.; Carrion, C.; Chemin, L.; Chéreau, F.; Corcione, L.; Darmigny, E.; de Boer, K. S.; de Teodoro, P.; de Zeeuw, P. T.; Delle Luche, C.; Domingues, C. D.; Dubath, P.; Fodor, F.; Frézouls, B.; Fries, A.; Fustes, D.; Fyfe, D.; Gallardo, E.; Gallegos, J.; Gardiol, D.; Gebran, M.; Gomboc, A.; Gómez, A.; Grux, E.; Gueguen, A.; Heyrovsky, A.; Hoar, J.; Iannicola, G.; Isasi Parache, Y.; Janotto, A. -M; Joliet, E.; Jonckheere, A.; Keil, R.; Kim, D. 
-W; Klagyivik, P.; Klar, J.; Knude, J.; Kochukhov, O.; Kolka, I.; Kos, J.; Kutka, A.; Lainey, V.; LeBouquin, D.; Liu, C.; Loreggia, D.; Makarov, V. V.; Marseille, M. G.; Martayan, C.; Martinez-Rubi, O.; Massart, B.; Meynadier, F.; Mignot, S.; Munari, U.; Nguyen, A. -T; Nordlander, T.; Ocvirk, P.; O'Flaherty, K. S.; Olias Sanz, A.; Ortiz, P.; Osorio, J.; Oszkiewicz, D.; Ouzounis, A.; Palmer, M.; Park, P.; Pasquato, E.; Peltzer, C.; Peralta, J.; Péturaud, F.; Pieniluoma, T.; Pigozzi, E.; Poels, J.; Prat, G.; Prod'homme, T.; Raison, F.; Rebordao, J. M.; Risquez, D.; Rocca-Volmerange, B.; Rosen, S.; Ruiz-Fuertes, M. I.; Russo, F.; Sembay, S.; Serraller Vizcaino, I.; Short, A.; Siebert, A.; Silva, H.; Sinachopoulos, D.; Slezak, E.; Soffel, M.; Sosnowska, D.; Straižys, V.; ter Linden, M.; Terrell, D.; Theil, S.; Tiede, C.; Troisi, L.; Tsalmantza, P.; Tur, D.; Vaccari, M.; Vachier, F.; Valles, P.; Van Hamme, W.; Veltz, L.; Virtanen, J.; Wallut, J. -M; Wichmann, R.; Wilkinson, M. I.; Ziaeepour, H.; Zschocke, S.

    2016-01-01

    Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by

  19. The Mothership Mission Architecture

    Science.gov (United States)

    Ernst, S. M.; DiCorcia, J. D.; Bonin, G.; Gump, D.; Lewis, J. S.; Foulds, C.; Faber, D.

    2015-12-01

    The Mothership is considered to be a dedicated deep space carrier spacecraft. It is currently being developed by Deep Space Industries (DSI) as a mission concept that enables a broad participation in the scientific exploration of small bodies - the Mothership mission architecture. A Mothership shall deliver third-party nano-sats, experiments and instruments to Near Earth Asteroids (NEOs), comets or moons. The Mothership service includes delivery of nano-sats, communication to Earth and visuals of the asteroid surface and surrounding area. The Mothership is designed to carry about 10 nano-sats, based upon a variation of the CubeSat standard, with some flexibility on the specific geometry. The Deep Space Nano-Sat reference design is a 14.5 cm cube, which accommodates the same volume as a traditional 3U CubeSat. To reduce cost, Mothership is designed as a secondary payload aboard launches to GTO. DSI is offering slots for nano-sats to individual customers. This enables organizations with relatively low operating budgets to closely examine an asteroid with highly specialized sensors of their own choosing and carry out experiments in the proximity of or on the surface of an asteroid, while the nano-sats can be built or commissioned by a variety of smaller institutions, companies, or agencies. While the overall Mothership mission will have a financial volume somewhere between a European Space Agency's (ESA) S- and M-class mission for instance, it can be funded through a number of small and individual funding sources and programs, hence avoiding the processes associated with traditional space exploration missions. DSI has been able to identify a significant interest in the planetary science and nano-satellite communities.
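The claim that a 14.5 cm cube accommodates roughly the same volume as a 3U CubeSat is easy to check, assuming a nominal 10 x 10 x 30 cm 3U envelope (exact 3U dimensions vary slightly by standard revision):

```python
# Volume comparison for the Deep Space Nano-Sat reference design.
nano_sat_cm3 = 14.5 ** 3                 # 14.5 cm cube
cubesat_3u_cm3 = 10.0 * 10.0 * 30.0      # nominal 3U envelope, assumed here
ratio = nano_sat_cm3 / cubesat_3u_cm3    # close to 1, i.e. comparable volume
```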

  20. The Double Star mission

    Directory of Open Access Journals (Sweden)

    Liu

    2005-11-01

    The Double Star Programme (DSP) was first proposed by China in March 1997 at the Fragrant Hill Workshop on Space Science, Beijing, organized by the Chinese Academy of Sciences. It is the first mission in collaboration between China and ESA. The mission is made of two spacecraft to investigate the magnetospheric global processes and their response to interplanetary disturbances in conjunction with the Cluster mission. The first spacecraft, TC-1 (Tan Ce means "Explorer"), was launched on 29 December 2003, and the second one, TC-2, on 25 July 2004, on board two Chinese Long March 2C rockets. TC-1 was injected into an equatorial orbit of 570x79000 km altitude with a 28° inclination and TC-2 into a polar orbit of 560x38000 km altitude. The orbits have been designed to complement the Cluster mission by maximizing the time when both Cluster and Double Star are in the same scientific regions. The two missions allow simultaneous observations of the Earth's magnetosphere from six points in space. To facilitate the comparison of data, half of the Double Star payload is made of spares or duplicates of the Cluster instruments; the other half is made of Chinese instruments. The science operations are coordinated by the Chinese DSP Scientific Operations Centre (DSOC) in Beijing and the European Payload Operations Service (EPOS) at RAL, UK. The spacecraft and ground segment operations are performed by the DSP Operations and Management Centre (DOMC) and DSOC in China, using three ground stations, in Beijing, Shanghai and Villafranca.
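From the perigee and apogee altitudes quoted above, the Keplerian orbital periods of the two spacecraft follow directly; this sketch uses standard two-body constants and neglects perturbations:

```python
import math

MU_EARTH = 398600.4418    # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6371.0          # km, mean Earth radius

def period_hours(perigee_alt_km: float, apogee_alt_km: float) -> float:
    """Keplerian period from apsis altitudes via T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + (perigee_alt_km + apogee_alt_km) / 2.0  # semi-major axis
    return 2.0 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 3600.0

tc1_period = period_hours(570.0, 79000.0)   # equatorial spacecraft, ~27 h
tc2_period = period_hours(560.0, 38000.0)   # polar spacecraft, ~11 h
```

The very different periods reflect the mission design: the high-apogee TC-1 dwells in the equatorial magnetosphere while TC-2 samples the polar regions.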

  1. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    Science.gov (United States)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration, in the vicinity of Earth, Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.
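The roadmap structure described, linking a needed capability to its technology gap and a demonstration milestone, can be captured in a minimal data model. All entries and field names below are hypothetical examples, not content from the actual NASA roadmaps:

```python
from dataclasses import dataclass

@dataclass
class RoadmapEntry:
    """One roadmap line item: a capability, the gap blocking it, and a
    target demonstration year (example data only)."""
    capability: str
    technology_gap: str
    demo_year: int

entries = [
    RoadmapEntry("Mars-vicinity relay networking", "delay-tolerant routing", 2014),
    RoadmapEntry("lunar surface high-rate links", "Ka-band phased arrays", 2010),
    RoadmapEntry("deep-space optical trunk lines", "photon-counting receivers", 2018),
]

# A roadmap timeline is simply the entries ordered by demonstration milestone.
timeline = sorted(entries, key=lambda e: e.demo_year)
```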

  2. MERLIN: a Franco-German LIDAR space mission for atmospheric methane

    Science.gov (United States)

    Bousquet, P.; Ehret, G.; Pierangelo, C.; Marshall, J.; Bacour, C.; Chevallier, F.; Gibert, F.; Armante, R.; Crevoisier, C. D.; Edouart, D.; Esteve, F.; Julien, E.; Kiemle, C.; Alpers, M.; Millet, B.

    2017-12-01

    The Methane Remote Sensing Lidar Mission (MERLIN), currently in phase C, is a joint cooperation between France and Germany on the development, launch and operation of a space LIDAR dedicated to the retrieval of total weighted methane (CH4) atmospheric columns. Atmospheric methane is the second most potent anthropogenic greenhouse gas, contributing 20% to climate radiative forcing, but also playing an important role in atmospheric chemistry as a precursor of tropospheric ozone and low-stratosphere water vapour. Its short lifetime (~9 years) and the nature and variety of its anthropogenic sources also offer interesting mitigation options with regard to the 2 °C objective of the Paris agreement. For the first time, measurements of atmospheric composition will be performed from space thanks to an IPDA (Integrated Path Differential Absorption) LIDAR (Light Detection And Ranging), with a precision (target ±27 ppb for a 50 km aggregation along the track) and accuracy (target …). We recall the MERLIN objectives and mission characteristics. We also propose an end-to-end error analysis, from the causes of random and systematic errors of the instrument, of the platform and of the data treatment, to the error on methane emissions. To do so, we propose an OSSE (observing system simulation experiment) analysis to estimate the uncertainty reduction on methane emissions brought by MERLIN XCH4. The originality of our inversion system is to transfer both random and systematic errors from the observation space to the flux space, thus providing more realistic error reductions than usually provided in OSSEs using only the random part of errors. Uncertainty reductions are presented using two different atmospheric transport models, TM3 and LMDZ, and compared with the error reduction achieved with the GOSAT passive mission.
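The precision target quoted above rests on the distinction, central to the abstract's error analysis, between random error (which averages down under along-track aggregation) and systematic error (which does not). A minimal sketch, with all numbers hypothetical rather than mission figures:

```python
import math

def aggregated_error_ppb(random_ppb: float, systematic_ppb: float,
                         n_shots: int) -> float:
    """Along-track aggregation sketch: the random (shot-level) error shrinks
    as 1/sqrt(N); the systematic error passes through unchanged."""
    return math.hypot(systematic_ppb, random_ppb / math.sqrt(n_shots))

# Example: 200 ppb single-shot random noise and a 5 ppb systematic bias,
# aggregated over 100 shots (hypothetical values).
err = aggregated_error_ppb(200.0, 5.0, 100)  # random term drops to 20 ppb
```

This is why the abstract treats the two error classes separately: no amount of averaging removes the systematic term, so it must be propagated to the flux space explicitly.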

  3. B plant mission analysis report

    International Nuclear Information System (INIS)

    Lund, D.P.

    1995-01-01

    This report further develops the mission for B Plant originally defined in WHC-EP-0722, ''System Engineering Functions and Requirements for the Hanford Cleanup Mission: First Issue.'' The B Plant mission analysis will be the basis for a functional analysis that breaks down the B Plant mission statement into the necessary activities to accomplish the mission. These activities are the product of the functional analysis and will then be used in subsequent steps of the systems engineering process, such as identifying requirements and allocating those requirements to B Plant functions. The information in this mission analysis and the functional and requirements analysis are a part of the B Plant technical baseline

  4. Enabling the First Interstellar Missions

    Science.gov (United States)

    Lubin, P.

    2017-12-01

    All propulsion systems that leave the Earth are based on chemical reactions. Chemical reactions, at best, have an efficiency relative to rest mass of about 10^-10 (roughly 1 eV per bond). All the mass in the universe converted to chemical reactions would not propel even a single proton to relativistic speeds. While chemistry will get us to Mars, it will not allow interstellar capability in any reasonable mission time. Barring new physics, we are left with few realistic solutions. None of our current propulsion systems, including nuclear, are capable of the relativistic speeds needed for exploring the many nearby stellar systems and exo-planets. However, recent advances in photonics and directed energy systems now allow us to realize what was, only a decade ago, simply science fiction: the ability to seriously conceive of and plan for relativistic flight. Possibilities range from fully functional gram-level wafer-scale spacecraft capable of speeds greater than c/4, which could reach the nearest star in 20 years, to spacecraft for large missions capable of supporting human life, with masses of more than 10^5 kg (100 tons), for rapid interplanetary transit at speeds greater than 1000 km/s. With this technology spacecraft can be propelled to speeds currently unimaginable. Photonics, like electronics, and unlike chemical propulsion, is an exponential technology with a current doubling time of about 20 months. This is the key. The cost of such a system is amortized over the essentially unlimited number of launches. In addition, the same photon driver can be used for many other purposes, including beamed energy to power high-Isp ion engines, remote asteroid composition analysis and planetary defense. This would be a profound change in human capability with enormous implications. Known as Starlight, we are now in a NASA Phase II study. The FY 2017 congressional appropriations request directs NASA to study the feasibility of an interstellar mission to coincide with the 100th
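The quoted chemical efficiency of ~10^-10 relative to rest mass can be checked with a one-line estimate. This is a back-of-the-envelope sketch assuming ~1 eV released per bond and a reactant mass of a few tens of atomic mass units (the 30 amu figure is an assumption, not from the abstract):

```python
# Energy released per chemical bond (~1 eV) compared with the rest-mass
# energy of the reactants; the ratio comes out around 1e-11 to 1e-10.
EV_J = 1.602176634e-19        # joules per electron-volt
AMU_KG = 1.66053906660e-27    # kilograms per atomic mass unit
C = 2.99792458e8              # speed of light, m/s

bond_energy = 1.0 * EV_J                 # ~1 eV per bond
rest_mass_energy = 30 * AMU_KG * C**2    # ~30 amu of reactants (assumed)

efficiency = bond_energy / rest_mass_energy
print(f"{efficiency:.1e}")
```

The ratio stays in the 10^-11 to 10^-10 range for any plausible reactant mass, which is why no amount of chemical propellant can reach relativistic speeds.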

  5. Spacelab 3 mission

    Science.gov (United States)

    Dalton, Bonnie P.

    1990-01-01

    Spacelab-3 (SL-3) was the first microgravity mission of extended duration involving crew interaction with animal experiments. This interaction involved sharing the Spacelab environmental system, changing animal food, and changing animal waste trays by the crew. Extensive microbial testing was conducted on the animal specimens and crew and on their ground and flight facilities during all phases of the mission to determine the potential for cross contamination. Macroparticulate sampling was attempted but was unsuccessful due to the unforeseen particulate contamination that occurred during the flight. Particulate debris of varying size (250 microns to several inches) and composition was recovered post flight from the Spacelab floor, end cones, overhead areas, avionics fan filter, cabin fan filters, tunnel adaptor, and from the crew module. These data are discussed along with the solutions that were implemented for particulate and microbial containment in future flight facilities.

  6. Cyber Network Mission Dependencies

    Science.gov (United States)

    2015-09-18

    leak paths”) and determine if firewalls and router access control lists are violating network policy. Visualization tools are provided to help analysts...with which a supply agent may not be familiar. In this environment, errors in requisition are easy to make, and they are costly: an incomplete cyber...establishing an email network and recommend a firewall and additional laptops. YMAL would also match mission details like the deployment location with

  7. A Somalia mission experience.

    Science.gov (United States)

    Mahomed, Zeyn; Moolla, Muhammad; Motara, Feroza; Laher, Abdullah

    2012-06-28

    Reports about The Horn of Africa Famine Crisis in 2011 flooded our news bulletins and newspapers. Yet the nations of the world failed to respond and alleviate the unfolding disaster. In August 2011, the Gift of the Givers Foundation mobilised what was to become the largest humanitarian mission ever conducted by an African organisation. Almost a year later, the effort continues, changing the face of disaster medicine as we know it.

  8. The money mission matrix

    OpenAIRE

    Cuperus, Mirthe

    2017-01-01

    Social entrepreneurship is popular in current academics and other media. This thesis adds to this literature by discovering what the drivers are for sustainable social entrepreneurship. Several stakeholders were identified, creating profiles of the key players in social entrepreneurship. These stakeholders uncovered key factors that represent the drivers for sustainable social entrepreneurship. Key factors were then aligned along the two dimensions: Money and Mission. This crea...

  9. Asteroid Kinetic Impactor Missions

    Science.gov (United States)

    Chesley, Steven

    2015-08-01

    Asteroid impact missions can be carried out as relatively low-cost add-ons to most asteroid rendezvous missions, and such impact experiments have tremendous potential, both scientifically and in the arena of planetary defense. The science returns from an impactor demonstration begin with the documentation of the global effects of the impact, such as changes in orbit and rotation state, the creation and dissipation of an ejecta plume and debris disk, and morphological changes across the body due to the transmission of seismic waves, which might induce landslides, toppling of boulders, etc. At a local level, an inspection of the impact crater and ejecta blanket reveals critical material strength information, as well as spectral differences between the surface and subsurface material. From the planetary defense perspective, an impact demonstration will prove humankind’s capacity to alter the orbit of a potentially threatening asteroid. This technological leap comes in two parts. First, terminal guidance systems that can deliver an impactor with small errors relative to the ~100-200 meter size of a likely target have yet to be demonstrated in a deep space environment. Second, the response of an asteroid to such an impact is only understood theoretically, due to the potentially significant dependence on the momentum carried by escaping ejecta, which would tend to enhance the deflection by tens of percent and perhaps by as much as a factor of a few. A lack of validated understanding of momentum enhancement is a significant obstacle to properly sizing a real-world impactor deflection mission. This presentation will describe the drivers for asteroid impact demonstrations and cover the range of such concepts, starting with ESA’s pioneering Don Quijote mission concept and leading to a brief description of concepts under study at the present time, including the OSIRIS-REx/ISIS, BASiX/KIX and AIM/DART (AIDA) concepts.
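The momentum-enhancement question maps directly onto deflection sizing. For an impactor of mass m striking an asteroid of mass M at speed v, the imparted velocity change is Δv = β·m·v/M, where β = 1 for a perfectly inelastic impact and β > 1 when escaping ejecta carry additional momentum; the "tens of percent to a factor of a few" uncertainty in β propagates one-to-one into Δv. A short sketch with illustrative, non-mission numbers:

```python
def deflection_dv(m_impactor_kg, v_impact_ms, m_asteroid_kg, beta=1.0):
    """Velocity change (m/s) imparted to an asteroid by a kinetic impactor.

    beta : momentum-enhancement factor (1 = no ejecta contribution).
    """
    return beta * m_impactor_kg * v_impact_ms / m_asteroid_kg

# Illustrative only: 500 kg impactor at 6 km/s into a ~150 m body (~5e9 kg)
for beta in (1.0, 1.5, 3.0):
    dv = deflection_dv(500.0, 6000.0, 5.0e9, beta)
    print(f"beta={beta}: dv={dv * 1000:.2f} mm/s")
```

Even a sub-mm/s Δv, applied years before a predicted Earth encounter, can shift an asteroid off a collision course, but an unknown β leaves the required impactor mass uncertain by the same factor.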

  10. The Gaia mission

    OpenAIRE

    Prusti, T.; de Bruijne, J. H. J.; Brown, A. G. A.; Vallenari, A.; Babusiaux, C.; Bailer-Jones, C. A. L.; Bastian, U.; Biermann, M.; Evans, D. W.; Eyer, L.; Jansen, F.; Jordi, C.; Klioner, S. A.; Lammers, U.; Lindegren, L.

    2016-01-01

    Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by European industry. The involvement of the scientific community focusses on data processing, for which the international Gaia Data Processing and Analysis Consortium (DPAC) was selected in 2007. Gaia wa...

  11. Nanosatellite missions - the future

    Science.gov (United States)

    Koudelka, O.; Kuschnig, R.; Wenger, M.; Romano, P.

    2017-09-01

    In the beginning, nanosatellite projects were focused on educational aspects. In the meantime, the technology has matured and now makes it possible to test, demonstrate and validate new systems, operational procedures and services in space at low cost and within much shorter timescales than traditional space endeavors. The number of spacecraft developed and launched has been increasing exponentially in recent years. The constellation of BRITE nanosatellites is demonstrating impressively that demanding scientific requirements can be met with small, low-cost satellites. Industry and space agencies are now embracing small satellite technology. Particularly in the USA, companies have been established to provide commercial services based on CubeSats. The approach is in general different from traditional space projects, with their strict product/quality assurance and documentation requirements. The paper gives an overview of nanosatellite missions in different areas of application. Based on lessons learnt from the BRITE mission and recent developments at TU Graz (in particular the implementation of the OPS-SAT nanosatellite for ESA), enhanced technical possibilities for a future astronomy mission after BRITE will be discussed. Powerful on-board computers will allow on-board data pre-processing. A state-of-the-art telemetry system with high data rates would facilitate interference-free operations and increase science data return.

  12. Dawn Mission Update

    Science.gov (United States)

    Sykes, M. V.; Russell, C. T.; Coradini, A.; Christensen, U.; de Sanctis, M. C.; Feldman, W. C.; Jaumann, R.; Keller, U.; Konopliv, A. S.; McCord, T. B.; McFadden, L. A.; McSween, H. Y.; Mottola, S.; Neukum, G.; Pieters, C. M.; Prettyman, T. H.; Raymond, C. A.; Smith, D. E.; Williams, B. G.; Wise, J.; Zuber, M. T.

    2004-11-01

    Dawn, the ninth Discovery mission, will be the first spacecraft to rendezvous with two solar system bodies, the main belt asteroids Vesta and Ceres. This is made possible by utilizing ion propulsion to reach its targets and to maneuver into (and depart) orbits about these bodies. Vesta and Ceres are two terrestrial protoplanets that have survived since the earliest epoch of the solar system and will provide important insights into planet building processes and their evolution under very different circumstances, with and without water. Dawn carries a double framing camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron detector. At Vesta our studies will include the volcanic emplacement of basalts, its differentiation, the possible exposure of its interior near the south pole. At Ceres our studies will include the role of water in its evolution, hydration processes on its surface, and the possible existence of a subsurface ocean. The mission has passed its critical design review and is scheduled to be launched in June 2006 with arrival at Vesta in 2011 and Ceres in 2015. Operation strategies will be presented. Groundbased observations of Vesta, Ceres, and Vesta family members over broad wavelengths, periods and phases will play an important role in detailed mission planning.

  13. Landsat Data Continuity Mission

    Science.gov (United States)

    ,

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is a partnership formed between the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) to place the next Landsat satellite in orbit in January 2013. The Landsat era that began in 1972 will become a nearly 41-year global land record with the successful launch and operation of the LDCM. The LDCM will continue the acquisition, archiving, and distribution of multispectral imagery affording global, synoptic, and repetitive coverage of the Earth's land surfaces at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. The mission objectives of the LDCM are to (1) collect and archive medium resolution (30-meter spatial resolution) multispectral image data affording seasonal coverage of the global landmasses for a period of no less than 5 years; (2) ensure that LDCM data are sufficiently consistent with data from the earlier Landsat missions in terms of acquisition geometry, calibration, coverage characteristics, spectral characteristics, output product quality, and data availability to permit studies of landcover and land-use change over time; and (3) distribute LDCM data products to the general public on a nondiscriminatory basis at no cost to the user.

  14. The Spartan 1 mission

    Science.gov (United States)

    Cruddace, Raymond G.; Fritz, G. G.; Shrewsberry, D. J.; Brandenstein, D. J.; Creighton, D. C.; Gutschewski, G.; Lucid, S. W.; Nagel, J. M.; Fabian, J. M.; Zimmerman, D.

    1989-01-01

    The first Spartan mission is documented. The Spartan program, an outgrowth of a joint Naval Research Laboratory (NRL)/National Aeronautics and Space Administration (NASA)-Goddard Space Flight Center (GSFC) development effort, was instituted by NASA for launching autonomous, recoverable payloads from the space shuttle. These payloads have a precise pointing system and are intended to support a wide range of space-science observations and experiments. The first Spartan, carrying an NRL X-ray astronomy instrument, was launched by the orbiter Discovery (STS51G) on June 20, 1985 and recovered successfully 45 h later, on June 22. During this period, Spartan 1 conducted a preprogrammed series of observations of two X-ray sources: the Perseus cluster of galaxies and the center of our galaxy. The mission was successful from both an engineering and a scientific viewpoint. Only one problem was encountered: the attitude control system (ACS) shut down earlier than planned because of high ACS gas consumption. A preplanned emergency mode then placed Spartan 1 into a stable, safe condition and allowed a safe recovery. The events of the mission are described, and X-ray maps of the two observed sources, produced from the flight data, are presented.

  15. SPICE for ESA Planetary Missions

    Science.gov (United States)

    Costa, M.

    2018-04-01

    The ESA SPICE Service leads the SPICE operations for ESA missions and is responsible for the generation of the SPICE Kernel Dataset for ESA missions. This contribution will describe the status of these datasets and outline the future developments.

  16. Mission Critical Occupation (MCO) Charts

    Data.gov (United States)

    Office of Personnel Management — Agencies report resource data and targets for government-wide mission critical occupations and agency specific mission critical and/or high risk occupations. These...

  17. Rapid analysis and exploration of fluorescence microscopy images.

    Science.gov (United States)

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J

    2014-03-19

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine-tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first-pass analysis for quality control, but also as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image-based screens.
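The segmentation-free idea behind PhenoRipper can be sketched as: tile each image into small blocks, cluster the blocks into a vocabulary of recurring "block types", and represent each image by its histogram of block types, so images are compared by profile rather than by per-cell features. The NumPy-only toy below is a loose illustration of that workflow; the block size, number of types, and crude k-means loop are assumptions, not the actual PhenoRipper implementation:

```python
import numpy as np

def block_profile(image, block=8, n_types=4, seed=0):
    """Segmentation-free image profile: histogram of clustered block types."""
    h, w = image.shape
    # Tile the image into non-overlapping block x block patches
    blocks = (image[:h - h % block, :w - w % block]
              .reshape(h // block, block, -1, block)
              .swapaxes(1, 2)
              .reshape(-1, block * block))
    # Crude Lloyd iterations from randomly chosen initial centroids
    rng = np.random.default_rng(seed)
    centroids = blocks[rng.choice(len(blocks), n_types, replace=False)]
    for _ in range(10):
        labels = np.argmin(
            ((blocks[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        for k in range(n_types):
            if (labels == k).any():
                centroids[k] = blocks[labels == k].mean(0)
    # The image profile: fraction of blocks assigned to each type
    counts = np.bincount(labels, minlength=n_types)
    return counts / counts.sum()

rng = np.random.default_rng(1)
img = rng.random((64, 64))          # stand-in for a fluorescence image
profile = block_profile(img)
print(profile.shape, round(float(profile.sum()), 6))
```

Because no per-cell segmentation is attempted, the profile is cheap to compute and robust to segmentation failures, which is exactly the trade-off the protocol above exploits for rapid first-pass screening.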

  18. Rapidly Adaptable Instrumentation Tester (RAIT)

    International Nuclear Information System (INIS)

    Vargo, Timothy D.

    1999-01-01

    Emerging technologies in the field of ''Test & Measurement'' have recently enabled the development of the Rapidly Adaptable Instrumentation Tester (RAIT). Based on software developed with LabVIEW, the RAIT design enables quick reconfiguration to test and calibrate a wide variety of telemetry systems. The consequences of inadequate testing could be devastating if a telemetry system were to fail during an expensive flight mission. Supporting both open-bench testing as well as automated test sequences, the RAIT has significantly lowered the total time required to test and calibrate a system. This has resulted in an overall lower per-unit testing cost than has been achievable in the past.

  19. CryoSat-2 science algorithm status, expected future improvements and impacts concerning Sentinel-3 and Jason-CS missions

    Science.gov (United States)

    Cullen, R.; Wingham, D.; Francis, R.; Parrinello, T.

    2011-12-01

    With CryoSat-2 soon to enter its second year of post-commissioning operations, there is now sufficient experience and evidence showing improvements of SIRAL's (SAR Interferometric Radar Altimeter) SAR and SARIn modes over conventional pulse-width-limited altimeters, both for the targeted marine/land ice fields and for non-mission-relevant surfaces such as the ocean. In the process of understanding the CryoSat data, some side effects of the end-to-end platform measurement and ground retrieval system have been identified; whilst those key to mission success are understood and are being handled, others remain open and pave the way to longer-term fine-tuning. Of interest to the session will be a summary of the mandatory changes made during 2011 to all the modes of CryoSat-2 science processing, with a view to longer-term algorithm improvements that could benefit the planned mid-to-late nominal operations re-processing. Since some of the science processor improvements have direct implications for the SAR mode processing of Sentinel-3 and Jason-CS science, these will also be highlighted. Finally, a summary of the CryoSat-2 in-orbit platform and payload performances and their stability will also be provided. Expectations of the longer-term uses of CryoSat's primary sensor (SIRAL) and its successors will be discussed.

  20. Mission to Planet Earth

    Science.gov (United States)

    Tilford, Shelby G.; Asrar, Ghassem; Backlund, Peter W.

    1994-01-01

    Mission to Planet Earth (MTPE) is NASA's concept for an international science program to produce the understanding needed to predict changes in the Earth's environment. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic Earth orbits to gather multidisciplinary data. A sophisticated data system will process and archive an unprecedented amount of information about the Earth and how it works as a system. Increased understanding of the Earth system is a basic human responsibility, a prerequisite to informed management of the planet's resources and to the preservation of the global environment.

  1. The ARTEMIS mission

    CERN Document Server

    Angelopoulos, Vassilis

    2014-01-01

    The ARTEMIS mission was initiated by skillfully moving the two outermost Earth-orbiting THEMIS spacecraft into lunar orbit to conduct unprecedented dual spacecraft observations of the lunar environment. ARTEMIS stands for Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun. Indeed, this volume discusses initial findings related to the Moon’s magnetic and plasma environments and the electrical conductivity of the lunar interior. This work is aimed at researchers and graduate students in both heliophysics and planetary physics. Originally published in Space Science Reviews, Vol. 165/1-4, 2011.

  2. The solar probe mission

    International Nuclear Information System (INIS)

    Feldman, W.C.; Anderson, J.; Bohlin, J.D.; Burlaga, L.F.; Farquhar, R.; Gloeckler, G.; Goldstein, B.E.; Harvey, J.W.; Holzer, T.E.; Jones, W.V.; Kellogg, P.J.; Krimigis, S.M.; Kundu, M.R.; Lazarus, A.J.; Mellott, M.M.; Parker, E.N.; Rosner, R.; Rottman, G.J.; Slavin, J.A.; Suess, S.T.; Tsurutani, B.T.; Woo, R.T.; Zwickl, R.D.

    1990-01-01

    The Solar Probe will deliver a 133.5 kg science payload into a 4 R_s perihelion solar polar orbit (with the first perihelion passage in 2004) to explore in situ one of the last frontiers in the solar system: the solar corona. This mission is both affordable and technologically feasible. Using a payload of 12 (predominantly particles and fields) scientific experiments, it will be possible to answer many long-standing, fundamental problems concerning the structure and dynamics of the outer solar atmosphere, including the acceleration, storage, and transport of energetic particles near the Sun and in the inner heliosphere.

  3. Mission to Planet Earth

    International Nuclear Information System (INIS)

    Wilson, G.S.; Backlund, P.W.

    1992-01-01

    Mission to Planet Earth (MTPE) is NASA's concept for an international science program to produce the understanding needed to predict changes in the earth's environment. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to gather multidisciplinary data. A sophisticated data system will process and archive an unprecedented amount of information about the earth and how it works as a system. Increased understanding of the earth system is a basic human responsibility, a prerequisite to informed management of the planet's resources and to the preservation of the global environment. 8 refs

  4. High Performance Configurable Electrical Power System for LEO Missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Leveraging Tyvak personnel's extensive experience in end-to-end technology development cycles, we propose to design, fabricate and qualify an EPS system targeting 50...

  5. Collaboration support system for "Phobos-Soil" space mission.

    Science.gov (United States)

    Nazarov, V.; Nazirov, R.; Zakharov, A.

    2009-04-01

    The rapid development of communication facilities has led to growth in interactions conducted via electronic means. However, a paradox has emerged in this segment in recent times: extending communication facilities increases collaboration chaos. This is very sensitive for space missions in general, and scientific space missions in particular, because an effective solution to this task underpins successful realization of the missions and promises to increase the ratio of functional capability to mission cost. This problem may be resolved by using modern technologies and methods that are widely applied in other branches, not only in space research. Approaches such as Social Networking, Web 2.0 and Enterprise 2.0 look most promising in this context. The primary goal of the "Phobos-Soil" mission is an investigation of Phobos, the Martian moon, and particularly its regolith, internal structure, and the peculiarities of its orbital and proper motion, as well as a number of different scientific measurements and experiments for investigation of the Martian environment. Many investigators are involved in the mission. An effective collaboration system is therefore a key facility for information support of the mission. Beyond its main goal of communication between users of the system, modern approaches allow such capabilities as a self-organizing community, user-generated content, and centralized and federative control of the system. It may also offer one unique possibility, knowledge management, which is very important for space mission realization. The collaboration support system for the "Phobos-Soil" mission is therefore designed on the basis of a multilayer model that includes such levels as Communications, Announcement and Information, Data Sharing, and Knowledge Management. The collaboration support system for the "Phobos-Soil" mission will be used as a prototype for prospective Russian scientific space missions, and the presentation describes its architecture.

  6. STS-61 mission director's post-mission report

    Science.gov (United States)

    Newman, Ronald L.

    1995-01-01

    To ensure the success of the complex Hubble Space Telescope servicing mission, STS-61, NASA established a number of independent review groups to assess management, design, planning, and preparation for the mission. One of the resulting recommendations for mission success was that an overall Mission Director be appointed to coordinate management activities of the Space Shuttle and Hubble programs and to consolidate results of the team reviews and expedite responses to recommendations. This report presents pre-mission events important to the experience base of mission management, with related Mission Director's recommendations following the event(s) to which they apply. All Mission Director's recommendations are presented collectively in an appendix. Other appendixes contain recommendations from the various review groups, including Payload Officers, the JSC Extravehicular Activity (EVA) Section, JSC EVA Management Office, JSC Crew and Thermal Systems Division, and the STS-61 crew itself. This report also lists mission events in chronological order and includes as an appendix a post-mission summary by the lead Payload Deployment and Retrieval System Officer. Recommendations range from those pertaining to specific component use or operating techniques to those for improved management, review, planning, and safety procedures.

  7. Solar maximum mission

    International Nuclear Information System (INIS)

    Ryan, J.

    1981-01-01

    By understanding the sun, astrophysicists hope to expand this knowledge to understanding other stars. To study the sun, NASA launched a satellite on February 14, 1980. The project is named the Solar Maximum Mission (SMM). The satellite conducted detailed observations of the sun in collaboration with other satellites and ground-based optical and radio observations until its failure 10 months into the mission. The main objective of the SMM was to investigate one aspect of solar activity: solar flares. A brief description of the flare mechanism is given. The SMM satellite was valuable in providing information on where and how a solar flare occurs. A sequence of photographs of a solar flare taken from the SMM satellite shows how a solar flare develops in a particular layer of the solar atmosphere. Two flares especially suitable for detailed observations by a joint effort occurred on April 30 and May 21 of 1980. These flares and observations of the flares are discussed. Also discussed are significant discoveries made by individual experiments.

  8. The Euclid mission design

    Science.gov (United States)

    Racca, Giuseppe D.; Laureijs, René; Stagnaro, Luca; Salvignol, Jean-Christophe; Lorenzo Alvarez, José; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis; Short, Alex; Strada, Paolo; Bönke, Tobias; Colombo, Cyril; Calvi, Adriano; Maiorano, Elena; Piersanti, Osvaldo; Prezelus, Sylvain; Rosato, Pierluigi; Pinel, Jacques; Rozemeijer, Hans; Lesna, Valentina; Musi, Paolo; Sias, Marco; Anselmi, Alberto; Cazaubiel, Vincent; Vaillon, Ludovic; Mellier, Yannick; Amiaux, Jérôme; Berthé, Michel; Sauvage, Marc; Azzollini, Ruyman; Cropper, Mark; Pottinger, Sabrina; Jahnke, Knud; Ealet, Anne; Maciaszek, Thierry; Pasian, Fabio; Zacchei, Andrea; Scaramella, Roberto; Hoar, John; Kohley, Ralf; Vavrek, Roland; Rudolph, Andreas; Schmidt, Micha

    2016-07-01

    Euclid is a space-based optical/near-infrared survey mission of the European Space Agency (ESA) to investigate the nature of dark energy, dark matter and gravity by observing the geometry of the Universe and the formation of structures over cosmological timescales. Euclid will use two probes of the signature of dark matter and energy: Weak Gravitational Lensing, which requires the measurement of the shape and photometric redshifts of distant galaxies, and Galaxy Clustering, based on the measurement of the 3-dimensional distribution of galaxies through their spectroscopic redshifts. The mission is scheduled for launch in 2020 and is designed for 6 years of nominal survey operations. The Euclid spacecraft is composed of a Service Module and a Payload Module. The Service Module comprises all the conventional spacecraft subsystems, the instruments' warm electronics units, the sun shield and the solar arrays. In particular, the Service Module provides the extremely challenging pointing accuracy required by the scientific objectives. The Payload Module consists of a 1.2 m three-mirror Korsch-type telescope and of two instruments, the visible imager and the near-infrared spectro-photometer, both covering a large common field-of-view enabling it to survey more than 35% of the entire sky. All sensor data are downlinked using K-band transmission and processed by a dedicated ground segment for science data processing. The Euclid data and catalogues will be made available to the public at the ESA Science Data Centre.

  9. EU Universities’ Mission Statements

    Directory of Open Access Journals (Sweden)

    Liudmila Arcimaviciene

    2015-04-01

    In the last 10 years, a highly productive space of metaphor analysis has been established in the discourse studies of media, politics, business, and education. In the theoretical framework of Conceptual Metaphor Theory and Critical Discourse Analysis, the restored metaphorical patterns are especially valued for their implied ideological value as realized both conceptually and linguistically. By using the analytical framework of Critical Metaphor Analysis and procedurally employing Pragglejaz Group’s Metaphor Identification Procedure, this study aims at analyzing the implied value of the evoked metaphors in the mission statements of the first 20 European Universities, according to the Webometrics ranking. In this article, it is proposed that Universities’ mission statements are based on the positive evaluation of the COMMERCE metaphor, which does not fully correlate with the ideological framework of sustainability education but is rather oriented toward consumerism in both education and society. Despite this overall trend, there are some traceable features of the conceptualization reflecting the sustainability approach to higher education, as related to freedom of speech, tolerance, and environmental concerns. Nonetheless, these are suppressed by the metaphoric usages evoking traditional dogmas of the conservative ideology grounded in the concepts of the transactional approach to relationship, competitiveness for superiority, the importance of self-interest and strength, and quantifiable quality.

  10. STS-78 Mission Insignia

    Science.gov (United States)

    1996-01-01

    The STS-78 patch links past with present to tell the story of its mission and science through a design imbued with the strength and vitality of the 2-dimensional art of North America's northwest coast Indians. Central to the design is the space Shuttle whose bold lines and curves evoke the Indian image for the eagle, a native American symbol of power and prestige as well as the national symbol of the United States. The wings of the Shuttle suggest the wings of the eagle whose feathers, indicative of peace and friendship in Indian tradition, are captured by the U forms, a characteristic feature of Northwest coast Indian art. The nose of the Shuttle is the strong downward curve of the eagle's beak, and the Shuttle's forward windows, the eagle's eyes, represented through the tapered S forms again typical of this Indian art form. The basic black and red atoms orbiting the mission number recall the original NASA emblem while beneath, utilizing Indian ovoid forms, the major mission scientific experiment package LMS (Life and Materials Sciences) housed in the Shuttle's cargo bay is depicted in a manner reminiscent of totem-pole art. This image of a bird poised for flight, so common to Indian art, is counterpointed by an equally familiar Tsimshian Indian symbol, a pulsating sun with long hyperbolic rays, the symbol of life. Within each of these rays are now encased crystals, the products of this mission's 3 major, high-temperature materials processing furnaces. And as the sky in Indian lore is a lovely open country, home of the Sun Chief and accessible to travelers through a hole in the western horizon, so too, space is a vast and beckoning landscape for explorers launched beyond the horizon. Beneath the Tsimshian sun, the colors of the earth limb are appropriately enclosed by a red border representing life to the Northwest coast Indians. The Indian colors of red, navy blue, white, and black pervade the STS-78 path. To the right of the Shuttle-eagle, the constellation

  11. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    Energy Technology Data Exchange (ETDEWEB)

    Lin, M; Feigenberg, S [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D surface image guided BH procedures were implemented and evaluated: normal-BH, taking a BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D surface tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process, based on the information provided by the 3D surface tracking system, for proper interventions (eliminate/re-setup/re-coach). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates were eliminated prior to CT scan because of BH positioning uncertainty > 2 mm. For > 90% of fractions, based on the setup deltas from the 3D surface tracking system, adjustments of the patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy was comparable to that of CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions, dropping thereafter to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8 ± 2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D surface imaging provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. DIBH setups appear to be more uncertain, and these are the patients who will benefit most from the additional information provided by 3D surface setup.
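
The action levels described above map a measured surface offset to an intervention. A minimal sketch of that decision logic, with purely illustrative thresholds (clinical action levels are site- and protocol-specific and are not given in the abstract):

```python
def classify_setup(delta_mm, action_level_mm=2.0, reset_level_mm=5.0):
    """Classify a surface-tracking setup delta against hypothetical action levels."""
    worst = max(abs(x) for x in delta_mm)  # worst-case deviation over the axes
    if worst <= action_level_mm:
        return "proceed"       # within tolerance, treat
    if worst <= reset_level_mm:
        return "re-coach"      # guide the patient back to the planned BH surface
    return "re-setup"          # re-position the patient before treating

# Deltas (mm) along the left-right / ant-post / sup-inf axes
print(classify_setup((0.5, -1.2, 0.8)))   # proceed
print(classify_setup((1.0, -3.4, 0.2)))   # re-coach
print(classify_setup((6.1, 2.0, -0.7)))   # re-setup
```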

  12. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    International Nuclear Information System (INIS)

    Lin, M; Feigenberg, S

    2015-01-01

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D surface image guided BH procedures were implemented and evaluated: normal-BH, taking a BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D surface tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process, based on the information provided by the 3D surface tracking system, for proper interventions (eliminate/re-setup/re-coach). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates were eliminated prior to CT scan because of BH positioning uncertainty > 2 mm. For > 90% of fractions, based on the setup deltas from the 3D surface tracking system, adjustments of the patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy was comparable to that of CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions, dropping thereafter to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8 ± 2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D surface imaging provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. DIBH setups appear to be more uncertain, and these are the patients who will benefit most from the additional information provided by 3D surface setup.

  13. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    Science.gov (United States)

    2012-04-24

    …system consists of providing maternal and child health assistance, conducting studies and confronting local health issues such as tuberculosis and… cooperate if messaging is clear, consistent, timely and authoritative. Additionally, the different communications requirements and decision making styles… the public will look to government and other authoritative sources for guidance. The public currently has no expectation of communications…

  14. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Munoz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesus; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaro, Miquel; Perez-Neira, Ana; Casellas, Ramon; Martinez, Ricardo; Nunez-Martinez, Jose; Requena Esteso, Manuel; Pubill, David; Font-Bach, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  15. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    International Nuclear Information System (INIS)

    Huang, L; Sarkar, V; Spiessens, S; Rassiah-Szegedi, P; Huang, Y; Salter, B; Zhao, H; Szegedi, M

    2014-01-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening-filter-free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high-dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Method: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG 0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7 MV FFF beam and required to satisfy RTOG 0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were developed straightforwardly and in timely fashion, without challenge or inefficiency, using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3mm) gamma pass rate for all arcs was 98.5 ± 1.1%, demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5 ± 1.7 min) were 50-70% lower than for the SGIMRT plans (26 ± 2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.
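
The (3%/3mm) gamma pass rate quoted above combines a dose-difference tolerance with a distance-to-agreement tolerance. A minimal 1D global-gamma sketch on hypothetical dose profiles (commercial QA systems such as Delta4 evaluate measured 3D dose distributions; this only illustrates the metric):

```python
import math

def gamma_index(ref_pos, ref_dose, meas_pos, meas_dose,
                dose_tol=0.03, dist_tol_mm=3.0):
    """1D global gamma: for each measured point, minimize the combined
    dose-difference / distance-to-agreement metric over reference points."""
    d_max = max(ref_dose)  # global normalization dose
    gammas = []
    for pm, dm in zip(meas_pos, meas_dose):
        g = min(
            math.sqrt(((pr - pm) / dist_tol_mm) ** 2 +
                      ((dr - dm) / (dose_tol * d_max)) ** 2)
            for pr, dr in zip(ref_pos, ref_dose)
        )
        gammas.append(g)
    return gammas

# Hypothetical profiles sampled every 1 mm, measured dose 1% high globally
ref_pos = list(range(11))
ref_dose = [100 - 2 * abs(x - 5) for x in ref_pos]
meas_dose = [d * 1.01 for d in ref_dose]
g = gamma_index(ref_pos, ref_dose, ref_pos, meas_dose)
pass_rate = 100.0 * sum(gi <= 1.0 for gi in g) / len(g)
print(f"pass rate: {pass_rate:.1f}%")   # → pass rate: 100.0%
```

A point passes when its gamma value is at most 1, i.e. it agrees with the reference within the combined dose/distance ellipse.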

  16. Web-based bioinformatics workflows for end-to-end RNA-seq data computation and analysis in agricultural animal species

    Science.gov (United States)

    Remarkable advances in next-generation sequencing (NGS) technologies, bioinformatics algorithms, and computational technologies have significantly accelerated genomic research. However, complicated NGS data analysis still remains as a major bottleneck. RNA-seq, as one of the major area in the NGS fi...

  17. Ex vivo proof-of-concept of end-to-end scaffold-enhanced laser-assisted vascular anastomosis of porcine arteries

    NARCIS (Netherlands)

    Pabittei, Dara R.; Heger, Michal; van Tuijl, Sjoerd; Simonet, Marc; de Boon, Wadim; van der Wal, Allard C.; Balm, Ron; de Mol, Bas A.

    2015-01-01

    The low welding strength of laser-assisted vascular anastomosis (LAVA) has hampered the clinical application of LAVA as an alternative to suture anastomosis. To improve welding strength, LAVA in combination with solder and polymeric scaffolds (ssLAVA) has been optimized in vitro. Currently, ssLAVA

  18. Operating performance of the gamma-ray Cherenkov telescope: An end-to-end Schwarzschild–Couder telescope prototype for the Cherenkov Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Dournaux, J.L., E-mail: jean-laurent.dournaux@obspm.fr [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); De Franco, A. [Department of Physics, University of Oxford, Keble Road, Oxford OX1 3RH (United Kingdom); Laporte, P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); White, R. [Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany); Greenshaw, T. [University of Liverpool, Oliver Lodge Laboratory, P.O. Box 147, Oxford Street, Liverpool L69 3BX (United Kingdom); Sol, H. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Abchiche, A. [CNRS, Division technique DT-INSU, 1 Place Aristide Briand, 92190 Meudon (France); Allan, D. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Amans, J.P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Armstrong, T.P. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Balzer, A.; Berge, D. [GRAPPA, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Boisson, C. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); and others

    2017-02-11

    The Cherenkov Telescope Array (CTA) consortium aims to build the next-generation ground-based very-high-energy gamma-ray observatory. The array will feature different sizes of telescopes allowing it to cover a wide gamma-ray energy band from about 20 GeV to above 100 TeV. The highest energies, above 5 TeV, will be covered by a large number of Small-Sized Telescopes (SSTs) with a field-of-view of around 9°. The Gamma-ray Cherenkov Telescope (GCT), based on Schwarzschild–Couder dual-mirror optics, is one of the three proposed SST designs. The GCT is described in this contribution and the first images of Cherenkov showers obtained using the telescope and its camera are presented. These were obtained in November 2015 in Meudon, France.

  19. Design and implementation of a secure and user-friendly broker platform supporting the end-to-end provisioning of e-homecare services.

    Science.gov (United States)

    Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart

    2010-01-01

    We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.

  20. End-to-End System Test of the Relative Precision and Stability of the Photometric Method for Detecting Earth-Size Extrasolar Planets

    Science.gov (United States)

    Dunham, Edward W.

    2000-01-01

    We developed the CCD camera system for the laboratory test demonstration and designed the optical system for this test. The camera system was delivered to Ames in April, 1999 with continuing support mostly in the software area as the test progressed. The camera system has been operating successfully since delivery. The optical system performed well during the test. The laboratory demonstration activity is now nearly complete and is considered to be successful by the Technical Advisory Group, which met on 8 February, 2000 at the SETI Institute. A final report for the Technical Advisory Group and NASA Headquarters will be produced in the next few months. This report will be a comprehensive report on all facets of the test including those covered under this grant. A copy will be forwarded, if desired, when it is complete.

  1. Niti CAR 27 Versus a Conventional End-to-End Anastomosis Stapler in a Laparoscopic Anterior Resection for Sigmoid Colon Cancer

    Science.gov (United States)

    Kwag, Seung-Jin; Kim, Jun-Gi; Kang, Won-Kyung; Lee, Jin-Kwon

    2014-01-01

    Purpose The Niti CAR 27 (ColonRing) uses compression to create an anastomosis. This study aimed to investigate the safety and the effectiveness of the anastomosis created with the Niti CAR 27 in a laparoscopic anterior resection for sigmoid colon cancer. Methods In a single-center study, 157 consecutive patients who received an operation between March 2010 and December 2011 were retrospectively assessed. The Niti CAR 27 (CAR group, 63 patients) colorectal anastomoses were compared with the conventional double-stapled (CDS group, 94 patients) colorectal anastomoses. Intraoperative, immediate postoperative and 6-month follow-up data were recorded. Results There were no statistically significant differences between the two groups in terms of age, gender, tumor location and other clinical characteristics. One patient (1.6%) in the CAR group and 2 patients (2.1%) in the CDS group experienced complications of anastomotic leakage (P = 0.647). These three patients underwent a diverting loop ileostomy. There were 2 cases (2.1%) of bleeding at the anastomosis site in the CDS group. All patients underwent a follow-up colonoscopy (median, 6 months). One patient in the CAR group experienced anastomotic stricture (1.6% vs. 0%; P = 0.401). This complication was solved by using balloon dilatation. Conclusion Anastomosis using the Niti CAR 27 device in a laparoscopic anterior resection for sigmoid colon cancer is safe and feasible. Its use is equivalent to that of the conventional double-stapler. PMID:24851217

  2. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

    Full Text Available Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements are that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to “…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers”. There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports, requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna and, to assist in this process, we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been obtained. The key to quality taxonomic work on macrofaunal animals from the tropical abyss is careful extraction of the animals (in cold, filtered seawater, microscopic observation and preservation of live specimens, from a variety of sampling devices by experienced zoologists at sea. Essential to the long-term iterative building of taxonomic knowledge from the CCZ is an “end-to-end” methodology for the taxonomic science that takes into account careful sampling design, at-sea taxonomic identification and fixation, post-cruise laboratory work with both DNA and morphology and finally a careful sample and data management pipeline that results in specimens and data in accessible open museum collections and online repositories.

  3. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    …with respect to the underlying assumptions, strengths and weaknesses of individual models. Furthermore, we describe how to possibly expand the framework to account for spatial impacts and economic consequences, for instance by linking to the individual-vessel based DISPLACE modeling approach. We conclude...

  4. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Hudgins, Andrew P. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carrillo, Ismael M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jin, Xin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Simmins, John [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States)

    2018-02-21

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices and high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the resources reporting to a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.
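
The capability-based abstraction described above can be sketched in a few lines; all class and method names here are hypothetical illustrations, not the API of the EPRI software or of OpenADR:

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    capabilities: set          # e.g. {"var_support", "shed_load"}

    def dispatch(self, capability, amount):
        # A real resource would translate this into a protocol message
        # (e.g. CIM- or OpenADR-based); here we just report the action.
        return f"{self.name}: {capability} -> {amount}"

class Adms:
    """Registers resources by capability, not by device type."""
    def __init__(self):
        self._by_capability = {}

    def register(self, res):
        for cap in res.capabilities:
            self._by_capability.setdefault(cap, []).append(res)

    def handle_event(self, capability, amount):
        # Fan the request out to every resource advertising the capability.
        return [r.dispatch(capability, amount)
                for r in self._by_capability.get(capability, [])]

adms = Adms()
adms.register(Resource("battery-1", {"var_support", "shed_load"}))
adms.register(Resource("hvac-demand-response", {"shed_load"}))
# Both resources advertising "shed_load" receive the request
print(adms.handle_event("shed_load", "50 kW"))
```

Because the ADMS only tracks capabilities, new resource types can be added without changing the event-handling code, which is what allows the architecture to scale horizontally and vertically.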

  5. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    Full Text Available Palm oil represents the most efficient oilseed crop in the world but the production of palm oil involves plantation operations in one of the most fragile environments - the tropical lowlands. Deforestation, the drying-out of swampy lowlands and chemical fertilizers lead to environmental problems that are putting pressure on this industry. Unmanned aircraft systems (UAS together with latest photogrammetric processing and image analysis capabilities represent an emerging technology that was identified to be suitable to optimize oil palm plantation operations. This paper focuses on two key elements of a UAS-based oil palm monitoring system. The first is the accuracy of the acquired data that is necessary to achieve meaningful results in later analysis steps. High performance GNSS technology was utilized to achieve those accuracies while decreasing the demand for cost-intensive GCP measurements. The second key topic is the analysis of the resulting data in order to optimize plantation operations. By automatically extracting information on a block level as well as on a single-tree level, operators can utilize the developed application to increase their productivity. The research results describe how operators can successfully make use of a UAS-based solution together with the developed software solution to improve their efficiency in oil palm plantation management.

  6. NPP Information Model as an Innovative Approach to End-to-End Lifecycle Management of the NPP and Nuclear Knowledge Management Proven in Russia

    International Nuclear Information System (INIS)

    Tikhonovsky, V.; Kanischev, A.; Kononov, V.; Salnikov, N.; Shkarin, A.; Dorobin, D.

    2016-01-01

    Full text: Managing engineering data for an industrial facility, including integration and maintenance of all engineering and technical data, ensuring fast and convenient access to that information and its analysis, proves to be necessary in order to perform the following tasks: 1) to increase economic efficiency of the plant during its lifecycle, including the decommissioning stage; 2) to ensure strict adherence to industrial safety requirements, radiation safety requirements (in case of nuclear facilities) and environmental safety requirements during operation (including refurbishment and restoration projects) and decommissioning. While performing tasks 1) and 2), one faces a range of challenges: 1. A huge amount of information describing the plant configuration. 2. Complexity of engineering procedures, step-by-step commissioning and significant geographical distribution of industrial infrastructure. 3. High importance of plant refurbishment projects. 4. The need to ensure comprehensive knowledge transfer between different generations of operational personnel and, which is especially important for the nuclear energy industry, between the commissioning personnel generations. The NPP information model is an innovative method of NPP knowledge management throughout the whole plant lifecycle. It is an integrated database with all NPP technical engineering information (design, construction, operation, diagnosing, maintenance, refurbishment). (author)

  7. IMP - INTEGRATED MISSION PROGRAM

    Science.gov (United States)

    Dauro, V. A.

    1994-01-01

    IMP is a simulation language that is used to model missions around the Earth, Moon, Mars, or other planets. It has been used to model missions for the Saturn Program, Apollo Program, Space Transportation System, Space Exploration Initiative, and Space Station Freedom. IMP allows a user to control the mission being simulated through a large event/maneuver menu. Up to three spacecraft may be used: a main, a target and an observer. The simulation may begin at liftoff, suborbital, or orbital. IMP incorporates a Fehlberg seventh order, thirteen evaluation Runge-Kutta integrator with error and step-size control to numerically integrate the equations of motion. The user may choose oblate or spherical gravity for the central body (Earth, Mars, Moon or other) while a spherical model is used for the gravity of an additional perturbing body. Sun gravity and pressure and Moon gravity effects are user-selectable. Earth/Mars atmospheric effects can be included. The optimum thrust guidance parameters are calculated automatically. Events/maneuvers may involve many velocity changes, and these velocity changes may be impulsive or of finite duration. Aerobraking to orbit is also an option. Other simulation options include line-of-sight communication guidelines, a choice of propulsion systems, a soft landing on the Earth or Mars, and rendezvous with a target vehicle. The input/output is in metric units, with the exception of thrust and weight which are in English units. Input is read from the user's input file to minimize real-time keyboard input. Output includes vehicle state, orbital and guide parameters, event and total velocity changes, and propellant usage. The main output is to the user defined print file, but during execution, part of the input/output is also displayed on the screen. An included FORTRAN program, TEKPLOT, will display plots on the VDT as well as generating a graphic file suitable for output on most laser printers. The code is double precision. 
IMP is written in
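
The Fehlberg integrator described above pairs two Runge-Kutta estimates of different order and uses their difference to control the step size. A minimal sketch of that accept/reject logic using a low-order embedded pair (Euler/Heun) rather than Fehlberg's 13-evaluation 7(8) scheme:

```python
import math

def integrate_adaptive(f, t, y, t_end, h=0.1, tol=1e-6):
    """Embedded Euler/Heun pair with error and step-size control.

    IMP uses a 13-evaluation Fehlberg 7(8) pair; this low-order pair
    only illustrates the accept/reject and step-size logic.
    """
    while t < t_end:
        h = min(h, t_end - t)                 # don't step past the end time
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_low = y + h * k1                    # Euler estimate (order 1)
        y_high = y + 0.5 * h * (k1 + k2)      # Heun estimate (order 2)
        err = abs(y_high - y_low)             # embedded error estimate
        if err <= tol:                        # accept the step
            t, y = t + h, y_high
        # grow or shrink the step toward the error target (clamped)
        h *= min(2.0, max(0.1, 0.9 * math.sqrt(tol / max(err, 1e-16))))
    return y

# dy/dt = -y, y(0) = 1  ->  y(1) = e^-1
y1 = integrate_adaptive(lambda t, y: -y, 0.0, 1.0, 1.0, tol=1e-8)
print(abs(y1 - math.exp(-1)) < 1e-5)   # True
```

Rejected steps are simply retried with the reduced step size, which is how such integrators keep the local truncation error near the requested tolerance.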

  8. The Waste Negotiator's mission

    International Nuclear Information System (INIS)

    Bataille, Christian

    1993-01-01

    The mission of the Waste Negotiator is to seek out sites for deep underground laboratories to study their potential for disposal of high level radioactive waste. Although appointed by the government, he acts independently. In 1990, faced by severe public criticism at the way that the waste disposal was being handled, and under increasing pressure to find an acceptable solution, the government stopped the work being carried out by ANDRA (Agence nationale pour la gestion des dechets radioactifs) and initiated a full review of the issues involved. At the same time, parliament also started its own extensive investigation to find a way forward. These efforts finally led to the provision of a detailed framework for the management of long lived radioactive waste, including the construction of two laboratories to investigate possible repository sites. The Waste Negotiator was appointed to carry out a full consultative process in the communities which are considering accepting an underground laboratory. (Author)

  9. STS-40 Mission Insignia

    Science.gov (United States)

    1990-01-01

    The STS-40 patch makes a contemporary statement focusing on human beings living and working in space. Against a background of the universe, seven silver stars, interspersed about the orbital path of Columbia, represent the seven crew members. The orbiter's flight path forms a double-helix, designed to represent the DNA molecule common to all living creatures. In the words of a crew spokesman, ...(the helix) affirms the ceaseless expansion of human life and American involvement in space while simultaneously emphasizing the medical and biological studies to which this flight is dedicated. Above Columbia, the phrase Spacelab Life Sciences 1 defines both the Shuttle mission and its payload. Leonardo Da Vinci's Vitruvian man, silhouetted against the blue darkness of the heavens, is in the upper center portion of the patch. With one foot on Earth and arms extended to touch the Shuttle's orbit, the crew feels, he serves as a powerful embodiment of the extension of human inquiry from the boundaries of Earth to the limitless laboratory of space. Sturdily poised amid the stars, he serves to link scientists on Earth to the scientists in space, asserting the harmony of efforts which produce meaningful scientific spaceflight missions. A brilliant red and yellow Earth limb (center) links Earth to space as it radiates from a native American symbol for the sun. At the frontier of space, the traditional symbol for the sun vividly links America's past to America's future, the crew states. Beneath the orbiting Shuttle, darkness of night rests peacefully over the United States. Drawn by artist Sean Collins, the STS-40 Space Shuttle patch was designed by the crewmembers for the flight.

  10. NASA CYGNSS Mission Overview

    Science.gov (United States)

    Ruf, C. S.; Balasubramaniam, R.; Gleason, S.; McKague, D. S.; O'Brien, A.

    2017-12-01

    The CYGNSS constellation of eight satellites was successfully launched on 15 December 2016 into a low inclination (tropical) Earth orbit. Each satellite carries a four-channel bi-static radar receiver that measures GPS signals scattered by the ocean, from which ocean surface roughness, near surface wind speed, and air-sea latent heat flux are estimated. The measurements are unique in several respects, most notably in their ability to penetrate through all levels of precipitation, made possible by the low frequency at which GPS operates, and in the frequent sampling of tropical cyclone intensification and of the diurnal cycle of winds, made possible by the large number of satellites. Engineering commissioning of the constellation was successfully completed in March 2017 and the mission is currently in the early phase of science operations. Level 2 science data products have been developed for near surface (10 m referenced) ocean wind speed, ocean surface roughness (mean square slope) and latent heat flux. Level 3 gridded versions of the L2 products have also been developed. A set of Level 4 products have also been developed specifically for direct tropical cyclone overpasses. These include the storm intensity (peak sustained winds) and size (radius of maximum winds), its extent (34, 50 and 64 knot wind radii), and its integrated kinetic energy. Assimilation of CYGNSS L2 wind speed data into the HWRF hurricane weather prediction model has also been developed. An overview and the current status of the mission will be presented, together with highlights of early on-orbit performance and scientific results.

  11. Scientific Challenges for a New X-ray Timing Mission

    International Nuclear Information System (INIS)

    Lamb, Frederick K.

    2004-01-01

    The Rossi X-ray Timing Explorer (RXTE) is an immensely successful mission of exploration and discovery. It has discovered a wealth of rapid X-ray variability phenomena that can be used to address fundamental questions concerning the properties of dense matter and strong gravitational fields as well as important astrophysical questions. It has answered many questions and is likely to answer many more, but to follow up fully on the major discoveries RXTE has made will require a new X-ray timing mission with greater capabilities. This introduction to the present volume describes briefly the advantages of X-ray timing measurements for determining the properties of dense matter and strong gravitational fields, indicates some of the key scientific questions that can be addressed using X-ray timing, and summarizes selected achievements of the RXTE mission. It concludes by citing some of the scientific capabilities a proposed follow-on mission will need in order to be successful.

  12. Artificial intelligence in a mission operations and satellite test environment

    Science.gov (United States)

    Busse, Carl

    1988-01-01

    A Generic Mission Operations System using Expert System technology to demonstrate the potential of Artificial Intelligence (AI) automated monitor and control functions in a Mission Operations and Satellite Test environment will be developed at the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL). Expert system techniques in a real-time operation environment are being studied and applied to science and engineering data processing. Advanced decommutation schemes and intelligent display technology will be examined to develop imaginative improvements in rapid interpretation and distribution of information. The Generic Payload Operations Control Center (GPOCC) will demonstrate improved data handling accuracy, flexibility, and responsiveness in a complex mission environment. The ultimate goal is to automate repetitious mission operations, instrument, and satellite test functions by the application of expert system technology and artificial intelligence resources and to enhance the level of man-machine sophistication.

  13. The Messenger Mission to Mercury

    CERN Document Server

    Domingue, D. L

    2007-01-01

    NASA’s MESSENGER mission, launched on 3 August 2004, is the seventh mission in the Discovery series. MESSENGER encounters the planet Mercury four times, culminating with an insertion into orbit on 18 March 2011. It carries a comprehensive package of geophysical, geological, geochemical, and space environment experiments to complete the complex investigations of this solar-system end member, which began with Mariner 10. The articles in this book, written by the experts in each area of the MESSENGER mission, describe the mission, spacecraft, scientific objectives, and payload. The book is of interest to all potential users of the data returned by the MESSENGER mission, to those studying the nature of the planet Mercury, and to all those interested in the design and implementation of planetary exploration missions.

  14. Rapid shallow breathing

    Science.gov (United States)

    Tachypnea; Breathing - rapid and shallow; Fast shallow breathing; Respiratory rate - rapid and shallow ... Shallow, rapid breathing has many possible medical causes, including: Asthma Blood clot in an artery in the ...

  15. Lunar Exploration Missions Since 2006

    Science.gov (United States)

    Lawrence, S. J. (Editor); Gaddis, L. R.; Joy, K. H.; Petro, N. E.

    2017-01-01

    The announcement of the Vision for Space Exploration in 2004 sparked a resurgence in lunar missions worldwide. Since the publication of the first "New Views of the Moon" volume, as of 2017 there have been 11 science-focused missions to the Moon. Each of these missions explored different aspects of the Moon's geology, environment, and resource potential. The results from this flotilla of missions have revolutionized lunar science, and resulted in a profoundly new emerging understanding of the Moon. The New Views of the Moon II initiative itself, which is designed to engage the large and vibrant lunar science community to integrate the results of these missions into new consensus viewpoints, is a direct outcome of this impressive array of missions. The "Lunar Exploration Missions Since 2006" chapter will "set the stage" for the rest of the volume, introducing the planetary community at large to the diverse array of missions that have explored the Moon in the last decade. Content: This chapter will encompass the following missions: Kaguya; ARTEMIS (Acceleration, Reconnection, Turbulence, and Electrodynamics of the Moon’s Interaction with the Sun); Chang’e-1; Chandrayaan-1; Moon Impact Probe; Lunar Reconnaissance Orbiter (LRO); Lunar Crater Observation Sensing Satellite (LCROSS); Chang’e-2; Gravity Recovery and Interior Laboratory (GRAIL); Lunar Atmosphere and Dust Environment Explorer (LADEE); Chang’e-3.

  16. IRIS Mission Operations Director's Colloquium

    Science.gov (United States)

    Carvalho, Robert; Mazmanian, Edward A.

    2014-01-01

    Pursuing the Mysteries of the Sun: The Interface Region Imaging Spectrograph (IRIS) Mission. Flight controllers from the IRIS mission will present their individual experiences on IRIS from development through the first year of flight. This will begin with a discussion of the unique nature of IRIS's mission and science, and how it fits into NASA's fleet of solar observatories. Next will be a discussion of the critical roles Ames contributed to the mission, including spacecraft and flight software development, ground system development, and training for launch. This will be followed by experiences from launch, early operations, ongoing operations, and unusual operations experiences. The presentation will close with IRIS science imagery and questions.

  17. Bomber Deterrence Missions: Criteria To Evaluate Mission Effectiveness

    Science.gov (United States)

    2016-02-16

    international security, the practice of general deterrence usually occurs when nations feel insecure, suspicious, or even hostile towards them but...both a deterrence and assurance mission even though it was not planned or advertised as such. Since the intent of this mission was partly perceived

  18. Simulation of Mission Phases

    Science.gov (United States)

    Carlstrom, Nicholas Mercury

    2016-01-01

    This position with the Simulation and Graphics Branch (ER7) at Johnson Space Center (JSC) provided an introduction to vehicle hardware, mission planning, and simulation design. ER7 supports engineering analysis and flight crew training by providing high-fidelity, real-time graphical simulations in the Systems Engineering Simulator (SES) lab. The primary project assigned by NASA mentor and SES lab manager, Meghan Daley, was to develop a graphical simulation of the rendezvous, proximity operations, and docking (RPOD) phases of flight. The simulation is to include a generic crew/cargo transportation vehicle and a target object in low-Earth orbit (LEO). Various capsule, winged, and lifting body vehicles as well as historical RPOD methods were evaluated during the project analysis phase. JSC's core mission to support the International Space Station (ISS), Commercial Crew Program (CCP), and Human Space Flight (HSF) influenced the project specifications. The simulation is characterized as a 30 meter +V Bar and/or -R Bar approach to the target object's docking station. The ISS was selected as the target object and the international Low Impact Docking System (iLIDS) was selected as the docking mechanism. The location of the target object's docking station corresponds with the RPOD methods identified. The simulation design focuses on Guidance, Navigation, and Control (GNC) system architecture models with station keeping and telemetry data processing capabilities. The optical and inertial sensors, reaction control system thrusters, and the docking mechanism selected were based on CCP vehicle manufacturers' current and proposed technologies. A significant amount of independent study and tutorial completion was required for this project. Multiple primary source materials were accessed using the NASA Technical Report Server (NTRS), and reference textbooks were borrowed from the JSC Main Library and International Space Station Library. The Trick Simulation Environment and User

  19. Business analysis: The commercial mission of the International Asteroid Mission

    Science.gov (United States)

    The mission of the International Asteroid Mission (IAM) is providing asteroidal resources to support activities in space. The short term goal is to initiate IAM by mining a near-Earth, hydrous carbonaceous chondrite asteroid to service the nearer-term market of providing cryogenic rocket fuel in low lunar orbit (LLO). The IAM will develop and contract for the building of the transportation vehicles and equipment necessary for this undertaking. The long-term goal is to expand operations by exploiting asteroids in other manners, as these options become commercially viable. The primary business issues are what revenue can be generated from the baseline mission, how much will the mission cost, and how funding for this mission can be raised. These issues are addressed.

  20. The Impact of Mission Duration on a Mars Orbital Mission

    Science.gov (United States)

    Arney, Dale; Earle, Kevin; Cirillo, Bill; Jones, Christopher; Klovstad, Jordan; Grande, Melanie; Stromgren, Chel

    2017-01-01

    Performance alone is insufficient to assess the total impact of changing mission parameters on a space mission concept, architecture, or campaign; the benefit, cost, and risk must also be understood. This paper examines the impact to benefit, cost, and risk of changing the total mission duration of a human Mars orbital mission. The changes in the sizing of the crew habitat, including consumables and spares, were assessed as a function of duration, including trades of different life support strategies; this was used to assess the impact on transportation system requirements. The impact to benefit is minimal, while the impact on cost is dominated by the increases in transportation costs to achieve shorter total durations. The risk is expected to be reduced by decreasing total mission duration; however, large uncertainty exists around the magnitude of that reduction.

  1. Hipparcos: mission accomplished

    Science.gov (United States)

    1993-08-01

    During the last few months of its life, as the high radiation environment to which the satellite was exposed took its toll on the on-board system, Hipparcos was operated with only two of the three gyroscopes normally required for such a satellite, following an ambitious redesign of the on-board and on-ground systems. Plans were in hand to operate the satellite without gyroscopes at all, and the first such "gyro- less" data had been acquired, when communication failure with the on-board computers on 24 June 1993 put an end to the relentless flow of 24000 bits of data that have been sent down from the satellite each second, since launch. Further attempts to continue operations proved unsuccessful, and after a short series of sub-systems tests, operations were terminated four years and a week after launch. An enormous wealth of scientific data was gathered by Hipparcos. Even though data analysis by the scientific teams involved in the programme is not yet completed, it is clear that the mission has been an overwhelming success. "The ESA advisory bodies took a calculated risk in selecting this complex but fundamental programme" said Dr. Roger Bonnet, ESA's Director of Science, "and we are delighted to have been able to bring it to a highly successful conclusion, and to have contributed unique information that will take a prominent place in the history and development of astrophysics". Extremely accurate positions of more than one hundred thousand stars, precise distance measurements (in most cases for the first time), and accurate determinations of the stars' velocity through space have been derived. The resulting HIPPARCOS Star Catalogue, expected to be completed in 1996, will be of unprecedented accuracy, achieving results some 10-100 times more accurate than those routinely determined from ground-based astronomical observatories. 
A further star catalogue, the Tycho Star Catalogue of more than a million stars, is being compiled from additional data accumulated by the

  2. Status of the Megha-Tropiques Mission

    Science.gov (United States)

    Gosset, M.; Roca, R.; French Megha-Tropiques Science Team

    2011-12-01

    The Megha-Tropiques mission is an Indo-French mission built by the Centre National d'Études Spatiales and the Indian Space Research Organisation, due to launch in September 2011. Megha means cloud in Sanskrit and Tropiques is the French for tropics. The major innovation of MT is to bring together a suite of complementary instruments on a dedicated orbit that strongly improves the sampling of the water cycle elements. Indeed, the low inclination on the equator (20°) combined with the elevated height of the orbit (865 km) provides unique observing capabilities, with up to 6 overpasses per day. The scientific objectives of the mission concern i) the atmospheric energy budget in the inter-tropical zone and at system scale (radiation, latent heat, ...), ii) the life cycle of Mesoscale Convective Complexes in the Tropics (over oceans and continents), and iii) monitoring and assimilation for forecasting of cyclones, monsoons, and meso-scale convective systems. These scientific objectives are achieved thanks to the following payload: SCARAB, a wide-band instrument for inferring longwave and shortwave outgoing fluxes at the top of the atmosphere (cross-track scanning, 40 km resolution at nadir); SAPHIR, a microwave sounder for water vapour sounding, with 6 channels in the water vapour absorption band at 183.31 GHz (cross-track, 10 km); and MADRAS, a microwave imager for precipitation, with channels at 18, 23, 37, 89 and 157 GHz, H and V polarisations (conical swath, <10 km to 40 km). In this presentation, a rapid overview of the mission will be given as well as a first status depending on the actual launch of the satellite.
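
    The revisit benefit quoted above can be checked with a back-of-the-envelope orbit calculation. The sketch below is illustrative, not part of the mission record; Earth's radius and gravitational parameter are assumed standard values. It shows why an 865 km circular orbit completes roughly 14 revolutions per day, which the low 20° inclination concentrates over the tropics:

    ```python
    import math

    MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter (assumed value)
    R_EARTH = 6371.0        # km, mean Earth radius (assumed value)

    def orbits_per_day(altitude_km: float) -> float:
        """Revolutions per day for a circular orbit at the given altitude (Kepler's third law)."""
        a = R_EARTH + altitude_km                            # semi-major axis, km
        period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)  # orbital period, s
        return 86400.0 / period_s

    # Megha-Tropiques: 865 km circular orbit
    print(round(orbits_per_day(865.0), 1))  # roughly 14 revolutions per day
    ```

    With ~14 equator crossings per day confined to a ±20° band, a given tropical site can be seen on several successive orbits, consistent with the "up to 6 overpasses per day" figure in the abstract.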

  3. The AGILE Mission

    CERN Document Server

    Tavani, M.; Argan, A.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.W.; Chen, A.W.; Cocco, V.; Costa, E.; D'Ammando, F.; Del Monte, E.; De Paris, G.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Evangelista, Y.; Feroci, M.; Ferrari, A.; Fiorini, M.; Fornari, F.; Fuschino, F.; Froysland, T.; Frutti, M.; Galli, M.; Gianotti, F.; Giuliani, A.; Labanti, C.; Lapshov, I.; Lazzarotto, F.; Liello, F.; Lipari, P.; Longo, F.; Mattaini, E.; Marisaldi, M.; Mastropietro, M.; Mauri, A.; Mauri, F.; Mereghetti, S.; Morelli, E.; Morselli, A.; Pacciani, L.; Pellizzoni, A.; Perotti, F.; Piano, G.; Picozza, P.; Pontoni, C.; Porrovecchio, G.; Prest, M.; Pucella, G.; Rapisarda, M.; Rappoldi, A.; Rossi, E.; Rubini, A.; Soffitta, P.; Traci, A.; Trifoglio, M.; Trois, A.; Vallazza, E.; Vercellone, S.; Vittorini, V.; Zambra, A.; Zanello, D.; Pittori, C.; Preger, B.; Santolamazza, P.; Verrecchia, F.; Giommi, P.; Colafrancesco, S.; Antonelli, A.; Cutini, S.; Gasparrini, D.; Stellato, S.; Fanari, G.; Primavera, R.; Tamburelli, F.; Viola, F.; Guarrera, G.; Salotti, L.; D'Amico, F.; Marchetti, E.; Crisconio, M.; Sabatini, P.; Annoni, G.; Alia, S.; Longoni, A.; Sanquerin, R.; Battilana, M.; Concari, P.; Dessimone, E.; Grossi, R.; Parise, A.; Monzani, F.; Artina, E.; Pavesi, R.; Marseguerra, G.; Nicolini, L.; Scandelli, L.; Soli, L.; Vettorello, V.; Zardetto, E.; Bonati, A.; Maltecca, L.; D'Alba, E.; Patane, M.; Babini, G.; Onorati, F.; Acquaroli, L.; Angelucci, M.; Morelli, B.; Agostara, C.; Cerone, M.; Michetti, A.; Tempesta, P.; D'Eramo, S.; Rocca, F.; Giannini, F.; Borghi, G.; Garavelli, B.; Conte, M.; Balasini, M.; Ferrario, I.; Vanotti, M.; Collavo, E.; Giacomazzo, M.

    2008-01-01

    AGILE is an Italian Space Agency mission dedicated to the observation of the gamma-ray Universe. The AGILE very innovative instrumentation combines for the first time a gamma-ray imager (sensitive in the energy range 30 MeV - 50 GeV), a hard X-ray imager (sensitive in the range 18-60 keV) together with a Calorimeter (sensitive in the range 300 keV - 100 MeV) and an anticoincidence system. AGILE was successfully launched on April 23, 2007 from the Indian base of Sriharikota and was inserted in an equatorial orbit with a very low particle background. AGILE provides crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, pulsars, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. An optimal angular resolution (reaching 0.1-0.2 degrees in gamma-rays, 1-2 arcminutes in hard X-rays) and very large fields of view (2.5 sr and 1 sr, respectively) are obtained by the use of Silicon detectors integrated in a very compa...

  4. STS-68 Mission Insignia

    Science.gov (United States)

    1994-01-01

    This STS-68 patch was designed by artist Sean Collins. Exploration of Earth from space is the focus of the design of the insignia, the second flight of the Space Radar Laboratory (SRL-2). SRL-2 was part of NASA's Mission to Planet Earth (MTPE) project. The world's land masses and oceans dominate the center field, with the Space Shuttle Endeavour circling the globe. The SRL-2 letters span the width and breadth of planet Earth, symbolizing worldwide coverage of the two prime experiments of STS-68: The Shuttle Imaging Radar-C and X-Band Synthetic Aperture Radar (SIR-C/X-SAR) instruments; and the Measurement of Air Pollution from Satellites (MAPS) sensor. The red, blue, and black colors of the insignia represent the three operating wavelengths of SIR-C/X-SAR, and the gold band surrounding the globe symbolizes the atmospheric envelope examined by MAPS. The flags of international partners Germany and Italy are shown opposite Endeavour. The relationship of the Orbiter to Earth highlights the usefulness of human space flights in understanding Earth's environment, and the monitoring of its changing surface and atmosphere. In the words of the crew members, the soaring Orbiter also typifies the excellence of the NASA team in exploring our own world, using the tools which the Space Program developed to explore the other planets in the solar system.

  5. Draft Mission Plan Amendment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-09-01

    The Department of Energy's Office of Civilian Radioactive Waste Management has prepared this document to report plans for the Civilian Radioactive Waste Management Program, whose mission is to manage and dispose of the nation's spent fuel and high-level radioactive waste in a manner that protects the health and safety of the public and of workers and the quality of the environment. The Congress established this program through the Nuclear Waste Policy Act of 1982. Specifically, the Congress directed us to isolate these wastes in geologic repositories constructed in suitable rock formations deep beneath the surface of the earth. In the Nuclear Waste Policy Amendments Act of 1987, the Congress mandated that only one repository was to be developed at present and that only the Yucca Mountain candidate site in Nevada was to be characterized at this time. The Amendments Act also authorized the construction of a facility for monitored retrievable storage (MRS) and established the Office of the Nuclear Waste Negotiator and the Nuclear Waste Technical Review Board. After a reassessment in 1989, the Secretary of Energy restructured the program, focusing the repository effort on scientific evaluations of the Yucca Mountain candidate site, deciding to proceed with the development of an MRS facility, and strengthening the management of the program. 48 refs., 32 figs.

  6. NASA's interstellar probe mission

    International Nuclear Information System (INIS)

    Liewer, P.C.; Ayon, J.A.; Wallace, R.A.; Mewaldt, R.A.

    2000-01-01

    NASA's Interstellar Probe will be the first spacecraft designed to explore the nearby interstellar medium and its interaction with our solar system. As envisioned by NASA's Interstellar Probe Science and Technology Definition Team, the spacecraft will be propelled by a solar sail to reach >200 AU in 15 years. Interstellar Probe will investigate how the Sun interacts with its environment and will directly measure the properties and composition of the dust, neutrals and plasma of the local interstellar material which surrounds the solar system. In the mission concept developed in the spring of 1999, a 400-m diameter solar sail accelerates the spacecraft to ∼15 AU/year, roughly 5 times the speed of Voyager 1 and 2. The sail is used to first bring the spacecraft to ∼0.25 AU to increase the radiation pressure before heading out in the interstellar upwind direction. After jettisoning the sail at ∼5 AU, the spacecraft coasts to 200-400 AU, exploring the Kuiper Belt, the boundaries of the heliosphere, and the nearby interstellar medium
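
    The timeline in the abstract is easy to verify with the quoted cruise speed: at ~15 AU/year, the coast to the 200-400 AU region takes on the order of 13-27 years. A minimal sketch (the 15 AU/yr figure is from the abstract; the helper name is illustrative and ignores the brief sail-acceleration phase):

    ```python
    def years_to_distance(target_au: float, cruise_au_per_year: float = 15.0) -> float:
        """Coast time to a heliocentric distance at constant cruise speed."""
        return target_au / cruise_au_per_year

    print(round(years_to_distance(200.0), 1))  # ~13.3 years, consistent with ">200 AU in 15 years"
    print(round(years_to_distance(400.0), 1))  # ~26.7 years to the outer end of the coast
    ```

    The same arithmetic recovers the Voyager comparison: 15 AU/yr is roughly 5 times Voyager 1's ~3 AU/yr escape speed.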

  7. Draft Mission Plan Amendment

    International Nuclear Information System (INIS)

    1991-09-01

    The Department of Energy's Office of Civilian Radioactive Waste Management has prepared this document to report plans for the Civilian Radioactive Waste Management Program, whose mission is to manage and dispose of the nation's spent fuel and high-level radioactive waste in a manner that protects the health and safety of the public and of workers and the quality of the environment. The Congress established this program through the Nuclear Waste Policy Act of 1982. Specifically, the Congress directed us to isolate these wastes in geologic repositories constructed in suitable rock formations deep beneath the surface of the earth. In the Nuclear Waste Policy Amendments Act of 1987, the Congress mandated that only one repository was to be developed at present and that only the Yucca Mountain candidate site in Nevada was to be characterized at this time. The Amendments Act also authorized the construction of a facility for monitored retrievable storage (MRS) and established the Office of the Nuclear Waste Negotiator and the Nuclear Waste Technical Review Board. After a reassessment in 1989, the Secretary of Energy restructured the program, focusing the repository effort on scientific evaluations of the Yucca Mountain candidate site, deciding to proceed with the development of an MRS facility, and strengthening the management of the program. 48 refs., 32 figs.

  8. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to clean up the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives, and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a ''capstone'' team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program. The results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in the Liquid Effluents FY 1995 Multi-Year Program Plan.

  9. STS-51J Mission Insignia

    Science.gov (United States)

    1985-01-01

    The 51-J mission insignia, designed by Atlantis's first crew, pays tribute to the Statue of Liberty and the ideas it symbolizes. The historical gateway figure bears additional significance for Astronauts Karol J. Bobko, mission commander, and Ronald J. Grabe, pilot, both New York natives.

  10. Pastoral ministry in a missional age: Towards a practical theological understanding of missional pastoral care

    Directory of Open Access Journals (Sweden)

    Guillaume H. Smit

    2015-03-01

    This article concerns itself with the development of a missional ecclesiology and the practices that may accept the challenge of conducting pastoral ministry in the context of South African, middle-class congregations adapting to a rapidly changing, post-apartheid environment. Some practical theological perspectives on pastoral counselling are investigated, whilst Narrative Therapy is explored as an emerging theory of deconstruction to enable the facilitating of congregational change towards a missional understanding of church life in local communities. Subsequently, the theological paradigm of missional ecclesiology is investigated before drawing the broad lines of a theory for pastoral ministry within missional ecclesiology. Intradisciplinary and/or interdisciplinary implications: In this article, a missional base theory is proposed for pastoral counselling, consisting of interdisciplinary insights gained from the fields of Missiology, Practical Theology, Narrative Therapy and Cognitive Behaviour Therapy. The implications of this proposal for the development of a missional pastoral theory focus on the following three aspects: re-establishing pastoral identity (exploring Christ); pastoral development (intentional faith formation); and pastoral ministry (enabling Christ-centred lives). In such a missional pastoral theory four practices should be operationalised: first of all, a cognitive approach to increasing knowledge of the biblical narrative is necessary. This provides the hermeneutical skills necessary to enable people to internalise the biblical ethics and character traits ascribed to the Christian life. Secondly, a pastoral theory needs to pay close attention to the development of emotional intelligence. Thirdly, this should be done in the context of small groups, where the focus falls on the personality development of members. Finally, missional pastoral theory should also include the acquisition of life coaching skills, where leaders can be

  11. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, and the application of the Virtual Mission to not only Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end-to-end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and, after refining the process models via the human-in-the-loop simulation, the

  12. GRACE Status at Mission End

    Science.gov (United States)

    Tapley, B. D.; Flechtner, F. M.; Watkins, M. M.; Bettadpur, S. V.

    2017-12-01

    The twin satellites of the Gravity Recovery and Climate Experiment (GRACE) were launched on March 17, 2002 and have operated for nearly 16 years. The mission objectives are to observe the spatial and temporal variations of the Earth's mass through its effects on the gravity field at the GRACE satellite altitude. The mass changes observed are related both to changes within the solid earth and to changes within and between the Earth system components. A significant cause of the time-varying mass is water motion, and the GRACE mission has provided a continuous decade-long measurement sequence which characterizes the seasonal cycle of mass transport between the oceans, land, cryosphere and atmosphere; its inter-annual variability; and the climate-driven secular, or long period, mass transport signals. The fifth reanalysis of the mission data set, the RL05 data, was released in mid-2013. With the planned launch of GRACE Follow-On in early 2018, plans are underway for a reanalysis that will be consistent with the GRACE FO processing standards. The mission is entering the final phases of its operational life, with mission end expected to occur in early 2018. The current mission operations strategy emphasizes extending the mission lifetime to obtain an overlap with GRACE FO. This presentation will review the mission status and the projections for mission lifetime, describe the current operations philosophy and its impact on the science data, discuss the issues related to achieving the GRACE and GRACE FO connection, and discuss issues related to science data products during this phase of the mission period.

  13. The Magnetospheric Multiscale Mission

    Science.gov (United States)

    Burch, James

    Magnetospheric Multiscale (MMS), a NASA four-spacecraft mission scheduled for launch in November 2014, will investigate magnetic reconnection in the boundary regions of the Earth’s magnetosphere, particularly along its dayside boundary with the solar wind and the neutral sheet in the magnetic tail. Among the important questions about reconnection that will be addressed are the following: Under what conditions can magnetic-field energy be converted to plasma energy by the annihilation of magnetic field through reconnection? How does reconnection vary with time, and what factors influence its temporal behavior? What microscale processes are responsible for reconnection? What determines the rate of reconnection? In order to accomplish its goals the MMS spacecraft must probe both those regions in which the magnetic fields are very nearly antiparallel and regions where a significant guide field exists. From previous missions we know the approximate speeds with which reconnection layers move through space to be from tens to hundreds of km/s. For electron skin depths of 5 to 10 km, the full 3D electron population (10 eV to above 20 keV) has to be sampled at rates greater than 10/s. The MMS Fast-Plasma Instrument (FPI) will sample electrons at greater than 30/s. Because the ion skin depth is larger, FPI will make full ion measurements at rates of greater than 6/s. 3D E-field measurements will be made by MMS once every ms. MMS will use an Active Spacecraft Potential Control device (ASPOC), which emits indium ions to neutralize the photoelectron current and keep the spacecraft from charging to more than +4 V. Because ion dynamics in Hall reconnection depend sensitively on ion mass, MMS includes a new-generation Hot Plasma Composition Analyzer (HPCA) that corrects problems with high proton fluxes that have prevented accurate ion-composition measurements near the dayside magnetospheric boundary. Finally, Energetic Particle Detector (EPD) measurements of electrons and
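
    The cadence requirement quoted above follows from simple kinematics: a boundary layer of thickness L swept past the spacecraft at speed v is crossed in L/v seconds, so resolving it needs a sampling rate of at least v/L. A minimal sketch using the numbers from the abstract (the function name is illustrative):

    ```python
    def min_sample_rate_hz(boundary_speed_km_s: float, structure_scale_km: float) -> float:
        """Minimum sampling rate giving at least one sample per structure crossing."""
        return boundary_speed_km_s / structure_scale_km

    # Reconnection layer moving at 100 km/s across a 10 km electron skin depth:
    print(min_sample_rate_hz(100.0, 10.0))  # 10.0 samples/s -> the ">10/s" requirement
    ```

    FPI's >30 electron distributions per second therefore oversamples the fastest expected crossings, while the >6/s ion cadence is adequate for the larger ion skin depth.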

  14. The SCOPE Mission

    International Nuclear Information System (INIS)

    Fujimoto, M.; Tsuda, Y.; Saito, Y.; Shinohara, I.; Takashima, T.; Matsuoka, A.; Kojima, H.; Kasaba, Y.

    2009-01-01

In order to reach the new horizon of space physics research, the Plasma Universe, via in-situ measurements in the Earth's magnetosphere, SCOPE will perform formation-flying observations combined with high-time-resolution electron measurements. The simultaneous multi-scale observations by SCOPE of various plasma dynamical phenomena will enable data-based study of the key space plasma processes from the cross-scale coupling point of view. Key physical processes to be studied are magnetic reconnection under various boundary conditions, shocks in space plasma, collisionless plasma mixing at the boundaries, and the physics of current sheets embedded in complex magnetic geometries. The SCOPE formation is made up of 5 spacecraft and is put into an equatorial orbit with apogee at 30 Re (Re: Earth radius). One of the spacecraft is a large mother ship equipped with a full suite of particle detectors, including an ultra-high time resolution electron detector. Of the other 4 small spacecraft, one remains near (∼10 km) the mother ship, and this spacecraft pair will focus on electron-scale physics. The others, at distances of 100∼3000 km (electron∼ion spatial scales) from the mother ship, will monitor plasma dynamics surrounding the mother-daughter pair. There is lively ongoing discussion of Japan-Europe international collaboration (ESA's Cross-Scale), which would improve the coverage over the scales of interest and thus allow the goal of the mission, clarifying the multi-scale nature of the Plasma Universe, to be attained at an even higher level.

  15. Collaborative Mission Design at NASA Langley Research Center

    Science.gov (United States)

    Gough, Kerry M.; Allen, B. Danette; Amundsen, Ruth M.

    2005-01-01

    NASA Langley Research Center (LaRC) has developed and tested two facilities dedicated to increasing efficiency in key mission design processes, including payload design, mission planning, and implementation plan development, among others. The Integrated Design Center (IDC) is a state-of-the-art concurrent design facility which allows scientists and spaceflight engineers to produce project designs and mission plans in a real-time collaborative environment, using industry-standard physics-based development tools and the latest communication technology. The Mission Simulation Lab (MiSL), a virtual reality (VR) facility focused on payload and project design, permits engineers to quickly translate their design and modeling output into enhanced three-dimensional models and then examine them in a realistic full-scale virtual environment. The authors were responsible for envisioning both facilities and turning those visions into fully operational mission design resources at LaRC with multiple advanced capabilities and applications. In addition, the authors have created a synergistic interface between these two facilities. This combined functionality is the Interactive Design and Simulation Center (IDSC), a meta-facility which offers project teams a powerful array of highly advanced tools, permitting them to rapidly produce project designs while maintaining the integrity of the input from every discipline expert on the project. The concept-to-flight mission support provided by IDSC has shown improved inter- and intra-team communication and a reduction in the resources required for proposal development, requirements definition, and design effort.

  16. Executive Summary - Our mission

    International Nuclear Information System (INIS)

    2005-01-01

On September 1st, 2003, the Henryk Niewodniczanski Institute of Nuclear Physics in Cracow joined the Polish Academy of Sciences. The Polish Academy of Sciences (PAN), founded in 1952, is a state-sponsored scientific institution acting through an elected corporation of leading scholars, their research organizations and through numerous scientific establishments. PAN is a major national scientific advisory body acting via its scientific committees, which represent all disciplines of science. There are currently 79 PAN research establishments (institutes and research centers, research stations, botanical gardens and other research units) and a number of auxiliary scientific units (such as archives, libraries, museums, and PAN stations abroad). Our Institute is currently one of the largest research institutions of the Polish Academy of Sciences. The research activity of the Academy is financed mainly from the State budget via the Ministry of Scientific Research and Information Technology. The mission of the Institute of Nuclear Physics, IFJ, is stated in its Charter. According to Paragraphs 5, 6, and 7 of the 2004 Charter, the Institute's duty is to carry out research activities in the following areas: 1. High energy and elementary particle physics (including astrophysics), 2. Nuclear physics and physics of mechanisms of nuclear interaction, 3. Condensed matter physics, 4. Interdisciplinary research, and in particular: in radiation and environmental biology, environmental physics, medical physics, dosimetry, nuclear geophysics, radiochemistry and material engineering. The main tasks of the Institute are: 1. To perform research in the above disciplines, 2. To promote the development of scientists and of specialists qualified to carry out research in these disciplines, 3. To organize a Post-Doctoral Study Course, 4. To permit, through agreements with national and foreign research institutions, external scholars to train and gain academic qualifications in the Institute

  17. The Ulysses mission: An introduction

    International Nuclear Information System (INIS)

    Marsden, R.G.

    1996-01-01

    On 30 September 1995, Ulysses completed its initial, highly successful, survey of the polar regions of the heliosphere in both southern and northern hemispheres, thereby fulfilling its prime mission. The results obtained to date are leading to a revision of many earlier ideas concerning the solar wind and the heliosphere. Now embarking on the second phase of the mission, Ulysses will continue along its out-of-ecliptic flight path for another complete orbit of the Sun. In contrast to the high-latitude phase of the prime mission, which occurred near solar minimum, the next polar passes (in 2000 and 2001) will take place when the Sun is at its most active

  18. International partnership in lunar missions

    Indian Academy of Sciences (India)

    related to space science and Moon missions are being addressed in this conference. .... flight. The studies in India suggest that an 'aerobic' space transportation vehicle can indeed have a ... space from Earth at very, very low cost first before.

  19. Telepresence for Deep Space Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — Incorporating telepresence technologies into deep space mission operations can give the crew and ground personnel the impression that they are in a location at time...

  20. Mission Level Autonomy for USSV

    Science.gov (United States)

    Huntsberger, Terry; Stirb, Robert C.; Brizzolara, Robert

    2011-01-01

On-water demonstration of a wide range of mission-proven, advanced technologies at TRL 5+ that provide a total integrated, modular approach to effectively address the majority of the key needs for full mission-level autonomous, cross-platform control of USVs. A wide-baseline stereo system mounted on the ONR USSV was shown to be an effective sensing modality for tracking of dynamic contacts as a first step toward automated retrieval operations. The CASPER onboard planner/replanner successfully demonstrated real-time, on-water resource-based analysis for mission-level goal achievement and on-the-fly opportunistic replanning. Full mixed-mode autonomy was demonstrated on-water with a seamless transition between operator override and return to the current mission plan. Autonomous cooperative operations for fixed-asset protection and High Value Unit escort using 2 USVs (AMN1 & 14m RHIB) were demonstrated during Trident Warrior 2010 in June 2010.

  1. Green Propellant Infusion Mission Program

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission is architected as a collaboration of NASA, Industry, and Air Force partners with the objective to advance the technology for propulsion components using...

  2. Urinary albumin in space missions

    DEFF Research Database (Denmark)

    Cirillo, Massimo; De Santo, Natale G; Heer, Martina

    2002-01-01

Proteinuria has been hypothesized to occur during space missions, but research data are missing. Urinary albumin, as an index of proteinuria, was analyzed in frozen urine samples collected by astronauts during space missions onboard the MIR station and on the ground (control). Urinary albumin was measured by a double-antibody radioimmunoassay. On average, 24-h urinary albumin was 27.4% lower in space than on the ground; the difference was statistically significant. Low urinary albumin excretion could be another effect of exposure to weightlessness (microgravity).

  3. KEPLER Mission: development and overview

    International Nuclear Information System (INIS)

    Borucki, William J

    2016-01-01

    The Kepler Mission is a space observatory launched in 2009 by NASA to monitor 170 000 stars over a period of four years to determine the frequency of Earth-size and larger planets in and near the habitable zone of Sun-like stars, the size and orbital distributions of these planets, and the types of stars they orbit. Kepler is the tenth in the series of NASA Discovery Program missions that are competitively-selected, PI-directed, medium-cost missions. The Mission concept and various instrument prototypes were developed at the Ames Research Center over a period of 18 years starting in 1983. The development of techniques to do the 10 ppm photometry required for Mission success took years of experimentation, several workshops, and the exploration of many ‘blind alleys’ before the construction of the flight instrument. Beginning in 1992 at the start of the NASA Discovery Program, the Kepler Mission concept was proposed five times before its acceptance for mission development in 2001. During that period, the concept evolved from a photometer in an L2 orbit that monitored 6000 stars in a 50 sq deg field-of-view (FOV) to one that was in a heliocentric orbit that simultaneously monitored 170 000 stars with a 105 sq deg FOV. Analysis of the data to date has detected over 4600 planetary candidates which include several hundred Earth-size planetary candidates, over a thousand confirmed planets, and Earth-size planets in the habitable zone (HZ). These discoveries provide the information required for estimates of the frequency of planets in our galaxy. The Mission results show that most stars have planets, many of these planets are similar in size to the Earth, and that systems with several planets are common. Although planets in the HZ are common, many are substantially larger than Earth. (review article)
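The 10 ppm photometry requirement mentioned above can be motivated with one line of arithmetic: an Earth-size planet transiting a Sun-like star dims the star by the ratio of the disk areas, (R_planet / R_star)^2. A quick check using standard radii (my own illustration, not from the review article):

```python
# Why Kepler needed ~10 ppm photometry: the transit depth of an
# Earth-size planet across a Sun-like star is (R_planet / R_star)^2.
# Radii below are standard textbook values, not taken from the article.
R_EARTH_KM = 6_371.0   # mean Earth radius
R_SUN_KM = 696_000.0   # solar radius

depth = (R_EARTH_KM / R_SUN_KM) ** 2
print(f"{depth * 1e6:.0f} ppm")  # ~84 ppm dimming during transit
```

A signal of roughly 84 ppm is why photometric noise had to be pushed to the 10 ppm level before an Earth-analog detection was credible.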

  4. Executive Summary - Our mission

    International Nuclear Information System (INIS)

    2007-01-01

Full text: The Henryk Niewodniczanski Institute of Nuclear Physics (Instytut Fizyki Jadrowej im. Henryka Niewodniczanskiego, IFJ PAN) is currently the largest research institution of the Polish Academy of Sciences (Polska Akademia Nauk). The research activity of the Academy is financed mainly from the State budget via the Ministry of Science and Higher Education. The mission of IFJ PAN is stated in its Charter. According to Paragraphs 5, 6, and 7 of the 2004 Charter, the Institute's duty is to carry out research activities in the following areas: 1. High energy and elementary particle physics (including astrophysics), 2. Nuclear physics and strong interaction, 3. Condensed matter physics, 4. Interdisciplinary research, in particular: in radiation and environmental biology, environmental physics, medical physics, dosimetry, nuclear geophysics, radiochemistry and material engineering. The main tasks of the Institute are: 1. To perform research in the above disciplines, 2. To promote the development of scientists and of specialists qualified to carry out research in these disciplines, 3. To organize a Post-Graduate Study Course, 4. To permit, through agreements with national and foreign research institutions, external scholars to train and gain academic qualifications in the Institute's laboratories, 5. To collaborate with national and local authorities in providing them with expertise in the Institute's research topics, especially concerning radiation protection. These tasks are fulfilled by: 1. Performing individual and coordinated research through individual and collective research grant projects, 2. Initiating and maintaining cooperation with laboratories, organizations and institutions performing similar activities, in Poland and abroad, 3. Conferring scientific degrees and titles, 4. Distributing research results obtained through peer-reviewed publications and other public media, 5. Organizing scientific meetings, conferences, symposia, training workshops, etc.

  5. Psychosocial interactions during ISS missions

    Science.gov (United States)

    Kanas, N. A.; Salnitskiy, V. P.; Ritsher, J. B.; Gushin, V. I.; Weiss, D. S.; Saylor, S. A.; Kozerenko, O. P.; Marmar, C. R.

    2007-02-01

    Based on anecdotal reports from astronauts and cosmonauts, studies of space analog environments on Earth, and our previous research on the Mir Space Station, a number of psychosocial issues have been identified that can lead to problems during long-duration space expeditions. Several of these issues were studied during a series of missions to the International Space Station. Using a mood and group climate questionnaire that was completed weekly by crewmembers in space and personnel in mission control, we found no evidence to support the presence of predicted decrements in well-being during the second half or in any specific quarter of the missions. The results did support the predicted displacement of negative feelings to outside supervisors among both crew and ground subjects. There were several significant differences in mood and group perceptions between Americans and Russians and between crewmembers and mission control personnel. Crewmembers related cohesion to the support role of their leader, and mission control personnel related cohesion to both the task and support roles of their leader. These findings are discussed with reference to future space missions.

  6. NASA COAST and OCEANIA Airborne Missions Support Ecosystem and Water Quality Research in the Coastal Zone

    Science.gov (United States)

    Guild, Liane; Kudela, Raphael; Hooker, Stanford; Morrow, John; Russell, Philip; Palacios, Sherry; Livingston, John M.; Negrey, Kendra; Torres-Perez, Juan; Broughton, Jennifer

    2014-01-01

    NASA has a continuing requirement to collect high-quality in situ data for the vicarious calibration of current and next generation ocean color satellite sensors and to validate the algorithms that use the remotely sensed observations. Recent NASA airborne missions over Monterey Bay, CA, have demonstrated novel above- and in-water measurement capabilities supporting a combined airborne sensor approach (imaging spectrometer, microradiometers, and a sun photometer). The results characterize coastal atmospheric and aquatic properties through an end-to-end assessment of image acquisition, atmospheric correction, algorithm application, plus sea-truth observations from state-of-the-art instrument systems. The primary goal is to demonstrate the following in support of calibration and validation exercises for satellite coastal ocean color products: 1) the utility of a multi-sensor airborne instrument suite to assess the bio-optical properties of coastal California, including water quality; and 2) the importance of contemporaneous atmospheric measurements to improve atmospheric correction in the coastal zone. The imaging spectrometer (Headwall) is optimized in the blue spectral domain to emphasize remote sensing of marine and freshwater ecosystems. The novel airborne instrument, Coastal Airborne In-situ Radiometers (C-AIR) provides measurements of apparent optical properties with high dynamic range and fidelity for deriving exact water leaving radiances at the land-ocean boundary, including radiometrically shallow aquatic ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data are accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). 
Flight operations are presented for the instrument payloads using the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter flown over Monterey Bay during the seasonal fall algal bloom in 2011 (COAST) and 2013 (OCEANIA) to support bio-optical measurements of

  7. The Mission Accessibility of Near-Earth Asteroids

    Science.gov (United States)

    Barbee, Brent W.; Abell, P. A.; Adamo, D. R.; Mazanek, D. D.; Johnson, L. N.; Yeomans, D. K.; Chodas, P. W.; Chamberlin, A. B.; Benner, L. A. M.; Taylor, P.; hide

    2015-01-01

    The population of near-Earth asteroids (NEAs) that may be accessible for human space flight missions is defined by the Near-Earth Object Human Space Flight Accessible Targets Study (NHATS). The NHATS is an automated system designed to monitor the accessibility of, and particular mission opportunities offered by, the NEA population. This is analogous to systems that automatically monitor the impact risk posed to Earth by the NEA population. The NHATS system identifies NEAs that are potentially accessible for future round-trip human space flight missions and provides rapid notification to asteroid observers so that crucial follow-up observations can be obtained following discovery of accessible NEAs. The NHATS was developed in 2010 and was automated by early 2012. NHATS data are provided via an interactive web-site, and daily NHATS notification emails are transmitted to a mailing list; both resources are available to the public.

  8. Low Thrust Trajectory Design for GSFC Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — The Evolutionary Mission Trajectory Generator (EMTG) is a global trajectory optimization tool. EMTG is intended for use in designing interplanetary missions which...

  9. A review of Spacelab mission management approach

    Science.gov (United States)

    Craft, H. G., Jr.

    1979-01-01

The Spacelab development program is a joint undertaking of NASA and ESA. The paper addresses the initial concept of Spacelab payload mission management, the lessons learned, and the modifications made as a result of the actual implementation of Spacelab Mission 1. The discussion covers mission management responsibilities, program control, science management, payload definition and interfaces, integrated payload mission planning, integration requirements, payload specialist training, payload and launch site integration, payload flight/mission operations, and postmission activities. After 3.5 years, the outlined overall mission manager approach has proven to be most successful. The approach does allow the mission manager to maintain the lowest overall mission cost.

  10. Interactive Dynamic Mission Scheduling for ASCA

    Science.gov (United States)

    Antunes, A.; Nagase, F.; Isobe, T.

The Japanese X-ray astronomy satellite ASCA (Advanced Satellite for Cosmology and Astrophysics) mission requires scheduling for each 6-month observation phase, further broken down into weekly schedules at a resolution of a few minutes. Two tools, SPIKE and NEEDLE, written in Lisp and C, use artificial intelligence (AI) techniques combined with a graphical user interface for fast creation and alteration of mission schedules. These programs consider viewing and satellite attitude constraints as well as observer-requested criteria and present an optimized set of solutions for review by the planner. Six-month schedules at 1-day resolution are created for an oversubscribed set of targets by the SPIKE software, originally written for HST and presently being adapted for EUVE, XTE and AXAF. The NEEDLE code creates weekly schedules at 1-min resolution using in-house orbital routines and creates output for processing by the command-generation software. Schedule creation on both the long- and short-term scales is rapid: less than 1 day for long-term and one hour for short-term.
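The long-term planning problem described here, assigning an oversubscribed target list to days under visibility constraints, can be sketched with a toy greedy pass. This is purely illustrative: the Target fields and the greedy rule below are my own simplification, and SPIKE/NEEDLE's actual constraint-propagation and optimization techniques are far richer.

```python
# Toy sketch of constraint-filtered target scheduling in the spirit of the
# long-term planning described above. Illustrative only: the data model and
# greedy rule are assumptions, not the SPIKE/NEEDLE algorithms.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    priority: int           # observer-assigned priority (higher = better)
    visible_days: set[int]  # days of the planning period the target is viewable

def schedule(targets: list[Target], days: int) -> dict[int, str]:
    """Greedy long-term pass: on each day, observe the highest-priority
    still-unscheduled target whose visibility constraint is satisfied."""
    plan: dict[int, str] = {}
    remaining = sorted(targets, key=lambda t: -t.priority)
    for day in range(days):
        for t in remaining:
            if day in t.visible_days:
                plan[day] = t.name
                remaining.remove(t)
                break
    return plan

plan = schedule(
    [Target("A", 3, {0, 1}), Target("B", 2, {1}), Target("C", 1, {0, 2})],
    days=3,
)
print(plan)  # {0: 'A', 1: 'B', 2: 'C'}
```

A real scheduler would also score attitude constraints and allow the planner to interactively accept or reject each placement, as the abstract describes.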

  11. The Asteroid Redirect Mission (ARM)

    Science.gov (United States)

    Abell, Paul; Gates, Michele; Johnson, Lindley; Chodas, Paul; Mazanek, Dan; Reeves, David; Ticker, Ronald

    2016-07-01

    To achieve its long-term goal of sending humans to Mars, the National Aeronautics and Space Administration (NASA) plans to proceed in a series of incrementally more complex human spaceflight missions. Today, human flight experience extends only to Low-Earth Orbit (LEO), and should problems arise during a mission, the crew can return to Earth in a matter of minutes to hours. The next logical step for human spaceflight is to gain flight experience in the vicinity of the Moon. These cis-lunar missions provide a "proving ground" for the testing of systems and operations while still accommodating an emergency return path to the Earth that would last only several days. Cis-lunar mission experience will be essential for more ambitious human missions beyond the Earth-Moon system, which will require weeks, months, or even years of transit time. In addition, NASA has been given a Grand Challenge to find all asteroid threats to human populations and know what to do about them. Obtaining knowledge of asteroid physical properties combined with performing technology demonstrations for planetary defense provide much needed information to address the issue of future asteroid impacts on Earth. Hence the combined objectives of human exploration and planetary defense give a rationale for the Asteroid Re-direct Mission (ARM). Mission Description: NASA's ARM consists of two mission segments: 1) the Asteroid Redirect Robotic Mission (ARRM), the first robotic mission to visit a large (greater than ~100 m diameter) near-Earth asteroid (NEA), collect a multi-ton boulder from its surface along with regolith samples, demonstrate a planetary defense technique, and return the asteroidal material to a stable orbit around the Moon; and 2) the Asteroid Redirect Crewed Mission (ARCM), in which astronauts will take the Orion capsule to rendezvous and dock with the robotic vehicle, conduct multiple extravehicular activities to explore the boulder, and return to Earth with samples. NASA's proposed

  12. Performance of the end-to-end test for the characterization of a simulator in stereotaxic corporal radiotherapy of liver; Realização do teste end-to-end para a caracterização de um simulador em radioterapia estereotáxica corpórea de fígado

    Energy Technology Data Exchange (ETDEWEB)

    Burgos, A.F.; Paiva, E. de, E-mail: adamfburgos@gmail.com [Instituto de Radioproteção e Dosimetria (IRD/CNEN), Rio de Janeiro-RJ (Brazil). Div. de Física Médica; Silva, L.P. da [Instituto Nacional de Câncer (MS/INCA), Rio de Janeiro-RJ (Brazil). Dept. de Física Médica

    2017-07-01

Currently, one of the alternatives for radiotherapy of the liver is stereotactic body radiotherapy (SBRT), which delivers high doses in a few fractions and has a good prognosis. However, in order to ensure that the high dose delivered to the target is the same as planned, a verification test of the full process (image acquisition, planning, scheduling, and dose delivery) should be performed. For this purpose, the objective of this work was to develop a water-density simulator that takes into account the relative positions of the liver and the organs at risk involved in this treatment, evaluating the influence of target movement due to the respiratory process on the dose value, as well as on positions related to the organs at risk.

  13. Global Precipitation Measurement Mission: Architecture and Mission Concept

    Science.gov (United States)

    Bundas, David

    2005-01-01

    The Global Precipitation Measurement (GPM) Mission is a collaboration between the National Aeronautics and Space Administration (NASA) and the Japanese Aerospace Exploration Agency (JAXA), and other partners, with the goal of monitoring the diurnal and seasonal variations in precipitation over the surface of the earth. These measurements will be used to improve current climate models and weather forecasting, and enable improved storm and flood warnings. This paper gives an overview of the mission architecture and addresses some of the key trades that have been completed, including the selection of the Core Observatory s orbit, orbit maintenance trades, and design issues related to meeting orbital debris requirements.

  14. Vital role of nuclear data in space missions

    International Nuclear Information System (INIS)

    Tripathi, R.K.

    2008-01-01

NASA has a new vision for space exploration in the 21st century encompassing a broad range of human and robotic missions, including missions to the Moon, Mars and beyond. Exposure to the hazards of severe space radiation in deep-space, long-duration missions is a critical design driver. Thus, protection from the hazards of severe space radiation is of paramount importance for the new vision. Accurate risk assessments critically depend on the accuracy of the input information about the interaction of ions with materials, electronics and tissues. We have discussed some of the state-of-the-art cross-section databases at NASA and have demonstrated the role nuclear interactions play in space missions. The impact of the cross sections on space missions has been shown by the assessment of dose exposure on the Moon's surface behind a number of materials with increasing hydrogen content, which is known to make for better radiation shielding. In addition, we have examined an approach to introduce reliability-based design methods into the shield evaluation and optimization procedure as a means to assess and control the uncertainties in shield design. Applications to lunar missions of short and long duration display a large impact on the design outcome and the choice of materials. For short-duration missions, all the examined materials have similar performance. However, for career astronauts who are exposed to space radiation over longer durations, the choice of material plays a very critical role. Computational procedures based on deterministic solution of the Boltzmann equation are well suited for such procedures, allowing optimization processes to be implemented, evaluation of biologically important rare events, and rapid analysis of possible shield optimization outcomes resulting from the biological-model uncertainty parameter space

  15. Resumes of the Bird mission

    Science.gov (United States)

    Lorenz, E.; Borwald, W.; Briess, K.; Kayal, H.; Schneller, M.; Wuensten, Herbert

    2004-11-01

The DLR micro satellite BIRD (Bi-spectral Infra-Red Detection) was piggyback launched with the Indian Polar Satellite Launch Vehicle PSLV-C3 into a 570 km circular sun-synchronous orbit on 22 October 2001. The BIRD mission, fully funded by the DLR, answers topical technological and scientific questions related to the operation of a compact infrared push-broom sensor system on board a micro satellite and demonstrates new spacecraft bus technologies. BIRD mission control is conducted by DLR/GSOC in Oberpfaffenhofen. Commanding, data reception and data processing are performed via ground stations in Weilheim and Neustrelitz (Germany). The BIRD mission is a demonstrator for small-satellite projects dedicated to hazard detection and monitoring. In 2003, BIRD was used in the ESA project FUEGOSAT to demonstrate the utilisation of innovative space technologies for fire-risk management.

  16. 308 Building deactivation mission analysis report

    International Nuclear Information System (INIS)

    Lund, D.P.

    1995-01-01

This report presents the results of the 308 Building (Fuels Development Laboratory) Deactivation Project mission analysis. Hanford systems engineering (SE) procedures call for a mission analysis, which is an important first step in the SE process. The functions and requirements needed to successfully accomplish this mission, along with the selected alternatives and products, will be defined later using the SE process

  17. 309 Building deactivation mission analysis report

    International Nuclear Information System (INIS)

    Lund, D.P.

    1995-01-01

This report presents the results of the 309 Building (Plutonium Fuels Utilization Program) Deactivation Project mission analysis. Hanford systems engineering (SE) procedures call for a mission analysis, which is an important first step in the SE process. The functions and requirements needed to successfully accomplish this mission, along with the selected alternatives and products, will be defined later using the SE process

  18. MIOSAT Mission Scenario and Design

    Science.gov (United States)

    Agostara, C.; Dionisio, C.; Sgroi, G.; di Salvo, A.

    2008-08-01

MIOSAT ("Missione Ottica su microSATellite") is a low-cost technological/scientific microsatellite mission for Earth observation, funded by the Italian Space Agency (ASI) and managed by a Group Agreement between Rheinmetall Italia - B.U. Spazio - Contraves as leader and Carlo Gavazzi Space as satellite manufacturer. Several other Italian companies, SMEs and universities are involved in the development team with crucial roles. MIOSAT is a microsatellite weighing around 120 kg, placed in a 525 km altitude sun-synchronous circular LEO orbit. The microsatellite embarks three innovative optical payloads: a Sagnac multi-spectral radiometer (IFAC-CNR), a Mach-Zehnder spectrometer (IMM-CNR), and a high-resolution panchromatic camera (Selex Galileo). In addition, three technological experiments will be tested in flight. The first is a heat pipe based on the Marangoni effect with high efficiency. The second is a high-accuracy Sun sensor using COTS components, and the last is a GNSS SW receiver that utilizes a Leon2 processor. Finally, a new generation of 28% efficiency solar cells will be adopted for power generation. The platform is highly agile and can tilt along and across the flight direction. The pointing accuracy is on the order of 0.1° for each axis. The pointing determination during image acquisition is <0.02° for the axis normal to the boresight and 0.04° for the boresight. This paper deals with the MIOSAT mission scenario and definition, highlighting trade-offs for mission implementation. The MIOSAT mission design has been constrained by challenging requirements in terms of satellite mass, mission lifetime and instrument performance, which have implied the utilization of the satellite's agility to improve instrument performance in terms of S/N and resolution. The instruments provide complementary measurements that can be combined in effective ways to exploit new applications in the fields of atmosphere composition analysis, Earth emissions, anthropic phenomena, etc. The Mission
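For context on what a 525 km circular orbit implies for revisit and imaging opportunities, the orbital period follows directly from Kepler's third law, T = 2π√(a³/μ). A quick check with standard Earth constants (my own illustration, not from the paper):

```python
# Orbital period of MIOSAT's 525 km circular orbit via Kepler's third law,
# T = 2*pi*sqrt(a^3 / mu). Constants are standard values, not from the paper.
import math

MU_EARTH = 398_600.4418  # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371.0        # km, mean Earth radius

a = R_EARTH + 525.0      # semi-major axis of a circular 525 km orbit
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
print(f"{period_s / 60:.1f} min")  # ~95 min, i.e. roughly 15 orbits per day
```

A ~95-minute sun-synchronous orbit is the standard regime for Earth-observation microsatellites of this class.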

  19. The inner magnetosphere imager mission

    International Nuclear Information System (INIS)

    Johnson, L.; Herrmann, M.

    1993-01-01

After 30 years of in situ measurements of the Earth's magnetosphere, scientists have assembled an incomplete picture of its global composition and dynamics. Imaging the magnetosphere from space will enable scientists to better understand the global shape of the inner magnetosphere, its components and processes. The proposed inner magnetosphere imager (IMI) mission will obtain the first simultaneous images of the component regions of the inner magnetosphere and will enable scientists to relate these global images to internal and external influences as well as local observations. To obtain simultaneous images of component regions of the inner magnetosphere, measurements will comprise: the ring current and inner plasma sheet using energetic neutral atoms; the plasmasphere using extreme ultraviolet; the electron and proton auroras using far ultraviolet (FUV) and x rays; and the geocorona using FUV. The George C. Marshall Space Flight Center (MSFC) is performing a concept definition study of the proposed mission. NASA's Office of Space Science and Applications has placed the IMI third in its queue of intermediate-class missions for launch in the 1990's. An instrument complement of approximately seven imagers will fly in an elliptical Earth orbit with a seven Earth radii (RE) altitude apogee and an approximately 4,800-km altitude perigee. Several spacecraft concepts were examined for the mission. The first concept utilizes a spinning spacecraft with a despun platform. The second concept splits the instruments onto a spin-stabilized spacecraft and a complementary three-axis stabilized spacecraft. Launch options being assessed for the spacecraft range from a Delta II for the single and dual spacecraft concepts to dual Taurus launches for the two smaller spacecraft. This paper will address the mission objectives, the spacecraft design considerations, the results of the MSFC concept definition study, and future mission plans

  20. Cyberinfrastructure for Aircraft Mission Support

    Science.gov (United States)

    Freudinger, Lawrence C.

    2010-01-01

For the last several years, NASA's Airborne Science Program has been developing and using infrastructure and applications that enable researchers to interact with each other and with airborne instruments via network communications. Use of these tools has increased near-real-time situational awareness during field operations, resulting in productivity improvements, improved decision making, and the collection of better data. Advances in pre-mission planning and post-mission access have also emerged. Integrating these capabilities with other tools to evolve a coherent service-oriented enterprise architecture for aircraft flight and test operations is the subject of ongoing efforts.