WorldWideScience

Sample records for end-to-end mission simulator

  1. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF), a model-based software framework that shall enable seamless continuity of mission design and...

  2. End-to-end plasma bubble PIC simulations on GPUs

    Science.gov (United States)

    Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava

    2017-10-01

    Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.
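
    As an illustration of the kind of per-particle kernel that PIC codes offload to GPUs, here is a minimal CPU sketch of the standard Boris particle push in NumPy. It is not PSC's implementation or API; field gathering, current deposition and the GPU offload itself are omitted.

        import numpy as np

        def boris_push(x, v, E, B, q, m, dt):
            """Advance positions x and velocities v (arrays of shape (N, 3)) by one
            time step dt in electric field E and magnetic field B (also (N, 3))."""
            qmdt2 = q * dt / (2.0 * m)
            v_minus = v + qmdt2 * E                                   # first half electric kick
            t = qmdt2 * B                                             # rotation vector
            s = 2.0 * t / (1.0 + np.sum(t * t, axis=1, keepdims=True))
            v_prime = v_minus + np.cross(v_minus, t)                  # magnetic rotation, part 1
            v_plus = v_minus + np.cross(v_prime, s)                   # magnetic rotation, part 2
            v_new = v_plus + qmdt2 * E                                # second half electric kick
            return x + v_new * dt, v_new

        # Example: 1000 electrons in uniform E and B fields (SI units).
        rng = np.random.default_rng(0)
        x = rng.normal(size=(1000, 3))
        v = rng.normal(scale=1e5, size=(1000, 3))
        E = np.tile([0.0, 0.0, 1.0e3], (1000, 1))
        B = np.tile([0.0, 0.0, 1.0e-2], (1000, 1))
        x, v = boris_push(x, v, E, B, q=-1.602e-19, m=9.109e-31, dt=1e-12)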

  3. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    Swarm, a satellite constellation to measure Earth's magnetic field with unprecedented accuracy, has been selected by ESA for launch in 2009. The mission will provide the best-ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system...... to the science objectives of Swarm. In order to use realistic parameters of the Earth's environment, the mission simulation starts on January 1, 1997 and lasts until re-entry of the lower satellites five years later. Synthetic magnetic field values were generated for all relevant contributions...

  4. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    Science.gov (United States)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DoD standards such as Distributed Interactive Simulation (DIS). This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.
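
    As a flavor of what a communication-link-analysis subsystem model computes, the sketch below evaluates a first-order downlink budget using generic textbook formulas; it is illustrative only, it is not drawn from the AFRL/DASL models, and all numbers are made up.

        import math

        def free_space_path_loss_db(distance_km, freq_mhz):
            # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.45
            return 20.0 * math.log10(distance_km) + 20.0 * math.log10(freq_mhz) + 32.45

        def cn0_dbhz(eirp_dbw, gt_dbk, distance_km, freq_mhz, misc_losses_db=2.0):
            # C/N0 = EIRP + G/T - FSPL - losses + 228.6 (Boltzmann constant term, dBW/K/Hz)
            return (eirp_dbw + gt_dbk - free_space_path_loss_db(distance_km, freq_mhz)
                    - misc_losses_db + 228.6)

        # Example: S-band downlink at 2250 MHz over a 2000 km slant range,
        # 10 dBW satellite EIRP into a ground station with G/T = 20 dB/K.
        print(round(cn0_dbhz(10.0, 20.0, 2000.0, 2250.0), 1), "dB-Hz")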

  5. End-to-end System Performance Simulation: A Data-Centric Approach

    Science.gov (United States)

    Guillaume, Arnaud; Laffitte de Petit, Jean-Luc; Auberger, Xavier

    2013-08-01

    In the early days of the space industry, the feasibility of Earth observation missions was directly driven by what could be achieved by the satellite. It was clear to everyone that the ground segment would be able to deal with the small amount of data sent by the payload. Over the years, the amount of data processed by spacecraft has increased drastically, placing more and more constraints on ground segment performance, in particular on timeliness. Nowadays, many space systems require high data throughputs and short response times, with information coming from multiple sources and involving complex algorithms. It has become necessary to perform thorough end-to-end analyses of the full system in order to optimise its cost and efficiency, and sometimes even to assess the feasibility of the mission. This paper presents a novel framework developed by Astrium Satellites to meet these needs of timeliness evaluation and optimisation. This framework, named ETOS (for “End-to-end Timeliness Optimisation of Space systems”), provides a modelling process with associated tools, models and GUIs. These are integrated thanks to a common data model and suitable adapters, with the aim of building space system simulators of the full end-to-end chain. A big challenge of such an environment is to integrate heterogeneous tools (each one being well-adapted to part of the chain) into a relevant timeliness simulation.
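
    To make the "common data model plus adapters" idea concrete, here is a hypothetical sketch (all names invented, not the ETOS design): each tool-specific adapter maps its native output onto a shared product-event record, so a timeliness simulator can chain heterogeneous tools end to end.

        from dataclasses import dataclass

        @dataclass
        class ProductEvent:
            """Common data model record: a data product and when it became available."""
            product_id: str
            available_at_s: float   # seconds since scenario start

        class GroundProcessingAdapter:
            """Wraps one tool of the chain and expresses its result in the common model."""
            def __init__(self, processing_delay_s: float):
                self.processing_delay_s = processing_delay_s

            def to_common_model(self, upstream: ProductEvent) -> ProductEvent:
                return ProductEvent(upstream.product_id + "/L1",
                                    upstream.available_at_s + self.processing_delay_s)

        downlinked = ProductEvent("scene-042/L0", available_at_s=5400.0)
        level1 = GroundProcessingAdapter(processing_delay_s=900.0).to_common_model(downlinked)
        print(level1)   # timeliness of the end product = available_at_s of the last link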

  6. End-to-end simulation: The front end

    International Nuclear Information System (INIS)

    Haber, I.; Bieniosek, F.M.; Celata, C.M.; Friedman, A.; Grote, D.P.; Henestroza, E.; Vay, J.-L.; Bernal, S.; Kishek, R.A.; O'Shea, P.G.; Reiser, M.; Herrmannsfeldt, W.B.

    2002-01-01

    For the intense beams in heavy ion fusion accelerators, details of the beam distribution as it emerges from the source region can determine the beam behavior well downstream. This occurs because collective space-charge modes excited as the beam is born remain undamped for many focusing periods. Traditional studies of the source region in particle beam systems have emphasized the behavior of averaged beam characteristics, such as total current, rms beam size, or emittance, rather than the details of the full beam distribution function that are necessary to predict the excitation of these modes. Simulations of the beam in the source region and comparisons to experimental measurements at LBNL and the University of Maryland are presented to illustrate some of the complexity in beam characteristics that has been uncovered as increased attention has been devoted to developing a detailed understanding of the source region. Also discussed are methods of using the simulations to infer characteristics of the beam distribution that can be difficult to measure directly

  7. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    Science.gov (United States)

    Lehner, B. A. E.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-11-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an exploration mission in cis-lunar space with a focus on human-robot interactions, called Human Assisted Robotic Vehicle Studies (HARVeSt). HARVeSt will build on knowledge of plant growth in space gained from experiments on-board the ISS and test the first growth of plants on the Moon. A planned deep space habitat will be utilised as the base of operations for human-robotic elements of the mission. The mission will serve as a technology demonstrator not only for autonomous tele-operations in cis-lunar space but also for key enabling technologies for future human surface missions. This mission will build on the ISS's successful approach to international cooperation. Mission assets such as a modular rover will allow for an extendable mission and enable scouting and preparing the area for the start of an international Moon Village.

  8. End-to-End Trade-space Analysis for Designing Constellation Missions

    Science.gov (United States)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

    cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.

  9. End-to-End Performance of the future MOMA instrument aboard the ExoMars mission

    Science.gov (United States)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Danell, R.; van Amerom, F. H. W.; Freissinet, C.; Glavin, D. P.; Stalport, F.; Arevalo, R. D., Jr.; Coll, P. J.; Steininger, H.; Raulin, F.; Goesmann, F.; Mahaffy, P. R.; Brinckerhoff, W. B.

    2016-12-01

    After the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the future ExoMars mission will continue the search for organic compounds at the Mars surface, with the advantage that samples will be extracted from as deep as 2 meters below the martian surface to minimize the effects of radiation and oxidation on organic materials. To analyse the wide range of organic compounds (volatile and non-volatile) in the martian soil, MOMA combines UV laser desorption/ionization (LDI) with pyrolysis gas chromatography ion trap mass spectrometry (pyr-GC-ITMS). To analyse refractory organic compounds and chirality, samples undergoing GC-ITMS analysis may be submitted to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). To optimize and test the performance of the GC-ITMS instrument we have performed several coupling test campaigns between the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA, GSFC). The last campaign was carried out with the ITU model, which is similar to the flight model and which includes the oven and the taping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J. Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J. Chrom. A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459. Acknowledgements: Funding provided by the Mars Exploration Program (point of contact, George Tahu, NASA/HQ). MOMA is a collaboration between NASA and ESA (PI Goesmann, MPS). The MOMA-GC team acknowledges support from the French Space Agency (CNES), the French National Programme of Planetology (PNP), the French National Centre for Scientific Research (CNRS), and the Pierre Simon Laplace Institute.

  10. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.; Bailey, Ian [Lancaster U.; Herrod, Alexander [Liverpool U.; Morgan, James [Fermilab; Morse, William [RIKEN BNL; Stratakis, Diktys [RIKEN BNL; Tishchenko, Vladimir [RIKEN BNL; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.
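
    For context only (standard muon g-2 physics, not a result of this record): the storage ring operates near the "magic" momentum at which electrostatic focusing does not perturb the anomalous spin precession, p_magic = m_mu c / sqrt(a_mu), roughly 3.09 GeV/c.

        import math

        M_MU_MEV = 105.6583745    # muon mass energy equivalent, MeV
        A_MU = 1.16592e-3         # approximate muon anomalous magnetic moment

        p_magic_gev = M_MU_MEV / math.sqrt(A_MU) / 1000.0
        print(f"magic momentum ~ {p_magic_gev:.3f} GeV/c")   # ~3.094 GeV/c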

  11. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
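
    A hypothetical sketch of the pattern described above, a data model of building entities and relationships realized as a table schema through DDL; table and column names are invented for illustration and are not taken from the patent.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE building_element (
            id   INTEGER PRIMARY KEY,
            type TEXT NOT NULL,      -- e.g. wall, window, HVAC zone
            name TEXT
        );
        CREATE TABLE element_relationship (
            parent_id INTEGER REFERENCES building_element(id),
            child_id  INTEGER REFERENCES building_element(id),
            relation  TEXT NOT NULL  -- e.g. contains, adjacent_to
        );
        """)
        conn.execute("INSERT INTO building_element VALUES (1, 'zone', 'Lobby')")
        conn.execute("INSERT INTO building_element VALUES (2, 'wall', 'Lobby-N')")
        conn.execute("INSERT INTO element_relationship VALUES (1, 2, 'contains')")
        print(conn.execute("SELECT COUNT(*) FROM element_relationship").fetchone()[0])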

  12. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    International Nuclear Information System (INIS)

    Sun, Jidi; Menk, Fred; Lambert, Jonathan; Martin, Jarad; Denham, James W; Greer, Peter B; Dowling, Jason; Rivest-Henault, David; Pichler, Peter; Parker, Joel; Arm, Jameen; Best, Leah

    2015-01-01

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning and use it to test geometric and dosimetric aspects of MRI simulation including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid filled pelvic shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold-standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor’s 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT based process. MRI simulation was found to be geometrically accurate with organ dimensions, dose distributions and DRR based setup within acceptable limits compared to CT. (paper)

  13. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan

    Data.gov (United States)

    National Aeronautics and Space Administration — The original proposal was to develop an ADR mission strategic plan. However, the task was picked up by the OCT. Subsequently the award was de-scoped to $30K to...

  14. End-to-end simulations and planning of a small space telescope: Galaxy Evolution Spectroscopic Explorer: a case study

    Science.gov (United States)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd

    2016-08-01

    Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than operation of large telescopes driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target, while minimizing operation costs including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  15. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    International Nuclear Information System (INIS)

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-01-01

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg² (or 6-65 deg²) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies that would

  16. End-to-end simulation of the C-ADS injector II with a 3-D field map

    International Nuclear Information System (INIS)

    Wang Zhijun; He Yuan; Li Chao; Wang Wangsheng; Liu Shuhui; Jia Huan; Xu Xianbo; Chen Ximeng

    2013-01-01

    The Injector II, one of the two parallel injectors of the high-current superconducting proton driver linac for the China Accelerator-Driven System (C-ADS) project, is being designed and constructed by the Institute of Modern Physics. At present, the design work for the injector is almost finished. End-to-end simulation has been carried out using the TRACK multiparticle simulation code to check the match between each acceleration section and the performance of the injector as a whole. Moreover, multiparticle simulations with all kinds of errors and misalignments have been performed to define the requirements of each device. The simulation results indicate that the lattice design is robust. In this paper, the results of end-to-end simulation and error simulation with a 3-D field map are presented. (authors)

  17. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  18. Modeling and Simulation of Satellite Subsystems for End-to-End Spacecraft Modeling

    National Research Council Canada - National Science Library

    Schum, William K; Doolittle, Christina M; Boyarko, George A

    2006-01-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems...

  19. End-to-end simulation of a visible 1 kW FEL

    International Nuclear Information System (INIS)

    Parazzoli, Claudio G.; Koltenbah, Benjamin E.C.

    2000-01-01

    In this paper we present the complete numerical simulation of the 1 kW visible Free Electron Laser under construction in Seattle. We show that the goal of producing 1.0 kW at 0.7 μm is well within the hardware capabilities. We simulate in detail the evolution of the electron bunch phase space in the entire e-beam line. The e-beam line includes the photo-injector cavities, the 433.33 MHz accelerator, the magnetic buncher, the 1300 MHz accelerator, the 180 deg. bend and the matching optics into the wiggler. The computed phase space is input for a three-dimensional time-dependent code that predicts the FEL performance. All the computations are based on state of the art software, and the limitations of the current software are discussed. We believe that this is the first time that such a thorough numerical simulation has been carried out and that such a realistic electron phase space has been used in FEL performance calculations

  20. Crosstalk in an FDM Laboratory Setup and the Athena X-IFU End-to-End Simulator

    Science.gov (United States)

    den Hartog, R.; Kirsch, C.; de Vries, C.; Akamatsu, H.; Dauser, T.; Peille, P.; Cucchetti, E.; Jackson, B.; Bandler, S.; Smith, S.; Wilms, J.

    2018-04-01

    The impact of various crosstalk mechanisms on the performance of the Athena X-IFU instrument has been assessed with detailed end-to-end simulations. For the crosstalk in the electrical circuit, a detailed model has been developed. In this contribution, we test this model against measurements made with an FDM laboratory setup and discuss the assumption of deterministic crosstalk in the context of the weak link effect in the detectors. We conclude that crosstalk levels predicted by the model are conservative with respect to the observed levels.

  1. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system, from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered-data. SEEK processes the time-ordered-data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published
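
    A conceptual sketch of the forward-model-then-clean idea (plain NumPy, not the HIDE or SEEK API): simulate time-ordered data containing RFI spikes, then flag outliers with a simple median-absolute-deviation cut of the kind an RFI-mitigation step might apply.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000
        sky = 1.0 + 0.1 * np.sin(np.linspace(0.0, 20.0, n))     # smooth sky signal + drift
        tod = sky + rng.normal(scale=0.02, size=n)              # radiometer noise
        rfi_idx = rng.choice(n, size=50, replace=False)
        tod[rfi_idx] += rng.uniform(1.0, 5.0, size=50)          # strong RFI spikes

        resid = tod - np.median(tod)
        mad = np.median(np.abs(resid - np.median(resid)))
        flags = np.abs(resid) > 5.0 * 1.4826 * mad              # ~5-sigma cut via the MAD
        print(f"flagged {flags.sum()} of {n} samples as RFI")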

  2. End to End Travel

    Data.gov (United States)

    US Agency for International Development — E2 Solutions is a web based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  3. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    Science.gov (United States)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel efficient than an all chemical propulsion architecture without significant increases to trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture as well as the result of the higher fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  4. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot

  5. Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs

    Science.gov (United States)

    Gratadour, Damien

    2011-09-01

    Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system and in quasi-real-time (up to 70 Hz) for ELT-sized systems on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of an SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on-the-fly, giving the ability to simulate hours of observations without the need to load extremely large phase screens into the global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
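
    As a minimal illustration of the centroiding step performed on each Shack-Hartmann sub-aperture image (plain NumPy on the CPU; the cited code runs this, among other algorithms, on the GPU):

        import numpy as np

        def center_of_gravity(img):
            """Return the (x, y) centroid of a 2-D sub-aperture image in pixels."""
            total = img.sum()
            ys, xs = np.indices(img.shape)
            return (xs * img).sum() / total, (ys * img).sum() / total

        # Example: a Gaussian spot displaced from the sub-aperture centre; the
        # centroid offset is proportional to the local wavefront slope.
        ys, xs = np.indices((16, 16))
        spot = np.exp(-((xs - 9.3) ** 2 + (ys - 6.8) ** 2) / (2.0 * 2.0 ** 2))
        print(center_of_gravity(spot))   # ~ (9.3, 6.8)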

  6. Verification of the active deformation compensation system of the LMT/GTM by end-to-end simulations

    Science.gov (United States)

    Eisentraeger, Peter; Suess, Martin

    2000-07-01

    The 50 m LMT/GTM is exposed to the climatic conditions at 4,600 m altitude on Cerro La Negra, Mexico. To operate the telescope within the challenging requirements of its millimeter-wavelength objective, an active approach to monitoring and compensating structural deformations (Flexible Body Compensation, FBC) is necessary. This system includes temperature sensors and strain gages for identifying large-scale deformations of the reflector backup structure, a laser system for measuring the subreflector position, and an inclinometer system for measuring the deformations of the alidade. To compensate for the monitored deformations, the telescope is equipped with additional actuators for active control of the main reflector surface and the subreflector position. The paper describes the verification of the active deformation compensation system by finite element calculations and MATLAB simulations of the surface accuracy and the pointing, including the servo, under operational wind and thermal conditions.

  7. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge that is being addressed in part by focusing on the specific subsystems' handling of off-nominal missions and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  8. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Directory of Open Access Journals (Sweden)

    V. Proschek

    2011-10-01

    Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept adding an infrared-laser part to the already well studied microwave occultation technique exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals, and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling is not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points
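
    For intuition only, a generic Beer-Lambert differential-absorption relation of the kind such an infrared-laser retrieval builds on (this is not the paper's algorithm, and the numbers below are hypothetical): the absorber number density follows from the ratio of on-line and off-line transmissions along the path.

        import math

        def number_density_cm3(t_on, t_off, sigma_on_cm2, sigma_off_cm2, path_cm):
            """n = ln(T_off / T_on) / ((sigma_on - sigma_off) * L)."""
            return math.log(t_off / t_on) / ((sigma_on_cm2 - sigma_off_cm2) * path_cm)

        # Hypothetical numbers: 5% extra on-line absorption over a ~300 km limb path.
        n = number_density_cm3(t_on=0.90, t_off=0.95,
                               sigma_on_cm2=4.0e-21, sigma_off_cm2=1.0e-22,
                               path_cm=3.0e7)
        print(f"{n:.2e} molecules/cm^3")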

  9. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  10. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASAs Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  11. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard does not exist for spacecraft operations nor for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities and the mission data and specifically the mission data products created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing and for inter agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real life flight project i.e., the Mars Reconnaissance Orbiter project.
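
    A hypothetical data-structure sketch of what end-to-end accounting has to trace, linking a planned activity to the commands that implement it and to the data products generated by its execution (identifiers invented for illustration; this is not the CCSDS service definition):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Activity:
            activity_id: str
            commands: List[str] = field(default_factory=list)   # uplinked command IDs
            products: List[str] = field(default_factory=list)   # downlinked product IDs

        ledger = {"OBS-0042": Activity("OBS-0042")}
        ledger["OBS-0042"].commands += ["CMD-1101", "CMD-1102"]
        ledger["OBS-0042"].products += ["IMG-2017-193-001"]

        # Accountability check: every planned activity yielded at least one product.
        missing = [a.activity_id for a in ledger.values() if not a.products]
        print("unaccounted activities:", missing or "none")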

  12. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as optical stellar interferometers. 'End-to-end modeling' denotes the feature that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that the system-level instrument performance due to disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  13. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  14. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNE's in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  15. Performance of the end-to-end test for the characterization of a simulator in stereotactic body radiotherapy of the liver

    Energy Technology Data Exchange (ETDEWEB)

    Burgos, A.F.; Paiva, E. de, E-mail: adamfburgos@gmail.com [Instituto de Radioproteção e Dosimetria (IRD/CNEN), Rio de Janeiro-RJ (Brazil). Div. de Física Médica; Silva, L.P. da [Instituto Nacional de Câncer (MS/INCA), Rio de Janeiro-RJ (Brazil). Dept. de Física Médica

    2017-07-01

    Currently, one of the alternatives for radiotherapy of the liver is stereotactic body radiotherapy (SBRT), which delivers high doses in a few fractions and offers a good prognosis. However, in order to ensure that the high dose delivered to the target is the same as planned, a full-process verification test (image acquisition, plan design, scheduling, and dose delivery) should be performed. For this purpose, the objective of this work was to develop a water-density simulator that takes into account the relative positions of the liver and the organs at risk involved in this treatment, evaluating the influence on the dose of target movement due to the respiratory process, as well as of the positions of the organs at risk.

  16. Cyberinfrastructure for End-to-End Environmental Explorations

    Science.gov (United States)

    Merwade, V.; Kumar, S.; Song, C.; Zhao, L.; Govindaraju, R.; Niyogi, D.

    2007-12-01

    The design and implementation of a cyberinfrastructure for End-to-End Environmental Exploration (C4E4) is presented. The C4E4 framework addresses the need for an integrated data/computation platform for studying broad environmental impacts by combining heterogeneous data resources with state-of-the-art modeling and visualization tools. With Purdue being a TeraGrid Resource Provider, C4E4 builds on top of the Purdue TeraGrid data management system and Grid resources, and integrates them through a service-oriented workflow system. It allows researchers to construct environmental workflows for data discovery, access, transformation, modeling, and visualization. Using the C4E4 framework, we have implemented an end-to-end SWAT simulation and analysis workflow that connects our TeraGrid data and computation resources. It enables researchers to conduct comprehensive studies on the impact of land management practices in the St. Joseph watershed using data from various sources in hydrologic, atmospheric, agricultural, and other related disciplines.

  17. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    Directory of Open Access Journals (Sweden)

    Zhao Hong-hao

    2016-01-01

    Recently, network traffic has been increasing exponentially due to all kinds of applications, such as the mobile Internet, smart cities, smart transportation, the Internet of Things, and so on. End-to-end network traffic has thus become more important for traffic engineering, yet end-to-end traffic estimation is usually highly difficult. This paper proposes a Bayes theory-based method to model the end-to-end network traffic. Firstly, the end-to-end network traffic is described as an independent identically distributed normal process. Then Bayes theory is used to characterize the end-to-end network traffic. By calculating the parameters, the model is determined correctly. Simulation results show that our approach is feasible and effective.
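
    A minimal sketch of the kind of Bayesian update implied here, with traffic volumes treated as i.i.d. Normal(mu, sigma^2) samples, sigma assumed known and a conjugate Normal prior placed on mu; this is a textbook construction for illustration, not the paper's exact model.

        import numpy as np

        def posterior_mean_var(samples, sigma, mu0, tau0):
            """Posterior of mu for a Normal(mu0, tau0^2) prior and known noise sigma."""
            n = len(samples)
            post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
            post_mean = post_var * (mu0 / tau0**2 + samples.sum() / sigma**2)
            return post_mean, post_var

        rng = np.random.default_rng(7)
        traffic = rng.normal(loc=120.0, scale=15.0, size=200)   # Mb/s per interval
        mu, var = posterior_mean_var(traffic, sigma=15.0, mu0=100.0, tau0=50.0)
        print(f"posterior mean {mu:.1f} Mb/s, posterior sd {var ** 0.5:.2f}")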

  18. End-to-End Traffic Flow Modeling of the Integrated SCaN Network

    Science.gov (United States)

    Cheung, K.-M.; Abraham, D. S.

    2012-05-01

    In this article, we describe the analysis and simulation effort of the end-to-end traffic flow for the Integrated Space Communications and Navigation (SCaN) Network. Using the network traffic derived for the 30-day period of July 2018 from the Space Communications Mission Model (SCMM), we generate the wide-area network (WAN) bandwidths of the ground links for different architecture options of the Integrated SCaN Network. We also develop a new analytical scheme to model the traffic flow and buffering mechanism of a store-and-forward network. It is found that the WAN bandwidth of the Integrated SCaN Network is an important differentiator of different architecture options, as the recurring circuit costs of certain architecture options can be prohibitively high.
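
    A toy store-and-forward buffer calculation in the same spirit (illustrative only, not the paper's analytical scheme; all numbers invented): data arrives in bursts during station contacts and drains at the WAN link rate, and the peak backlog exposes the bandwidth-versus-buffering trade.

        def peak_backlog_mb(arrivals_mb, wan_rate_mbps, step_s):
            """Peak buffered volume for per-step arrivals (MB) drained at wan_rate_mbps."""
            backlog, peak = 0.0, 0.0
            drain_mb = wan_rate_mbps * step_s / 8.0
            for arrived in arrivals_mb:
                backlog = max(0.0, backlog + arrived - drain_mb)
                peak = max(peak, backlog)
            return peak

        # A 10-minute pass delivering 150 MB/min, then 50 quiet minutes, over a 10 Mb/s WAN link.
        arrivals = [150.0] * 10 + [0.0] * 50
        print(peak_backlog_mb(arrivals, wan_rate_mbps=10.0, step_s=60.0), "MB peak backlog")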

  19. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

    Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth people participate in their own care supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges. In this

  20. Utilizing Domain Knowledge in End-to-End Audio Processing

    DEFF Research Database (Denmark)

    Tax, Tycho; Antich, Jose Luis Diez; Purwins, Hendrik

    2017-01-01

    to learn the commonly-used log-scaled mel-spectrogram transformation. Secondly, we demonstrate that upon initializing the first layers of an end-to-end CNN classifier with the learned transformation, convergence and performance on the ESC-50 environmental sound classification dataset are similar to a CNN......-based model trained on the highly pre-processed log-scaled mel-spectrogram features....

  1. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
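
    A worked check of the O(N²) scaling quoted above: N measurement sites yield N(N-1) ordered source-destination pairs, so 37 sites give on the order of the "more than 1,000 distinct Internet paths" measured.

        n_sites = 37
        print(n_sites * (n_sites - 1))   # 1332 ordered end-to-end paths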

  2. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    implementation on Android-based tablets. The simulations cover a wide range of scenarios for two mobile users in an urban area with ubiquitous cellular coverage, and show that, when considering the end-to-end connection, our algorithm leads to increased throughput with fewer handovers compared to other handover schemes...

  3. End-to-End Operations in the ELT Era

    Science.gov (United States)

    Hainaut, O. R.; Bierwirth, T.; Brillant, S.; Mieske, S.; Patat, F.; Rejkuba, M.; Romaniello, M.; Sterzik, M.

    2018-03-01

    The Data Flow System is the infrastructure on which Very Large Telescope (VLT) observations are performed at the Observatory, before and after the observations themselves take place. Since its original conception in the late 1990s, it has evolved to accommodate new observing modes and new instruments on La Silla and Paranal. Several updates and upgrades are needed to overcome its obsolescence and to integrate requirements from the new instruments from the community and, of course, from ESO's Extremely Large Telescope (ELT), which will be integrated into Paranal's operations. We describe the end-to-end operations and the resulting roadmap guiding their further development.

  4. End-to-end tests using alanine dosimetry in scanned proton beams

    Science.gov (United States)

    Carlino, A.; Gouldstone, C.; Kragl, G.; Traneus, E.; Marrale, M.; Vatnitsky, S.; Stock, M.; Palmans, H.

    2018-03-01

    This paper describes end-to-end test procedures as the last fundamental step of medical commissioning before starting clinical operation of the MedAustron synchrotron-based pencil beam scanning (PBS) therapy facility with protons. One in-house homogeneous phantom and two anthropomorphic heterogeneous (head and pelvis) phantoms were used for end-to-end tests at MedAustron. The phantoms were equipped with alanine detectors, radiochromic films and ionization chambers. The correction for the ‘quenching’ effect of alanine pellets was implemented in the Monte Carlo platform of the evaluation version of RayStation TPS. During the end-to-end tests, the phantoms were transferred through the workflow like real patients to simulate the entire clinical workflow: immobilization, imaging, treatment planning and dose delivery. Different clinical scenarios of increasing complexity were simulated: delivery of a single beam, two oblique beams without and with range shifter. In addition to the dose comparison in the plastic phantoms the dose obtained from alanine pellet readings was compared with the dose determined with the Farmer ionization chamber in water. A consistent systematic deviation of about 2% was found between alanine dosimetry and the ionization chamber dosimetry in water and plastic materials. Acceptable agreement of planned and delivered doses was observed together with consistent and reproducible results of the end-to-end testing performed with different dosimetric techniques (alanine detectors, ionization chambers and EBT3 radiochromic films). The results confirmed the adequate implementation and integration of the new PBS technology at MedAustron. This work demonstrates that alanine pellets are suitable detectors for end-to-end tests in proton beam therapy and the developed procedures with customized anthropomorphic phantoms can be used to support implementation of PBS technology in clinical practice.

  5. STS/DBS power subsystem end-to-end stability margin

    Science.gov (United States)

    Devaux, R. N.; Vattimo, R. J.; Peck, S. R.; Baker, W. E.

    Attention is given to a full-up end-to-end subsystem stability test which was performed with a flight solar array providing power to a fully operational spacecraft. The solar array simulator is described, and a comparison is made between test results obtained with the simulator and those obtained with the actual array. It is concluded that stability testing with a fully integrated spacecraft is necessary to ensure that all elements have been adequately modeled.

  6. An end to end secure CBIR over encrypted medical database.

    Science.gov (United States)

    Bellafqira, Reda; Coatrieux, Gouenou; Bouslimi, Dalel; Quellec, Gwenole

    2016-08-01

    In this paper, we propose a new secure content based image retrieval (SCBIR) system adapted to the cloud framework. This solution allows a physician to retrieve images of similar content within an outsourced and encrypted image database, without decrypting them. In contrast to current CBIR approaches in the encrypted domain, the originality of the proposed scheme lies in the fact that the features extracted from the encrypted images are themselves encrypted. This is achieved by means of homomorphic encryption and two non-colluding servers, both of which are nevertheless considered honest but curious. In that way an end to end secure CBIR process is ensured. Experimental results carried out on a diabetic retinopathy database encrypted with the Paillier cryptosystem indicate that our SCBIR achieves retrieval performance as good as if images were processed in their non-encrypted form.
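
    The scheme relies on homomorphic encryption (Paillier in the experiments). The following toy sketch, with insecure demo-sized primes and no relation to the authors' implementation, illustrates the property being exploited: multiplying two Paillier ciphertexts yields an encryption of the sum of the plaintexts, so a server can combine encrypted values without ever decrypting them.

```python
import math, random

# Textbook Paillier with toy primes (insecure; real use needs ~1024-bit primes)
p, q = 293, 433
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lambda = lcm(p-1, q-1)
g = n + 1                                           # standard generator choice
mu = pow(lam, -1, n)                                # with g = n+1, mu = lambda^-1 mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n                  # L(x) = (x - 1) / n
    return (L * mu) % n

c1, c2 = encrypt(17), encrypt(25)
c_sum = (c1 * c2) % n2                              # homomorphic addition on ciphertexts
assert decrypt(c_sum) == 42                         # equals 17 + 25, computed without decrypting c1, c2
```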

  7. System of end-to-end symmetric database encryption

    Science.gov (United States)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article addresses the problem of protecting databases from information leakage that bypasses access control mechanisms. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of the interaction between information system components using a symmetric cryptographic algorithm. For this purpose, a key management method for multi-user systems has been developed, based on a distributed key representation in which one part of the key is stored in the database and the other part is derived from the user's password. The key is computed immediately before the cryptographic transformations and is not retained in memory after they complete. Algorithms for registering and authorizing a user, as well as for changing a password, are described, together with the methods for computing the key parts during these operations.
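
    As an illustration only (the article's exact algorithms and formats are not reproduced here), a key split of this kind can be sketched with the Python standard library: one share of the key is stored in the database, the other is derived from the user's password, and the full key exists only transiently. All parameter choices below are assumptions.

```python
import hashlib, os, secrets

def derive_session_key(password: str, stored_share: bytes, salt: bytes) -> bytes:
    """Combine a database-stored key share with a password-derived share.
    The full key exists only transiently, right before encryption/decryption."""
    pwd_share = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return bytes(a ^ b for a, b in zip(pwd_share, stored_share))

# Registration (hypothetical flow): create the stored share and salt once.
salt = os.urandom(16)
stored_share = secrets.token_bytes(32)        # persisted in the database
key = derive_session_key("correct horse", stored_share, salt)

# The same inputs always reproduce the same 256-bit key; it is discarded after use.
assert key == derive_session_key("correct horse", stored_share, salt)
assert len(key) == 32
```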

  8. End-to-end learning for digital hologram reconstruction

    Science.gov (United States)

    Xu, Zhimin; Zuo, Si; Lam, Edmund Y.

    2018-02-01

    Digital holography is a well-known method to perform three-dimensional imaging by recording the light wavefront information originating from the object. Not only the intensity, but also the phase distribution of the wavefront can then be computed from the recorded hologram in the numerical reconstruction process. However, the reconstructions via the traditional methods suffer from various artifacts caused by twin-image, zero-order term, and noise from image sensors. Here we demonstrate that an end-to-end deep neural network (DNN) can learn to perform both intensity and phase recovery directly from an intensity-only hologram. We experimentally show that the artifacts can be effectively suppressed. Meanwhile, our network doesn't need any preprocessing for initialization, and is comparably fast to train and test, in comparison with the recently published learning-based method. In addition, we validate that the performance improvement can be achieved by introducing a prior on sparsity.

  9. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim; Hyadi, Amal; Afify, Laila H.; Shihada, Basem

    2014-01-01

    In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1 with vacations framework. Sensors transmit measurements to a predefined data sink subject to maximum end-to-end delay constraint. In order to prolong the battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for maximum hop-count as well as an approximate expression of the probability of blocking at the sink node upon violating certain end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
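
    A small sketch of the kind of calculation the abstract describes, using the classical mean-waiting-time decomposition for an M/G/1 queue with multiple vacations; the paper's own expressions and assumptions may differ, and all numbers below are illustrative.

```python
import math

def max_hops(lam, es, es2, ev, ev2, delay_budget):
    """Largest hop count whose total mean delay fits the end-to-end budget.

    lam : packet arrival rate at each node (packets/s)
    es  : mean service time E[S];   es2: second moment E[S^2]
    ev  : mean vacation length E[V]; ev2: second moment E[V^2]
    """
    rho = lam * es
    if rho >= 1.0:
        raise ValueError("unstable queue: utilisation must be < 1")
    # Mean wait for M/G/1 with multiple vacations (decomposition result)
    wait = lam * es2 / (2.0 * (1.0 - rho)) + ev2 / (2.0 * ev)
    per_hop = wait + es                     # queueing wait + service per hop
    return math.floor(delay_budget / per_hop)

# Exponential service (mean 5 ms), fixed 20 ms vacations, 50 ms end-to-end budget
print(max_hops(lam=40.0, es=0.005, es2=2 * 0.005**2, ev=0.02, ev2=0.02**2, delay_budget=0.05))
```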

  10. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2014-04-01

    In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1 with vacations framework. Sensors transmit measurements to a predefined data sink subject to maximum end-to-end delay constraint. In order to prolong the battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for maximum hop-count as well as an approximate expression of the probability of blocking at the sink node upon violating certain end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.

  11. End-to-End Adversarial Retinal Image Synthesis.

    Science.gov (United States)

    Costa, Pedro; Galdran, Adrian; Meyer, Maria Ines; Niemeijer, Meindert; Abramoff, Michael; Mendonca, Ana Maria; Campilho, Aurelio

    2018-03-01

    In medical image analysis applications, the availability of large amounts of annotated data is becoming increasingly critical. However, annotated medical data is often scarce and costly to obtain. In this paper, we address the problem of synthesizing retinal color images by applying recent techniques based on adversarial learning. In this setting, a generative model is trained to maximize a loss function provided by a second model attempting to classify its output into real or synthetic. In particular, we propose to implement an adversarial autoencoder for the task of retinal vessel network synthesis. We use the generated vessel trees as an intermediate stage for the generation of color retinal images, which is accomplished with a generative adversarial network. Both models require the optimization of almost everywhere differentiable loss functions, which allows us to train them jointly. The resulting model offers an end-to-end retinal image synthesis system capable of generating as many retinal images as the user requires, with their corresponding vessel networks, by sampling from a simple probability distribution that we impose on the associated latent space. We show that the learned latent space contains a well-defined semantic structure, implying that we can perform calculations in the space of retinal images, e.g., smoothly interpolating new data points between two retinal images. Visual and quantitative results demonstrate that the synthesized images are substantially different from those in the training set, while being also anatomically consistent and displaying a reasonable visual quality.

  12. Urban Biomining Meets Printable Electronics: End-To-End at Destination Biological Recycling and Reprinting

    Science.gov (United States)

    Rothschild, Lynn J. (Principal Investigator); Koehne, Jessica; Gandhiraman, Ram; Navarrete, Jesica; Spangle, Dylan

    2017-01-01

    Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet, metals add mass, and electronics have the additional problem of a limited lifespan. Thus, current mission architectures must compensate for replacement. In space, spent electronics are discarded; on Earth, there is some recycling but current processes are toxic and environmentally hazardous. Imagine instead an end-to-end recycling of spent electronics at low mass, low cost, room temperature, and in a non-toxic manner. Here, we propose a solution that will not only enhance mission success by decreasing upmass and providing a fresh supply of electronics, but in addition has immediate applications to a serious environmental issue on the Earth. Spent electronics will be used as feedstock to make fresh electronic components, a process we will accomplish with so-called 'urban biomining' using synthetically enhanced microbes to bind metals with elemental specificity. To create new electronics, the microbes will be used as 'bioink' to print a new IC chip, using plasma jet electronics printing. The plasma jet electronics printing technology will have the potential to use Martian atmospheric gas to print and to tailor the electronic and chemical properties of the materials. Our preliminary results have suggested that this process also serves as a purification step to enhance the proportion of metals in the 'bioink'. The presence of an electric field and plasma can ensure printing in a microgravity environment while also providing material morphology and electronic structure tunability and thus optimization. Here we propose to increase the TRL level of the concept by engineering microbes to dissolve the siliceous matrix in the IC, extract copper from a mixture of metals, and use the microbes as feedstock to print interconnects using Mars gas simulant. To assess the ability of this concept to influence mission architecture, we will do an analysis of the infrastructure required to execute

  13. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    OpenAIRE

    Zhao Hong-hao; Meng Fan-bo; Zhao Si-wen; Zhao Si-hang; Lu Yi

    2016-01-01

    Recently, network traffic has been increasing exponentially due to all kinds of applications, such as mobile Internet, smart cities, smart transportation, the Internet of Things, and so on. The end-to-end network traffic thus becomes more important for traffic engineering. Usually, end-to-end traffic estimation is highly difficult. This paper proposes a Bayes theory-based method to model the end-to-end network traffic. Firstly, the end-to-end network traffic is described as an independent identically distrib...

  14. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer

    2005-01-01

    This paper investigates the end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with a DiffServ IP network. The effort was put on QoS class mapping from DiffServ to UMTS, Access Control (AC), buffering and scheduling optimization. The DiffServ Code Point (DSCP......) was utilized in the whole UMTS QoS provisioning to differentiate different types of traffic. The overall algorithm was optimized to guarantee the E2E QoS parameters of each service class, especially for real-time applications, as well as to improve the bandwidth utilization. Simulation shows that the enhanced...

  15. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    Science.gov (United States)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    value. However, this cost was minimal when local conservation actions were part of a concerted coast-wide plan. The simulations demonstrate the utility of using the Atlantis end-to-end ecosystem model within NOAA’s Integrated Ecosystem Assessment, by illustrating an end-to-end modeling tool that allows consideration of multiple management alternatives that are relevant to numerous state, federal and private interests.

  16. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

    This thesis addresses selected topics of Quality of Service (QoS) provisioning in heterogeneous data networks that construct the communication environment of today's Internet. In the vast range of protocols available in different domains of network infrastructures, a few chosen ones are discussed......, the general UPnPQoS performance was assessed analytically and confirmed by simulation results. The results validate the usability of UPnP-QoS, but some open issues in the specification were identified. As a result of addressing the mentioned shortcomings of UPnP-QoS, a few pre-emption algorithms for home gateway...... and discuss also access Passive Optical Network (PON) technologies, a GMPLS controlled Ten Gigabit Passive Optical Network (XGPON) was proposed. This part of the thesis introduces the possibility of managing the XG-PON by the GMPLS suite, showing again that this protocol suite is a good candidate...

  17. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend which is a phasemeter and the processing of the phasemeter output data. Furthermore, three-axes piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  18. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

    Full Text Available In a sparse vehicular ad hoc network, a vehicle normally employs a carry and forward approach, where it holds the message it wants to transmit until the vehicle meets other vehicles or roadside units. A number of analyses in the literature have been done to investigate the time delay when packets are being carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters. Furthermore, the majority of the analyses concentrate only on the expected value of the end-to-end delay when the carry and forward approach is used. Using regression analysis, we establish the distribution model for the time delay between two disconnected vehicle clusters as an exponential distribution. Consequently, a new distribution is derived to represent the number of clusters on a highway using a vehicular traffic model. From there, we are able to formulate an end-to-end delay model which extends the time delay model for two disconnected vehicle clusters to multiple disconnected clusters on a unidirectional highway. The results obtained from the analytical model are then validated through simulation results.
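
    The construction described above (an exponential carry delay for each inter-cluster gap, summed over the clusters on the highway) can be sketched in a few lines of Monte Carlo; the parameters below are invented and the paper's traffic model is more detailed.

```python
import random

def e2e_delay_sample(n_clusters, mean_gap_delay, forward_delay):
    """One end-to-end delay sample on a unidirectional highway:
    the carry delay across each inter-cluster gap is exponential, and
    forwarding within a connected cluster adds a small fixed delay."""
    gaps = n_clusters - 1
    carry = sum(random.expovariate(1.0 / mean_gap_delay) for _ in range(gaps))
    return carry + n_clusters * forward_delay

# 5 disconnected clusters, 8 s mean carry delay per gap, 50 ms per-cluster forwarding
samples = [e2e_delay_sample(5, 8.0, 0.05) for _ in range(100_000)]
print(sum(samples) / len(samples))   # empirical mean, close to (5-1)*8 + 5*0.05 seconds
```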

  19. AN ANALYSIS OF THE APPLICATION END TO END QUALITY OF SERVICE ON 3G TELECOMMUNICATION NETWORK

    Directory of Open Access Journals (Sweden)

    Cahya Lukito

    2012-05-01

    Full Text Available End to End Quality of Service is a way to provide packet data services in a telecommunication network based on the Right Price, Right Service Level, and Right Quality. The goal of this research is to analyze the impact of using End to End QoS on a 3G telecommunication network for voice and data services. This research uses an analytical method, applying the approach in the lab. The results achieved in this research show that End to End QoS strongly influences the Service Level Agreement offered to users of the telecommunication service. Keywords: End to End Qos, SLA, Diffserv

  20. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    Science.gov (United States)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDayLight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  1. OMV mission simulator

    Science.gov (United States)

    Cok, Keith E.

    1989-01-01

    The Orbital Maneuvering Vehicle (OMV) will be remotely piloted during rendezvous, docking, or proximity operations with target spacecraft from a ground control console (GCC). The real-time mission simulator and graphics being used to design a console pilot-machine interface are discussed. A real-time orbital dynamics simulator drives the visual displays. The dynamics simulator includes a J2 oblate earth gravity model and a generalized 1962 rotating atmospheric and drag model. The simulator also provides a variable-length communication delay to represent use of the Tracking and Data Relay Satellite System (TDRSS) and NASA Communications (NASCOM). Input parameter files determine the graphics display. This feature allows rapid prototyping since displays can be easily modified from pilot recommendations. A series of pilot reviews are being held to determine an effective pilot-machine interface. Pilots fly missions with nominal to 3-sigma dispersions in translational or rotational axes. Console dimensions, switch type and layout, hand controllers, and graphic interfaces are evaluated by the pilots and the GCC simulator is modified for subsequent runs. Initial results indicate a pilot preference for analog versus digital displays and for two 3-degree-of-freedom hand controllers.
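
    As an illustration of what a J2 oblate-Earth gravity model like the one in the dynamics simulator computes (this is the standard textbook formulation, not the OMV simulator's code), the two-body acceleration plus the J2 perturbation in Earth-centered inertial coordinates can be written as follows; the constants are nominal Earth values.

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
RE = 6378137.0           # Earth equatorial radius, m
J2 = 1.08262668e-3       # Earth's J2 zonal harmonic coefficient

def gravity_j2(x, y, z):
    """Point-mass gravity plus the J2 oblateness perturbation (ECI frame, m/s^2)."""
    r = math.sqrt(x*x + y*y + z*z)
    # Central (two-body) term
    ax = -MU * x / r**3
    ay = -MU * y / r**3
    az = -MU * z / r**3
    # J2 perturbation
    k = 1.5 * J2 * MU * RE**2 / r**5
    zr2 = 5.0 * z*z / (r*r)
    ax += k * x * (zr2 - 1.0)
    ay += k * y * (zr2 - 1.0)
    az += k * z * (zr2 - 3.0)
    return ax, ay, az

# Acceleration at 400 km altitude over the equator
print(gravity_j2(RE + 400e3, 0.0, 0.0))
```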

  2. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  3. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  4. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  5. Advanced Camera Image Cropping Approach for CNN-Based End-to-End Controls on Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2018-03-01

    Full Text Available Recent research on deep learning has been applied to a diversity of fields. In particular, numerous studies have been conducted on self-driving vehicles using end-to-end approaches based on images captured by a single camera. End-to-end controls learn the output vectors of output devices directly from the input vectors of available input devices. In other words, an end-to-end approach learns not by analyzing the meaning of input vectors, but by extracting optimal output vectors based on input vectors. Generally, when end-to-end control is applied to self-driving vehicles, the steering wheel and pedals are controlled autonomously by learning from the images captured by a camera. However, high-resolution images captured from a car cannot be directly used as inputs to Convolutional Neural Networks (CNNs) owing to memory limitations; the image size needs to be efficiently reduced. Therefore, it is necessary to extract features from captured images automatically and to generate input images by merging the parts of the images that contain the extracted features. This paper proposes a learning method for end-to-end control that generates input images for CNNs by extracting road parts from input images, identifying the edges of the extracted road parts, and merging the parts of the images that contain the detected edges. In addition, a CNN model for end-to-end control is introduced. Experiments involving the Open Racing Car Simulator (TORCS), a sustainable computing environment for cars, confirmed the effectiveness of the proposed method for self-driving by comparing the accumulated difference in the angle of the steering wheel in the images generated by it with those of resized images containing the entire captured area and cropped images containing only a part of the captured area. The results showed that the proposed method reduced the accumulated difference by 0.839% and 0.850% compared to those yielded by the resized images and cropped images
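
    A much-reduced sketch of the general idea: crop the camera frame to the road region before feeding a small CNN that regresses the steering angle. PyTorch is assumed here, the paper's edge-based cropping and network architecture are more elaborate, and every layer size below is arbitrary.

```python
import torch
import torch.nn as nn

def crop_road_region(frame: torch.Tensor) -> torch.Tensor:
    """Keep only the lower half of the frame, where the road usually is.
    frame: (N, 3, H, W) tensor of camera images."""
    h = frame.shape[2]
    return frame[:, :, h // 2:, :]

class SteeringCNN(nn.Module):
    """Tiny end-to-end regressor: cropped camera image -> steering angle."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 4 * 4, 64),
                                  nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.head(self.features(crop_road_region(x)))

model = SteeringCNN()
frames = torch.rand(8, 3, 120, 160)   # a batch of simulated driving frames
steering = model(frames)              # (8, 1) predicted steering angles
```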

  6. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

    Full Text Available Abstract We present a sleep/wake schedule protocol for minimizing end-to-end delay for event driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors, (a) the distance of the node from the sink node, (b) the importance of the node's location from connectivity's perspective, and (c) if the node is in the proximity where an event occurs. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes the throughput by minimizing the congestion at nodes having heavy traffic load. Simulations are carried out to evaluate the performance of the proposed protocol, by comparing its performance with S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol has significantly reduced the end-to-end delay, as well as has improved the other QoS parameters, like average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.

  7. Simulation of Mission Phases

    Science.gov (United States)

    Carlstrom, Nicholas Mercury

    2016-01-01

    This position with the Simulation and Graphics Branch (ER7) at Johnson Space Center (JSC) provided an introduction to vehicle hardware, mission planning, and simulation design. ER7 supports engineering analysis and flight crew training by providing high-fidelity, real-time graphical simulations in the Systems Engineering Simulator (SES) lab. The primary project assigned by NASA mentor and SES lab manager, Meghan Daley, was to develop a graphical simulation of the rendezvous, proximity operations, and docking (RPOD) phases of flight. The simulation is to include a generic crew/cargo transportation vehicle and a target object in low-Earth orbit (LEO). Various capsule, winged, and lifting body vehicles as well as historical RPOD methods were evaluated during the project analysis phase. JSC core mission to support the International Space Station (ISS), Commercial Crew Program (CCP), and Human Space Flight (HSF) influenced the project specifications. The simulation is characterized as a 30 meter +V Bar and/or -R Bar approach to the target object's docking station. The ISS was selected as the target object and the international Low Impact Docking System (iLIDS) was selected as the docking mechanism. The location of the target object's docking station corresponds with the RPOD methods identified. The simulation design focuses on Guidance, Navigation, and Control (GNC) system architecture models with station keeping and telemetry data processing capabilities. The optical and inertial sensors, reaction control system thrusters, and the docking mechanism selected were based on CCP vehicle manufacturer's current and proposed technologies. A significant amount of independent study and tutorial completion was required for this project. Multiple primary source materials were accessed using the NASA Technical Report Server (NTRS) and reference textbooks were borrowed from the JSC Main Library and International Space Station Library. The Trick Simulation Environment and User

  8. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    Science.gov (United States)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential resources offering water and other mineral resources for longterm human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  9. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    Science.gov (United States)

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation, however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling for reaching the availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
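
    The overhead of such a forwarding algorithm is tied to the packet error rate. As a back-of-the-envelope sketch (not the paper's algorithm), the number of transmission attempts needed per hop to reach a target end-to-end delivery probability over independent lossy links is:

```python
import math

def attempts_per_hop(per: float, hops: int, target: float) -> int:
    """Minimum transmission attempts per hop so that the probability of
    delivering a packet over `hops` independent links is at least `target`."""
    per_hop_success_needed = target ** (1.0 / hops)
    # 1 - per**n >= s  =>  n >= log(1 - s) / log(per)   (log(per) is negative)
    n = math.log(1.0 - per_hop_success_needed) / math.log(per)
    return math.ceil(n)

# With a 30% packet error rate, 5 hops and a 99% end-to-end reliability target
print(attempts_per_hop(per=0.30, hops=5, target=0.99))
```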

  10. Availability and End-to-end Reliability in Low Duty Cycle MultihopWireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Timo D. Hämäläinen

    2009-03-01

    Full Text Available A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation, however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling for reaching the availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.

  11. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States); Wu, Chase Q. [Univ. of Memphis, TN (United States)

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized mainly because their use typically requires significant domain knowledge and in many cases application users are even not aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  12. Impacts of the Deepwater Horizon oil spill evaluated using an end-to-end ecosystem model.

    Science.gov (United States)

    Ainsworth, Cameron H; Paris, Claire B; Perlin, Natalie; Dornberger, Lindsey N; Patterson, William F; Chancellor, Emily; Murawski, Steve; Hollander, David; Daly, Kendra; Romero, Isabel C; Coleman, Felicia; Perryman, Holly

    2018-01-01

    We use a spatially explicit biogeochemical end-to-end ecosystem model, Atlantis, to simulate impacts from the Deepwater Horizon oil spill and subsequent recovery of fish guilds. Dose-response relationships with expected oil concentrations were utilized to estimate the impact on fish growth and mortality rates. We also examine the effects of fisheries closures and impacts on recruitment. We validate predictions of the model by comparing population trends and age structure before and after the oil spill with fisheries independent data. The model suggests that recruitment effects and fishery closures had little influence on biomass dynamics. However, at the assumed level of oil concentrations and toxicity, impacts on fish mortality and growth rates were large and commensurate with observations. Sensitivity analysis suggests the biomass of large reef fish decreased by 25% to 50% in areas most affected by the spill, and biomass of large demersal fish decreased even more, by 40% to 70%. Impacts on reef and demersal forage caused starvation mortality in predators and increased reliance on pelagic forage. Impacts on the food web translated effects of the spill far away from the oiled area. Effects on age structure suggest possible delayed impacts on fishery yields. Recovery of high-turnover populations generally is predicted to occur within 10 years, but some slower-growing populations may take 30+ years to fully recover.

  13. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
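
    A trivial helper showing how a finding of this kind ("within about 1.5% plus 1.5 mm") translates into a distal range margin for a given beam range; actual clinical margin recipes typically combine more uncertainty terms, and the default values below simply echo the reported figures.

```python
def distal_range_margin(range_mm: float, relative: float = 0.015, absolute_mm: float = 1.5) -> float:
    """Distal margin (mm) implied by a 'relative % + absolute mm' range uncertainty."""
    return relative * range_mm + absolute_mm

for beam_range in (80.0, 150.0, 250.0):          # water-equivalent ranges in mm
    print(beam_range, "->", distal_range_margin(beam_range), "mm margin")
```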

  14. End-to-side and end-to-end anastomoses give similar results in cervical oesophagogastrostomy.

    Science.gov (United States)

    Pierie, J P; De Graaf, P W; Poen, H; Van Der Tweel, I; Obertop, H

    1995-12-01

    To find out if there were any differences in healing between end-to-end and end-to-side anastomoses for oesophagogastrostomy. Open study with historical controls. University hospital, The Netherlands. 28 patients with end-to-end and 90 patients with end-to-side anastomoses after transhiatal oesophagectomy and partial gastrectomy for cancer of the oesophagus or oesophagogastric junction, with gastric tube reconstruction and cervical anastomosis. Leak and stricture rates, and the number of dilatations needed to relieve dysphagia. There were no significant differences in leak rates (end-to-end 4/28, 14%, and end-to-side 13/90, 14%) or anastomotic strictures (end-to-end 9/28, 32%, and end-to-side 26/90, 29%). The median number of dilatations needed to relieve dysphagia was 7 (1-33) after end-to-end and 9 (1-113) after end-to-side oesophagogastrostomy. There were no differences between the two methods of suture of cervical oesophagogastrostomy when leakage, stricture, and number of dilatations were used as criteria of good healing.

  15. Automatic provisioning of end-to-end QoS into the home

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Skoldström, Pontus; Nelis, Jelle

    2011-01-01

    Due to a growing number of high bandwidth applications today (such as HDTV), and an increasing amount of network and cloud based applications, service providers need to pay attention to QoS in their networks. We believe there is a need for an end-to-end approach reaching into the home as well....... The Home Gateway (HG) as a key component of the home network is crucial for enabling the end-to-end solutions. UPnP-QoS has been proposed as an inhome solution for resource reservations. In this paper we assess a solution for automatic QoS reservations, on behalf of non-UPnP-QoS aware applications....... Additionally we focus on an integrated end-to-end solution, combining GMPLS-based reservations in e.g., access/metro and UPnP-QoS based reservation in the home network....

  16. Ground Contact Model for Mars Science Laboratory Mission Simulations

    Science.gov (United States)

    Raiszadeh, Behzad; Way, David

    2012-01-01

    The Program to Optimize Simulated Trajectories II (POST 2) has been successful in simulating the flight of launch vehicles and entry bodies on Earth and other planets. POST 2 has been the primary simulation tool for the Entry, Descent, and Landing (EDL) phase of numerous Mars lander missions such as Mars Pathfinder in 1997, the twin Mars Exploration Rovers (MER-A and MER-B) in 2004, Mars Phoenix lander in 2007, and it is now the main trajectory simulation tool for Mars Science Laboratory (MSL) in 2012. In all previous missions, the POST 2 simulation ended before ground impact, and a tool other than POST 2 simulated landing dynamics. It would be ideal for one tool to simulate the entire EDL sequence, thus avoiding errors that could be introduced by handing off position, velocity, or other flight parameters from one simulation to the other. The desire to have one continuous end-to-end simulation was the motivation for developing the ground interaction model in POST 2. Rover landing, including the detection of the postlanding state, is a very critical part of the MSL mission, as the EDL landing sequence continues for a few seconds after landing. The method explained in this paper illustrates how a simple ground force interaction model has been added to POST 2, which allows simulation of the entire EDL from atmospheric entry through touchdown.
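
    The abstract describes the ground force interaction model only qualitatively. One common way to implement such a model is a penalty spring-damper that pushes back when a contact point penetrates the terrain; the sketch below uses invented stiffness and damping values and is not the POST 2 implementation.

```python
def ground_contact_force(altitude_m: float, vertical_velocity_ms: float,
                         k: float = 5.0e4, c: float = 4.0e3) -> float:
    """Penalty-based vertical ground force (N): zero above the surface,
    spring-damper push-back when the contact point penetrates the terrain."""
    if altitude_m >= 0.0:
        return 0.0
    penetration = -altitude_m
    force = k * penetration - c * vertical_velocity_ms   # damper opposes downward motion
    return max(force, 0.0)                               # the ground can only push, never pull

# Touchdown example: 5 cm penetration while still descending at 0.7 m/s
print(ground_contact_force(-0.05, -0.7))
```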

  17. Design and end-to-end modelling of a deployable telescope

    Science.gov (United States)

    Dolkens, Dennis; Kuiper, Hans

    2017-09-01

    a closed-loop system based on measurements of the image sharpness as well as measurements obtained with edge sensors placed between the mirror segments. In addition, a phase diversity system will be used to recover residual wavefront aberrations. To aid the design of the deployable telescope, an end-to-end performance model was developed. The model is built around a dedicated ray-trace program written in Matlab. This program was built from the ground up for the purpose of modelling segmented telescope systems and allows for surface data computed with Finite Element Models (FEM) to be imported in the model. The program also contains modules which can simulate the closed-loop calibration of the telescope and it can use simulated images as an input for phase diversity and image processing algorithms. For a given thermo-mechanical state, the end-to-end model can predict the image quality that will be obtained after the calibration has been completed and the image has been processed. As such, the model is a powerful systems engineering tool, which can be used to optimize the in-orbit performance of a segmented, deployable telescope.

  18. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA / NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  19. Security Considerations around End-to-End Security in the IP-based Internet of Things

    NARCIS (Netherlands)

    Brachmann, M.; Garcia-Mochon, O.; Keoh, S.L.; Kumar, S.S.

    2012-01-01

    The IP-based Internet of Things refers to the interconnection of smart objects in a Low-power and Lossy Network (LLN) with the Internet by means of protocols such as 6LoWPAN or CoAP. The provisioning of an end-to-end security connection is the key to ensure basic functionalities such as software

  20. QoC-based Optimization of End-to-End M-Health Data Delivery Services

    NARCIS (Netherlands)

    Widya, I.A.; van Beijnum, Bernhard J.F.; Salden, Alfons

    2006-01-01

    This paper addresses how Quality of Context (QoC) can be used to optimize end-to-end mobile healthcare (m-health) data delivery services in the presence of alternative delivery paths, which is quite common in a pervasive computing and communication environment. We propose min-max-plus based

  1. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Generation Networks (NGNs). In this paper, an end-to-end availability model is proposed and evaluated using a combination of Reliability Block Diagrams (RBD) and a proposed five-state Markov model. The overall availability for intra- and inter domain communication in IMS is analyzed, and the state...
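
    The RBD part of such an analysis reduces to series/parallel composition of element availabilities. A minimal sketch with illustrative numbers follows; the paper's actual IMS element availabilities and its five-state Markov model are not reproduced here.

```python
from functools import reduce

def series(*availabilities):
    """All elements must be up: multiply availabilities."""
    return reduce(lambda a, b: a * b, availabilities, 1.0)

def parallel(*availabilities):
    """At least one redundant element must be up: 1 minus product of unavailabilities."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), availabilities, 1.0)

# Illustrative end-to-end chain: access, redundant P-CSCF pair, core transport, redundant S-CSCF pair
a_end_to_end = series(0.9995,
                      parallel(0.999, 0.999),
                      0.99999,
                      parallel(0.9995, 0.9995))
print(a_end_to_end)
```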

  2. End-to-End Delay Model for Train Messaging over Public Land Mobile Networks

    Directory of Open Access Journals (Sweden)

    Franco Mazzenga

    2017-11-01

    Full Text Available Modern train control systems rely on a dedicated radio network for train-to-ground communications. A number of possible alternatives have been analysed to adopt the European Rail Traffic Management System/European Train Control System (ERTMS/ETCS) control system on local/regional lines to improve transport capacity. Among them, a communication system based on public networks (cellular & satellite) provides an interesting, effective and alternative solution to proprietary and expensive radio networks. To analyse performance of this solution, it is necessary to model the end-to-end delay and message loss to fully characterize the message transfer process from train to ground and vice versa. Starting from the results of a railway test campaign over a 300 km railway line for a cumulative 12,000 traveled km in 21 days, in this paper, we derive a statistical model for the end-to-end delay required for delivering messages. In particular, we propose a two-state model allowing for reproducing the main behavioral characteristics of the end-to-end delay as observed experimentally. Model formulation has been derived after deep analysis of the recorded experimental data. When it is applied to model a realistic scenario, it allows for explicitly accounting for radio coverage characteristics, the received power level, the handover points along the line and for the serving radio technology. As an example, the proposed model is used to generate the end-to-end delay profile in a realistic scenario.
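
    A minimal sketch of a two-state delay generator in the spirit described above: a Markov chain alternates between a good-coverage state with short delays and a degraded state with long delays. All transition probabilities and delay parameters below are invented, not the paper's fitted values.

```python
import random

def delay_profile(n_messages: int, p_good_to_bad=0.05, p_bad_to_good=0.30,
                  good_mean=0.4, bad_mean=3.0):
    """One end-to-end delay sample per message from a two-state Markov model:
    state 0 = good coverage (short exponential delays), state 1 = degraded (long delays)."""
    state, delays = 0, []
    for _ in range(n_messages):
        if state == 0 and random.random() < p_good_to_bad:
            state = 1
        elif state == 1 and random.random() < p_bad_to_good:
            state = 0
        mean = good_mean if state == 0 else bad_mean
        delays.append(random.expovariate(1.0 / mean))
    return delays

profile = delay_profile(10_000)
print(sum(profile) / len(profile))    # mean message delivery delay in seconds
```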

  3. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    This paper describes a wireless real-time communication system design using two Time Division Multiple Access (TDMA) protocols. Messages are subject to prioritization and queuing. For this interoperation scenario, we show a method for end-to-end configuration of protocols and queue sizes. Such co...

  4. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian

    2013-01-01

    We report on the observation of coupling a single nitrogen vacancy (NV) center in a nanodiamond crystal to a propagating plasmonic mode of silver nanowires. The nanocrystal is placed either near the apex of a single silver nanowire or in the gap between two end-to-end aligned silver nanowires. We...

  5. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    Science.gov (United States)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.

  6. End-to-end network models encompassing terrestrial, wireless, and satellite components

    Science.gov (United States)

    Boyarko, Chandler L.; Britton, John S.; Flores, Phil E.; Lambert, Charles B.; Pendzick, John M.; Ryan, Christopher M.; Shankman, Gordon L.; Williams, Ramon P.

    2004-08-01

    Development of network models that reflect true end-to-end architectures, such as the Transformational Communications Architecture, needs to encompass terrestrial, wireless and satellite components to truly represent all of the complexities in a worldwide communications network. Use of best-in-class tools including OPNET, Satellite Tool Kit (STK), Popkin System Architect and their well-known XML-friendly definitions, such as OPNET Modeler's Data Type Description (DTD), or socket-based data transfer modules, such as STK/Connect, enables the sharing of data between applications for more rapid development of end-to-end system architectures and a more complete system design. By sharing the results of and integrating best-in-class tools we are able to (1) promote sharing of data, (2) enhance the fidelity of our results and (3) allow network and application performance to be viewed in the context of the entire enterprise and its processes.

  7. Providing end-to-end QoS for multimedia applications in 3G wireless networks

    Science.gov (United States)

    Guo, Katherine; Rangarajan, Samapth; Siddiqui, M. A.; Paul, Sanjoy

    2003-11-01

    As the usage of wireless packet data services increases, wireless carriers today are faced with the challenge of offering multimedia applications with QoS requirements within current 3G data networks. End-to-end QoS requires support at the application, network, link and medium access control (MAC) layers. We discuss existing CDMA2000 network architecture and show its shortcomings that prevent supporting multiple classes of traffic at the Radio Access Network (RAN). We then propose changes in RAN within the standards framework that enable support for multiple traffic classes. In addition, we discuss how Session Initiation Protocol (SIP) can be augmented with QoS signaling for supporting end-to-end QoS. We also review state of the art scheduling algorithms at the base station and provide possible extensions to these algorithms to support different classes of traffic as well as different classes of users.

  8. Rectovaginal fistula following colectomy with an end-to-end anastomosis stapler for a colorectal adenocarcinoma.

    Science.gov (United States)

    Klein, A; Scotti, S; Hidalgo, A; Viateau, V; Fayolle, P; Moissonnier, P

    2006-12-01

    An 11-year-old, female neutered Labrador retriever was presented with a micro-invasive differentiated papillar adenocarcinoma at the colorectal junction. A colorectal end-to-end anastomosis stapler device was used to perform resection and anastomosis using a transanal technique. A rectovaginal fistula was diagnosed two days later. An exploratory laparotomy was conducted and the fistula was identified and closed. Early dehiscence of the colon was also suspected and another colorectal anastomosis was performed using a manual technique. Comparison to a conventional manual technique of intestinal surgery showed that the use of an automatic staple device was quicker and easier. To the authors' knowledge, this is the first report of a rectovaginal fistula occurring after end-to-end anastomosis stapler colorectal resection-anastomosis in the dog. To minimise the risk of this potential complication associated with the limited surgical visibility, adequate tissue retraction and inspection of the anastomosis site are essential.

  9. Development of a Dynamic, End-to-End Free Piston Stirling Convertor Model

    Science.gov (United States)

    Regan, Timothy F.; Gerber, Scott S.; Roth, Mary Ellen

    2003-01-01

    A dynamic model for a free-piston Stirling convertor is being developed at the NASA Glenn Research Center. The model is an end-to-end system model that includes the cycle thermodynamics, the dynamics, and electrical aspects of the system. The subsystems of interest are the heat source, the springs, the moving masses, the linear alternator, the controller and the end-user load. The envisioned use of the model will be in evaluating how changes in a subsystem could affect the operation of the convertor. The model under development will speed the evaluation of improvements to a subsystem and aid in determining areas in which most significant improvements may be found. One of the first uses of the end-to-end model will be in the development of controller architectures. Another related area is in evaluating changes to details in the linear alternator.
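
    A drastically reduced sketch of the mechanical/electrical coupling in such an end-to-end model: a driven piston (mass-spring-damper) whose motion induces an alternator EMF into a resistive load, integrated with forward Euler. Every parameter is invented, and the thermodynamic cycle is replaced by a sinusoidal forcing term; this is not the NASA Glenn model.

```python
import math

# Reduced free-piston/alternator sketch:
#   m*x'' = -k*x - c*x' - alpha*i + F_drive(t)
#   L*i'  =  alpha*x' - (R_coil + R_load)*i
m, k, c = 0.5, 3.0e4, 2.0           # piston mass (kg), spring rate (N/m), damping (N*s/m)
alpha = 30.0                         # alternator force/EMF constant
L, R_coil, R_load = 0.05, 1.0, 9.0   # coil inductance (H) and resistances (ohm)
F0, f_drive = 120.0, 80.0            # drive force amplitude (N) and frequency (Hz)

x, v, i = 0.0, 0.0, 0.0
dt, t_end = 1.0e-5, 0.5
power_sum, steps, t = 0.0, 0, 0.0

while t < t_end:
    F = F0 * math.sin(2.0 * math.pi * f_drive * t)   # stand-in for the heat-source forcing
    a = (-k * x - c * v - alpha * i + F) / m
    di = (alpha * v - (R_coil + R_load) * i) / L
    x, v, i = x + v * dt, v + a * dt, i + di * dt    # forward Euler step
    power_sum += R_load * i * i
    steps += 1
    t += dt

print("mean electrical power to the load:", power_sum / steps, "W")
```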

  10. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

    This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for dialogue POMDP model components. Starting from scratch, they present the state, the transition model, the observation model and then finally the reward model from unannotated and noisy dialogues. These altogether form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in a simple language, full of illustrative examples, figures, and tables. Provides insights on building dialogue systems to be applied in real domain Illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format Introduces an end-to-end approach that makes use of unannotated and noisy dialogue for learning each component of dialogue POM...

  11. Circular myotomy as an aid to resection and end-to-end anastomosis of the esophagus.

    Science.gov (United States)

    Attum, A A; Hankins, J R; Ngangana, J; McLaughlin, J S

    1979-08-01

    Segments ranging from 40 to 70% of the thoracic esophagus were resected in 80 mongrel dogs. End-to-end anastomosis was effected after circular myotomy either proximal or distal, or both proximal and distal, to the anastomosis. Among dogs undergoing resection of 60% of the esophagus, distal myotomy enabled 6 of 8 animals to survive, and combined proximal and distal myotomy permitted 8 of 10 to survive. Cineesophagography was performed in a majority of the 50 surviving animals and showed no appreciable delay of peristalsis at the myotomy sites. When these sites were examined at postmortem examination up to 13 months after operation, 1 dog showed a small diverticulum but none showed dilatation or stricture. It is concluded that circular myotomy holds real promise as a means of extending the clinical application of esophageal resection with end-to-end anastomosis.

  12. Financing the End-to-end Supply Chain: A Reference Guide to Supply Chain Finance

    OpenAIRE

    Templar, Simon; Hofmann, Erik; Findlay, Charles

    2016-01-01

    Financing the End to End Supply Chain provides readers with a real insight into the increasingly important area of supply chain finance. It demonstrates the importance of the strategic relationship between the physical supply of goods and services and the associated financial flows. The book provides a clear introduction, demonstrating the importance of the strategic relationship between supply chain and financial communities within an organization. It contains vital information on how supply...

  13. Testing Application (End-to-End) Performance of Networks With EFT Traffic

    Directory of Open Access Journals (Sweden)

    Vlatko Lipovac

    2009-01-01

    Full Text Available This paper studies how end-to-end application performance (of Electronic Financial Transaction traffic, in particular) depends on the actual protocol stacks, operating systems and network transmission rates. In this respect, simulation tests of the performance of the TCP and UDP protocols running on various operating systems, ranging from Windows and Sun Solaris to Linux, have been implemented, and the differences in performance addressed, focusing on throughput and response time.
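
    The paper's own simulation tests are not reproduced here, but the kind of end-to-end measurement it discusses can be sketched with a plain TCP throughput probe. The script below assumes a discard-style receiver is already listening on HOST:PORT (both placeholders), pushes a fixed amount of data, and reports the achieved throughput.

      # Minimal TCP throughput probe; HOST, PORT and the transfer size are assumptions.
      import socket
      import time

      HOST, PORT = "127.0.0.1", 5001
      CHUNK, TOTAL = 64 * 1024, 64 * 1024 * 1024   # 64 KiB writes, 64 MiB in total

      def measure_throughput():
          payload = b"\x00" * CHUNK
          sent = 0
          with socket.create_connection((HOST, PORT)) as s:
              start = time.perf_counter()
              while sent < TOTAL:
                  s.sendall(payload)
                  sent += CHUNK
              elapsed = time.perf_counter() - start
          return 8 * sent / elapsed / 1e6          # throughput in Mbit/s

      if __name__ == "__main__":
          print(f"throughput: {measure_throughput():.1f} Mbit/s")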

  14. Experimental evaluation of end-to-end delay in switched Ethernet application in the automotive domain

    OpenAIRE

    Beretis , Kostas; Symeonidis , Ieroklis

    2013-01-01

    International audience; This article presents an approach for deriving an upper bound for end-to-end delay in a double star switched Ethernet network. Four traffic classes, following a strict priority queuing policy, were considered. The theoretical analysis was based on network calculus. An experimental setup, which accurately reflects an automotive communication network, was implemented in order to evaluate the theoretical model. The results obtained by the experiments provided valuable feed...

  15. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    Science.gov (United States)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
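
    The core decision analyzed in the paper, namely whether an end host should attempt circuit setup at all, can be phrased as an expected-delay comparison. The expression below is only an illustrative form of that trade-off with simplified notation (P_b: call blocking probability, T_setup: circuit setup time, F: file size, r_c: circuit rate), not the exact delay model derived in the paper:

      \mathbb{E}[D_{\text{attempt}}]
        = (1 - P_b)\left(T_{\text{setup}} + \frac{F}{r_c}\right)
        + P_b\left(T_{\text{setup}} + \mathbb{E}[D_{\text{TCP}}]\right)

    Circuit setup is then worth attempting only when \mathbb{E}[D_{\text{attempt}}] < \mathbb{E}[D_{\text{TCP}}], where \mathbb{E}[D_{\text{TCP}}] itself depends on the packet loss probability, the round-trip time and the bottleneck link rate.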

  16. QoS Modeling for End-to-End Performance Evaluation over Networks with Wireless Access

    Directory of Open Access Journals (Sweden)

    Gómez Gerardo

    2010-01-01

    Full Text Available This paper presents an end-to-end Quality of Service (QoS) model for assessing the performance of data services over networks with wireless access. The proposed model deals with performance degradation across protocol layers using a bottom-up strategy, starting with the physical layer and moving up to the application layer. This approach makes it possible to analytically assess performance at different layers, thereby facilitating a possible end-to-end optimization process. As a representative case, a scenario where a set of mobile terminals is connected to a streaming server through an IP access node has been studied. UDP, TCP, and the new TCP-Friendly Rate Control (TFRC) protocols were analyzed at the transport layer. The radio interface consisted of a variable-rate multiuser and multichannel subsystem, including retransmissions and adaptive modulation and coding. The proposed analytical QoS model was validated on a real-time emulator of an end-to-end network with wireless access and proved to be very useful for the purposes of service performance estimation and optimization.
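
    In the same bottom-up spirit, a back-of-the-envelope calculation can propagate radio-layer parameters up to an application-level throughput estimate. The sketch below uses assumed numbers and the well-known Mathis approximation for TCP throughput; it is not the paper's analytical model.

      # Bottom-up throughput estimate with assumed numbers (not the paper's model).
      phy_rate = 10e6                   # radio-layer rate after adaptive modulation and coding (bit/s)
      residual_bler = 0.10              # residual block error rate after link-layer retransmissions
      mss_bits, rtt = 1460 * 8, 0.080   # TCP segment size (bits) and round-trip time (s)
      loss = 0.01                       # end-to-end packet loss probability seen by TCP

      link_goodput = phy_rate * (1 - residual_bler)
      tcp_limit = (mss_bits / rtt) * (1.22 / loss ** 0.5)   # Mathis et al. approximation
      app_throughput = min(link_goodput, tcp_limit)
      print(f"estimated application-level throughput: {app_throughput / 1e6:.2f} Mbit/s")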

  17. Analysis of the relationship between end-to-end distance and activity of single-chain antibody against colorectal carcinoma.

    Science.gov (United States)

    Zhang, Jianhua; Liu, Shanhong; Shang, Zhigang; Shi, Li; Yun, Jun

    2012-08-22

    We investigated the relationship between the end-to-end distance of VH and VL with different peptide linkers and the activity of single-chain antibodies by computer-aided simulation. First, we used (G4S)n (where n = 1-9) as the linker to connect VH and VL, and estimated the 3D structure of the single-chain Fv antibody (scFv) by homology modeling. After the molecular models were evaluated and optimized, the coordinate system of every protein was built and unified into one coordinate system, and end-to-end distances were calculated using the 3D space coordinates. After expression and purification of scFv-n with (G4S)n for n = 1, 3, 5, 7 or 9, the immunoreactivity of purified ND-1 scFv-n was determined by ELISA. A multi-factorial relationship model was employed to analyze the structural factors affecting scFv: r_n = (AB_n - AB_O)^2 + (CD_n - CD_O)^2 + (BC_n - BC_st)^2. The relationship between immunoreactivity and r-values revealed that the fusion protein structure approached the desired state when the r-value = 3. The immunoreactivity declined as the r-value increased, but when the r-value exceeded a certain threshold, it stabilized. We used a linear relationship to analyze the structural factors affecting scFv immunoreactivity.
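
    Once all models share a single coordinate system, the end-to-end distance itself is a plain Euclidean norm between the two terminal points. The snippet below illustrates that bookkeeping with invented placeholder coordinates, not the modelled scFv structures.

      # End-to-end distance between two points in a unified 3D coordinate system.
      # The coordinates are hypothetical placeholders.
      import numpy as np

      def end_to_end_distance(p_start, p_end):
          return float(np.linalg.norm(np.asarray(p_end) - np.asarray(p_start)))

      vh_c_terminus = [12.4, 3.1, -7.8]    # hypothetical VH C-terminal coordinates (angstrom)
      vl_n_terminus = [18.9, -1.2, -3.5]   # hypothetical VL N-terminal coordinates (angstrom)
      print(f"linker end-to-end distance: {end_to_end_distance(vh_c_terminus, vl_n_terminus):.2f} A")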

  18. Weighted-DESYNC and Its Application to End-to-End Throughput Fairness in Wireless Multihop Network

    Directory of Open Access Journals (Sweden)

    Ui-Seong Yu

    2017-01-01

    Full Text Available The end-to-end throughput of a routing path in a wireless multihop network is restricted by a bottleneck node that has the smallest bandwidth among the nodes on the routing path. In this study, we propose a method for resolving the bottleneck-node problem in multihop networks, which is based on the multihop DESYNC (MH-DESYNC) algorithm, a bioinspired resource allocation method developed for use in multihop environments that enables fair resource allocation among nearby (up to two hops) neighbors. Based on MH-DESYNC, we newly propose weighted-DESYNC (W-DESYNC) as a tool to artificially control the amount of resource allocated to a specific user and thus to achieve throughput fairness over a routing path. The proposed W-DESYNC employs the weight factor of a link to determine the amount of bandwidth allocated to a node. By letting the weight factor be the link quality of a routing path and making it the same across a routing path via the Cucker-Smale flocking model, we can obtain throughput fairness over a routing path. The simulation results show that the proposed algorithm achieves throughput fairness over a routing path and can increase the total end-to-end throughput in wireless multihop networks.

  19. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra.

    Science.gov (United States)

    Hussain, Akbar; Pansota, Mudassar Saeed; Rasool, Mumtaz; Tabassum, Shafqat Ali; Ahmad, Iftikhar; Saleem, Muhammad Shahzad

    2013-04-01

    To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of posterior urethra at our setup. Case series. Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Adult patients with completely obliterated post-traumatic stricture of posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in size or patients of stricture with neurogenic bladder and patients with any perineal disease were excluded from the study. Retrograde urethrogram and voiding cysto-urethrogram was done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of urethra with spatulation was performed in every patient. Minimum followup period was 6 months and maximum 18 months. There were 26 cases with road traffic accident (indirect) and 14 had history of fall/direct trauma to perineum or urethra. Majority of the patients (57.5%) were between 16 to 30 years of age. Twelve (30.0%) patients developed complications postoperatively. Early complications of wound infection occurred in 01 (2.5%) patient. Late complications were seen in 11 (27.5%) patients i.e. stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), urethrocutaneous fistula and urinary incontinence in one patient (2.5%) each. Success rate was 70.0% initially and 87.5% overall. Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures with success rate of about 87.5%.

  20. Reversible end-to-end assembly of gold nanorods using a disulfide-modified polypeptide

    International Nuclear Information System (INIS)

    Walker, David A; Gupta, Vinay K

    2008-01-01

    Directing the self-assembly of colloidal particles into nanostructures is of great interest in nanotechnology. Here, reversible end-to-end assembly of gold nanorods (GNR) is induced by pH-dependent changes in the secondary conformation of a disulfide-modified poly(L-glutamic acid) (SSPLGA). The disulfide anchoring group drives chemisorption of the polyacid onto the end of the gold nanorods in an ethanolic solution. A layer of poly(vinyl pyrrolidone) is adsorbed on the positively charged, surfactant-stabilized GNR to screen the surfactant bilayer charge and provide stability for dispersion of the GNR in ethanol. For comparison, irreversible end-to-end assembly using a bidentate ligand, namely 1,6-hexanedithiol, is also performed. Characterization of the modified GNR and its end-to-end linking behavior using SSPLGA and hexanedithiol is performed using dynamic light scattering (DLS), UV-vis absorption spectroscopy and transmission electron microscopy (TEM). Experimental results show that, in a colloidal solution of GNR-SSPLGA at a pH∼3.5, where the PLGA is in an α-helical conformation, the modified GNR self-assemble into one-dimensional nanostructures. The linking behavior can be reversed by increasing the pH (>8.5) to drive the conformation of the polypeptide to a random coil and this reversal with pH occurs rapidly within minutes. Cycling the pH multiple times between low and high pH values can be used to drive the formation of the nanostructures of the GNR and disperse them in solution.

  1. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra

    International Nuclear Information System (INIS)

    Hussain, A.; Pansota, M. S.; Rasool, M.; Tabassum, S. A.; Ahmad, I.; Saleem, M. S.

    2013-01-01

    Objective: To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of posterior urethra at our setup. Study Design: Case series. Place and Duration of Study: Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/ Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Methodology: Adult patients with completely obliterated post-traumatic stricture of posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in size or patients of stricture with neurogenic bladder and patients with any perineal disease were excluded from the study. Retrograde urethrogram and voiding cysto-urethrogram was done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of urethra with spatulation was performed in every patient. Minimum followup period was 6 months and maximum 18 months. Results: There were 26 cases with road traffic accident (indirect) and 14 had history of fall/direct trauma to perineum or urethra. Majority of the patients (57.5%) were between 16 to 30 years of age. Twelve (30.0%) patients developed complications postoperatively. Early complications of wound infection occurred in 01 (2.5%) patient. Late complications were seen in 11 (27.5%) patients i.e. stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), urethrocutaneous fistula and urinary incontinence in one patient (2.5%) each. Success rate was 70.0% initially and 87.5% overall. Conclusion: Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures with success rate of about 87.5%. (author)

  2. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, A.; Pansota, M. S.; Rasool, M.; Tabassum, S. A.; Ahmad, I.; Saleem, M. S. [Bahawal Victoria Hospital, Bahawalpur (Pakistan). Dept. of Urology

    2013-04-15

    Objective: To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of posterior urethra at our setup. Study Design: Case series. Place and Duration of Study: Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/ Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Methodology: Adult patients with completely obliterated post-traumatic stricture of posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in size or patients of stricture with neurogenic bladder and patients with any perineal disease were excluded from the study. Retrograde urethrogram and voiding cysto-urethrogram was done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of urethra with spatulation was performed in every patient. Minimum followup period was 6 months and maximum 18 months. Results: There were 26 cases with road traffic accident (indirect) and 14 had history of fall/direct trauma to perineum or urethra. Majority of the patients (57.5%) were between 16 to 30 years of age. Twelve (30.0%) patients developed complications postoperatively. Early complications of wound infection occurred in 01 (2.5%) patient. Late complications were seen in 11 (27.5%) patients i.e. stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), urethrocutaneous fistula and urinary incontinence in one patient (2.5%) each. Success rate was 70.0% initially and 87.5% overall. Conclusion: Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures with success rate of about 87.5%. (author)

  3. Optimizing End-to-End Big Data Transfers over Terabits Network Infrastructure

    International Nuclear Information System (INIS)

    Kim, Youngjae; Vallee, Geoffroy R.; Lee, Sangkeun; Shipman, Galen M.

    2016-01-01

    While future terabit networks hold the promise of significantly improving big-data motion among geographically distributed data centers, significant challenges must be overcome even on today's 100 gigabit networks to realize end-to-end performance. Multiple bottlenecks exist along the end-to-end path from source to sink, for instance, the data storage infrastructure at both the source and sink and its interplay with the wide-area network are increasingly the bottleneck to achieving high performance. In this study, we identify the issues that lead to congestion on the path of an end-to-end data transfer in the terabit network environment, and we present a new bulk data movement framework for terabit networks, called LADS. LADS exploits the underlying storage layout at each endpoint to maximize throughput without negatively impacting the performance of shared storage resources for other users. LADS also uses the Common Communication Interface (CCI) in lieu of the sockets interface to benefit from hardware-level zero-copy, and operating system bypass capabilities when available. It can further improve data transfer performance under congestion on the end systems using buffering at the source using flash storage. With our evaluations, we show that LADS can avoid congested storage elements within the shared storage resource, improving input/output bandwidth, and data transfer rates across the high speed networks. We also investigate the performance degradation problems of LADS due to I/O contention on the parallel file system (PFS), when multiple LADS tools share the PFS. We design and evaluate a meta-scheduler to coordinate multiple I/O streams while sharing the PFS, to minimize the I/O contention on the PFS. Finally, with our evaluations, we observe that LADS with meta-scheduling can further improve the performance by up to 14 percent relative to LADS without meta-scheduling.

  4. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

    WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service focuses on the interdisciplinary subject of advanced Security and Quality of Service (QoS) in WiMAX wireless telecommunication systems, including its models, standards, implementations, and applications. Split into four parts, Part A of the book is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  5. An overview of recent end-to-end wireless medical video telemedicine systems using 3G.

    Science.gov (United States)

    Panayides, A; Pattichis, M S; Pattichis, C S; Schizas, C N; Spanias, A; Kyriacou, E

    2010-01-01

    Advances in video compression, network technologies, and computer technologies have contributed to the rapid growth of mobile health (m-health) systems and services. Wide deployment of such systems and services is expected in the near future, and it is foreseen that they will soon be incorporated in daily clinical practice. This study focuses on describing the basic components of an end-to-end wireless medical video telemedicine system, providing a brief overview of the recent advances in the field, while it also highlights future trends in the design of telemedicine systems that are diagnostically driven.

  6. Wiretapping End-to-End Encrypted VoIP Calls: Real-World Attacks on ZRTP

    Directory of Open Access Journals (Sweden)

    Schürmann Dominik

    2017-07-01

    Full Text Available Voice calls are still one of the most common use cases for smartphones. Often, sensitive personal information but also confidential business information is shared. End-to-end security is required to protect against wiretapping of voice calls. For such real-time communication, the ZRTP key-agreement protocol has been proposed. By verbally comparing a small number of on-screen characters or words, called Short Authentication Strings, the participants can be sure that no one is wiretapping the call. Since 2011, ZRTP has been an IETF standard implemented in several VoIP clients.

  7. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

    This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source oriented, or end to end, congestion control algorithms. The book empowers readers with clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and fairness and presents real-world app

  8. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark,Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  9. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

    In this paper, a design for the multiple-input-multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. Firstly, the overall average packet error rate is analyzed by considering the channel state information feedback delay and the multiuser scheduling. Then, we provide corresponding numerical results to measure the performance evaluation for several separate scenarios, which presents a convenient tool for the cognitive radio network design with multiple secondary MIMO users. © 2011 IEEE.

  10. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli; Aissa, Sonia

    2011-01-01

    In this paper, a design for the multiple-input-multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. Firstly, the overall average packet error rate is analyzed by considering the channel state information feedback delay and the multiuser scheduling. Then, we provide corresponding numerical results to measure the performance evaluation for several separate scenarios, which presents a convenient tool for the cognitive radio network design with multiple secondary MIMO users. © 2011 IEEE.

  11. Increasing operations profitability using an end-to-end, wireless internet, gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M. [Northrock Resources Ltd., AB (Canada); Benterud, K. [zed.i solutions, inc., Calgary, AB (Canada)

    2004-10-01

    Implementation by Northrock Resources Ltd., a wholly-owned subsidiary of Unocal Corporation, of a fully integrated end-to-end gas measurement and production analysis system is discussed. The system, dubbed Smart-Alek(TM), utilizes public wireless communications and a web-browser-only delivery system to provide seamless well visibility to a desktop computer. Smart-Alek(TM) is an example of a new type of end-to-end electronic gas flow measurement system, known as FINE(TM), an acronym for Field Intelligence Network and End-User Interface. The system delivers easy-to-use, complete, reliable and cost-effective production information far more effectively than is possible with conventional SCADA technology. By installing the system, Northrock was able to increase gas volumes with more accurate electronic flow measurement in place of mechanical charts, with very low technical maintenance, and at a reduced operating cost. It is emphasized that deploying the technology alone will produce only partial benefits; to realize full benefits it is also essential to change grass-roots operating practices, aiming at timely decision-making at the field level. 5 refs., 5 figs.

  12. An End-to-End Model of Plant Pheromone Channel for Long Range Molecular Communication.

    Science.gov (United States)

    Unluturk, Bige D; Akyildiz, Ian F

    2017-01-01

    A new track in molecular communication is using pheromones, which can scale up the range of diffusion-based communication from micrometers to meters and enable new applications requiring long range. Pheromone communication is the emission of molecules into the air which trigger behavioral or physiological responses in receiving organisms. The objective of this paper is to introduce a new end-to-end model which incorporates pheromone behavior with communication theory for plants. The proposed model includes both the transmission and reception processes as well as the propagation channel. The transmission process is the emission of pheromones from the leaves of plants. The dispersion of pheromones by the flow of wind constitutes the propagation process. The reception process is the sensing of pheromones by the pheromone receptors of plants. The major difference of pheromone communication from other molecular communication techniques is the dispersion channel acting under the laws of turbulent diffusion. In this paper, the pheromone channel is modeled as a Gaussian puff, i.e., a cloud of pheromone released instantaneously from the source whose dispersion follows a Gaussian distribution. Numerical results on the performance of the overall end-to-end pheromone channel in terms of normalized gain and delay are provided.
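
    For reference, the generic Gaussian-puff concentration profile on which such a channel model builds has the form below (Q: released pheromone mass, u: mean wind speed, σ_x, σ_y, σ_z: puff spreads growing with travel time); this is the textbook expression, not necessarily the exact formulation used in the paper:

      C(x, y, z, t) \;=\; \frac{Q}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}
        \exp\!\left( -\frac{(x - u t)^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2} \right)

    A ground-reflection term is often added when the release height is comparable to the vertical spread.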

  13. End-to-End Airplane Detection Using Transfer Learning in Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhong Chen

    2018-01-01

    Full Text Available Airplane detection in remote sensing images remains a challenging problem due to the complexity of backgrounds. In recent years, with the development of deep learning, object detection has also obtained great breakthroughs. For object detection tasks in natural images, such as the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) VOC (Visual Object Classes) Challenge, the major trend of current development is to use a large amount of labeled classification data to pre-train the deep neural network as a base network, and then use a small amount of annotated detection data to fine-tune the network for detection. In this paper, we use object detection technology based on deep learning for airplane detection in remote sensing images. In addition to using some characteristics of remote sensing images, some new data augmentation techniques have been proposed. We also use transfer learning and adopt a single deep convolutional neural network and limited training samples to implement end-to-end trainable airplane detection. Classification and positioning are no longer divided into multistage tasks; end-to-end detection attempts to combine them for optimization, which ensures an optimal solution for the final stage. In our experiment, we use remote sensing images of airports collected from Google Earth. The experimental results show that the proposed algorithm is highly accurate and meaningful for remote sensing object detection.
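
    The pre-train-then-fine-tune recipe described above can be sketched in a few lines of PyTorch. The code below is a hedged stand-in: it fine-tunes a two-class (airplane/background) head on a frozen ImageNet-pretrained ResNet-50 backbone, whereas the paper trains a full end-to-end detection network; the backbone choice and hyperparameters are assumptions.

      # Transfer-learning sketch: frozen pretrained backbone, small fine-tuned head.
      import torch
      import torch.nn as nn
      import torchvision

      backbone = torchvision.models.resnet50(weights="DEFAULT")   # pretrained on ImageNet classification
      for p in backbone.parameters():
          p.requires_grad = False                                  # keep pre-trained features fixed

      backbone.fc = nn.Linear(backbone.fc.in_features, 2)          # airplane vs. background head
      optimizer = torch.optim.SGD(backbone.fc.parameters(), lr=1e-3, momentum=0.9)
      criterion = nn.CrossEntropyLoss()

      def fine_tune_step(images, labels):
          """One optimisation step on a batch of annotated remote sensing chips."""
          optimizer.zero_grad()
          loss = criterion(backbone(images), labels)
          loss.backward()
          optimizer.step()
          return loss.item()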

  14. End to end distribution functions for a class of polymer models

    International Nuclear Information System (INIS)

    Khandekar, D.C.; Wiegel, F.W.

    1988-01-01

    The two-point end-to-end distribution functions for a class of polymer models have been obtained within the first cumulant approximation. The trial distribution function for this purpose is chosen to correspond to a general non-local quadratic functional. An exact expression for the trial distribution function is obtained. It is pointed out that these trial distribution functions themselves can be used to study certain aspects of the configurational behaviour of polymers. These distribution functions are also used to obtain the averaged mean square size ⟨R²⟩ of a polymer characterized by the non-local quadratic potential energy functional. Finally, we derive an analytic expression for ⟨R²⟩ of a polyelectrolyte model and show that for a long polymer a weak electrostatic interaction does not change the behaviour of ⟨R²⟩ from that of a free polymer. (author). 16 refs
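
    The free-polymer baseline invoked in the last sentence is the standard Gaussian chain result for N statistical segments of length b (a textbook expression, not one of the paper's first-cumulant formulas):

      P(\mathbf{R}) \;=\; \left( \frac{3}{2\pi N b^2} \right)^{3/2}
        \exp\!\left( -\frac{3 R^2}{2 N b^2} \right),
      \qquad \langle R^2 \rangle \;=\; N b^2 .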

  15. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  16. An end-to-end secure patient information access card system.

    Science.gov (United States)

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

    The rapid development of the Internet and the increasing interest in Internet-based solutions has promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems from a data carrier to an access key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project where patient databases are accessed via the Internet. Strong end-to-end data encryption is performed via Secure Sockets Layer, transparently, to transmit patient information. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.

  17. End-to-end operations at the National Radio Astronomy Observatory

    Science.gov (United States)

    Radziwill, Nicole M.

    2008-07-01

    In 2006 NRAO launched a formal organization, the Office of End to End Operations (OEO), to broaden access to its instruments (VLA/EVLA, VLBA, GBT and ALMA) in the most cost-effective ways possible. The VLA, VLBA and GBT are mature instruments, and the EVLA and ALMA are currently under construction, which presents unique challenges for integrating software across the Observatory. This article 1) provides a survey of the new developments over the past year, and those planned for the next year, 2) describes the business model used to deliver many of these services, and 3) discusses the management models being applied to ensure continuous innovation in operations, while preserving the flexibility and autonomy of telescope software development groups.

  18. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantify its advantages for different operating scenarios and conditions. © 2011 IEEE.
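
    The usual way such a two-hop decode-and-forward link is summarised is through its equivalent end-to-end signal-to-noise ratio, with partial relay selection acting on the first hop and the spectrum-sharing constraint capping the secondary transmit power. The expressions below are the standard textbook forms, given only as orientation and not as the paper's exact formulation:

      \gamma_{\text{e2e}} = \min\left(\gamma_{SR}, \gamma_{RD}\right),
      \qquad k^{*} = \arg\max_{k \in \{1,\dots,L\}} \gamma_{S R_k},
      \qquad P_s \le \frac{Q}{|h_{sp}|^{2}},

    where Q denotes the tolerable interference power at the primary receiver and h_sp the secondary-to-primary channel gain.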

  19. The role of sea ports in end-to-end maritime transport chain emissions

    International Nuclear Information System (INIS)

    Gibbs, David; Rigot-Muller, Patrick; Mangan, John; Lalwani, Chandra

    2014-01-01

    This paper's purpose is to investigate the role of sea ports in helping to mitigate the GHG emissions associated with the end-to-end maritime transport chain. The analysis is primarily focused on the UK, but is international in application. The paper is based on both the analysis of secondary data and information on actions taken by ports to reduce their emissions, with the latter data collected for the main UK ports via their published reports and/or via interviews. Only a small number of ports (representing 32% of UK port activity) actually measure and report their carbon emissions in the UK context. The emissions generated by ships calling at these ports are analysed using a method based on Department for Transport Maritime Statistics Data. In addition, a case example (Felixstowe) of emissions associated with HGV movements to and from ports is presented, and data on vessel emissions at berth are also considered. Our analyses indicate that emissions generated by ships during their voyages between ports are of a far greater magnitude than those generated by the port activities. Thus while reducing the ports' own emissions is worthwhile, the results suggest that ports might have more impact through focusing their efforts on reducing shipping emissions. - Highlights: • Investigates role of ports in mitigating GHG emissions in the end-to-end maritime transport chain. • Emissions generated both by ports and by ships calling at ports are analysed. • Shipping's emissions are far greater than those generated by port activities. • Ports may have more impact through focusing efforts on reducing shipping's emissions. • Options for ports to support and drive change in the maritime sector also considered

  20. Kinetics of end-to-end collision in short single-stranded nucleic acids.

    Science.gov (United States)

    Wang, Xiaojuan; Nau, Werner M

    2004-01-28

    A novel fluorescence-based method, which entails contact quenching of the long-lived fluorescent state of 2,3-diazabicyclo[2.2.2]oct-2-ene (DBO), was employed to measure the kinetics of end-to-end collision in short single-stranded oligodeoxyribonucleotides of the type 5'-DBO-(X)n-dG with X = dA, dC, dT, or dU and n = 2 or 4. The fluorophore was covalently attached to the 5' end and dG was introduced as an efficient intrinsic quencher at the 3' terminus. The end-to-end collision rates, which can be directly related to the efficiency of intramolecular fluorescence quenching, ranged from 0.1 to 9.0 x 10^6 s^-1. They were strongly dependent on the strand length, the base sequence, as well as the temperature. Oligonucleotides containing dA in the backbone displayed much slower collision rates and significantly higher positive activation energies than strands composed of pyrimidine bases, suggesting a higher intrinsic rigidity of oligoadenylate. Comparison of the measured collision rates in short single-stranded oligodeoxyribonucleotides with the previously reported kinetics of hairpin formation indicates that the intramolecular collision is significantly faster than the nucleation step of hairpin closing. This is consistent with the configurational diffusion model suggested by Ansari et al. (Ansari, A.; Kuznetsov, S. V.; Shen, Y. Proc. Natl. Acad. Sci. USA 2001, 98, 7771-7776), in which the formation of misfolded loops is thought to slow hairpin formation.
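
    In lifetime-based contact-quenching experiments of this kind, the end-to-end collision rate is commonly extracted from the shortening of the excited-state lifetime; whether the authors use exactly this relation is an assumption, but the generic form is

      k_{\text{coll}} \;\approx\; \frac{1}{\tau_{\text{DBO--dG}}} - \frac{1}{\tau_{0}},

    where \tau_{\text{DBO--dG}} is the fluorescence lifetime of the quencher-bearing strand and \tau_{0} that of a reference strand without the dG quencher.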

  1. SciBox, an end-to-end automated science planning and commanding system

    Science.gov (United States)

    Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott

    2014-01-01

    SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.

  2. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, W.

    2000-02-22

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at Laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15) or exabytes (10^18) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at Universities and Institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results

  3. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    International Nuclear Information System (INIS)

    Matthews, W.

    2000-01-01

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at Laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15) or exabytes (10^18) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at Universities and Institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results from each project

  4. MO-B-BRB-04: 3D Dosimetry in End-To-End Dosimetry QA

    Energy Technology Data Exchange (ETDEWEB)

    Ibbott, G. [UT MD Anderson Cancer Center (United States)

    2016-06-15

    irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang: Special techniques in the clinic and research Understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and Observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators Geoffrey S. Ibbott: 3D Dosimetry in end-to-end dosimetry QA Understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  5. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

    Automatic affect recognition is a challenging task due to the various modalities emotions can be expressed with. Applications can be found in many domains, including multimedia retrieval and human-computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this purpose, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, a machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where - by also taking advantage of the correlations of each of the streams - we manage to significantly outperform traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.
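
    A compact way to picture the architecture described above is a speech CNN and a visual ResNet feeding a shared LSTM. The sketch below is only an outline with assumed layer sizes and a naive concatenation fusion; it is not the authors' network or training setup.

      # Audio CNN + visual ResNet + LSTM fusion sketch (toy dimensions).
      import torch
      import torch.nn as nn
      import torchvision

      class AVEmotionNet(nn.Module):
          def __init__(self, hidden=128):
              super().__init__()
              self.audio_cnn = nn.Sequential(                  # 1-D CNN over raw speech chunks
                  nn.Conv1d(1, 32, kernel_size=8, stride=2), nn.ReLU(),
                  nn.AdaptiveAvgPool1d(1))
              resnet = torchvision.models.resnet50(weights=None)
              self.visual_cnn = nn.Sequential(*list(resnet.children())[:-1])  # 2048-d frame features
              self.lstm = nn.LSTM(32 + 2048, hidden, batch_first=True)        # temporal context
              self.head = nn.Linear(hidden, 2)                 # e.g. arousal and valence

          def forward(self, audio, frames):
              # audio: (B, T, samples); frames: (B, T, 3, 224, 224)
              B, T = audio.shape[:2]
              a = self.audio_cnn(audio.reshape(B * T, 1, -1)).squeeze(-1)
              v = self.visual_cnn(frames.reshape(B * T, 3, 224, 224)).flatten(1)
              seq = torch.cat([a, v], dim=1).reshape(B, T, -1)
              out, _ = self.lstm(seq)
              return self.head(out)                            # per-frame predictions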

  6. End-to-End Neural Optical Music Recognition of Monophonic Scores

    Directory of Open Access Journals (Sweden)

    Jorge Calvo-Zaragoza

    2018-04-01

    Full Text Available Optical Music Recognition is a field of research that investigates how to computationally decode music notation from images. Despite the efforts made so far, there are hardly any complete solutions to the problem. In this work, we study the use of neural networks that work in an end-to-end manner. This is achieved by using a neural model that combines the capabilities of convolutional neural networks, which work on the input image, and recurrent neural networks, which deal with the sequential nature of the problem. Thanks to the use of the so-called Connectionist Temporal Classification loss function, these models can be directly trained from input images accompanied by their corresponding transcripts into music symbol sequences. We also present the Printed Music Scores dataset, containing more than 80,000 monodic single-staff real scores in common western notation, which is used to train and evaluate the neural approach. In our experiments, it is demonstrated that this formulation can be carried out successfully. Additionally, we study several considerations about the encoding of the output musical sequences, the convergence and scalability of the neural models, as well as the ability of this approach to locate symbols in the input score.
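
    The training objective behind such an end-to-end formulation is the CTC loss applied to the per-frame symbol probabilities. The fragment below sketches that step with a toy recurrent stage and invented dimensions (vocabulary size, sequence lengths); it is not the authors' network or dataset pipeline.

      # CRNN output + CTC loss sketch with toy tensors standing in for real data.
      import torch
      import torch.nn as nn

      vocab = 100                              # assumed number of music-symbol classes
      frames, feat = 128, 256                  # frame count and per-frame feature size from the CNN stage

      rnn = nn.LSTM(feat, 128, batch_first=True, bidirectional=True)
      classifier = nn.Linear(256, vocab + 1)   # +1 output for the CTC blank label
      ctc = nn.CTCLoss(blank=vocab)

      features = torch.randn(2, frames, feat)        # stand-in for CNN features over two staff images
      targets = torch.randint(0, vocab, (2, 20))     # stand-in ground-truth symbol sequences
      logits, _ = rnn(features)
      log_probs = classifier(logits).log_softmax(-1).transpose(0, 1)   # (T, B, C) as CTCLoss expects
      loss = ctc(log_probs, targets,
                 torch.full((2,), frames, dtype=torch.long),
                 torch.full((2,), 20, dtype=torch.long))
      loss.backward()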

  7. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Directory of Open Access Journals (Sweden)

    Luis Gutierrez-Heredia

    Full Text Available Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.

  8. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Science.gov (United States)

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.

  9. Mechanics of spatulated end-to-end artery-to-vein anastomoses.

    Science.gov (United States)

    Morasch, M D; Dobrin, P B; Dong, Q S; Mrkvicka, R

    1998-01-01

    It previously has been shown that in straight end-to-end artery-to-vein anastomoses, maximum dimensions are obtained with an interrupted suture line. Nearly equivalent dimensions are obtained with a continuous compliant polybutester suture (Novafil), and the smallest dimensions are obtained with a continuous noncompliant polypropylene suture (Surgilene). The present study was undertaken to examine these suture techniques in a spatulated or beveled anastomosis in living dogs. Anastomoses were constructed using continuous 6-0 polypropylene (Surgilene), continuous 6-0 polybutester (Novafil), or interrupted 6-0 polypropylene or polybutester. Thirty minutes after construction, the artery, vein, and beveled anastomoses were excised, restored to in situ length and pressurized with the lumen filled with a dilute suspension of barium sulfate. High resolution radiographs were obtained at 25 mmHg pressure increments up to 200 mmHg. Dimensions and compliance were determined from the radiographic images. Results showed that, unlike straight artery-to-vein anastomoses, there were no differences in the dimensions or compliance of spatulated anastomoses with continuous Surgilene, continuous Novafil, or interrupted suture techniques. Therefore a continuous suture technique is acceptable when constructing spatulated artery-to-vein anastomoses in patients.

  10. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Full Text Available Software-defined networks (SDN) are a novel paradigm of networking which has become an enabler technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. Consequently, there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model where reasoning is possible about confidentiality, and we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.

  11. Circumferential resection and "Z"-shape plastic end-to-end anastomosis of canine trachea.

    Science.gov (United States)

    Zhao, H; Li, Z; Fang, J; Fang, C

    1999-03-01

    To prevent anastomotic stricture of the trachea. Forty young mongrel dogs, weighing 5-7 kg, were randomly divided into two groups, an experimental group and a control group, with 20 dogs in each group. Four tracheal rings were removed from each dog. In the experimental group, two "Z"-shape tracheoplastic anastomoses were performed on each dog, one on the anterior wall and the other on the membranous part of the trachea. In the control group, each dog received only a simple end-to-end anastomosis. Vicryl 3-0 absorbable suture and OB fibrin glue were used for both groups. All dogs were killed when their body weight doubled. The average sagittal stenotic ratios were 1.20 +/- 0.12 for the experimental group and 0.83 +/- 0.05 for the control group. The average cross-sectional area stenotic ratios were 0.90 +/- 0.12 and 0.69 +/- 0.09, and T values were 8.71 and 4.57 for the two groups (P < ...). The "Z"-shape tracheoplastic anastomosis was thus superior to simple end-to-end anastomosis in preventing anastomotic stricture of the canine trachea.

  12. Mucociliary clearance following tracheal resection and end-to-end anastomosis.

    Science.gov (United States)

    Toomes, H; Linder, A

    1989-10-01

    Mucociliary clearance is an important cleaning system of the bronchial tree. The complex transport system reacts sensitively to medicinal stimuli and inhaled substances. A disturbance causes secretion retention which encourages the development of acute and chronic pulmonary diseases. It is not yet known in which way sectional resection of the central airway effects mucociliary clearance. A large number of the surgical failures are attributable to septic complications in the area of the anastomosis. In order to study the transportation process over the anastomosis, ten dogs underwent a tracheal resection with end-to-end anastomosis, and the mucociliary activity was recorded using a bronchoscopic video-technical method. Recommencement of mucous transport was observed on the third, and transport over the anastomosis from the sixth to tenth, postoperative days. The mucociliary clearance had completely recovered on the twenty-first day in the majority of dogs. Histological examination of the anastomoses nine months postoperatively showed a flat substitute epithelium without cilia-bearing cells in all dogs. This contrasts with the quick restitution of the transport function. In case of undamaged respiratory mucosa, a good adaptation of the resection margins suffices for the mucous film to slide over the anastomosis.

  13. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California]; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like smart power grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing from event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundary that exists between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  14. Telomere dynamics, end-to-end fusions and telomerase activation during the human fibroblast immortalization process.

    Science.gov (United States)

    Ducray, C; Pommier, J P; Martins, L; Boussin, F D; Sabatier, L

    1999-07-22

    Loss of telomeric repeats during cell proliferation could play a role in senescence. It has generally been assumed that activation of telomerase prevents further telomere shortening and is essential for cell immortalization. In this study, we performed a detailed cytogenetic and molecular characterization of four SV40-transformed human fibroblastic cell lines by regularly monitoring the size distribution of terminal restriction fragments (TRF), telomerase activity and the associated chromosomal instability throughout immortalization. The mean TRF lengths progressively decreased in pre-crisis cells during the lifespan of the cultures. At crisis, telomeres reached a critical size, different among the cell lines, contributing to the peak of dicentric chromosomes, which resulted mostly from telomeric associations. We observed a direct correlation between short telomere length at crisis and chromosomal instability. In two immortal cell lines, although telomerase was detected, mean telomere length continued to decrease whereas the number of associated dicentric chromosomes stabilized. Thus telomerase could specifically protect telomeres that have reached a critical size against end-to-end fusion, while long telomeres continue to shorten, although at a slower rate than before crisis. This suggests a balance between elongation by telomerase and telomere shortening, towards a stabilized 'optimal' length.

  15. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    Science.gov (United States)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High-speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing delays cause performance degradation. This requires testers (the persons who run the tests) to know precisely how long a packet is held by the various network nodes. Without any tool's help, this task is time-consuming and error-prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. The recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm that does not drop packets.
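
    A minimal sketch of the holding-duration calculation described above (the two-point setup and field names are illustrative, not the tool's actual format): given headers recorded with synchronized timestamps before and after a node, the per-packet holding time is simply the difference of the two timestamps.

        # Compute per-packet holding durations from two synchronized capture points.
        def holding_durations(capture_in, capture_out):
            """capture_in / capture_out: lists of (packet_id, timestamp_s) recorded
            at the observation points before and after the node under test."""
            t_out_by_id = dict(capture_out)
            durations = {}
            for packet_id, t_in in capture_in:
                t_out = t_out_by_id.get(packet_id)
                if t_out is not None:            # skip packets dropped inside the node
                    durations[packet_id] = t_out - t_in
            return durations

        before = [("pkt1", 10.0001), ("pkt2", 10.0005)]
        after = [("pkt1", 10.0031), ("pkt2", 10.0042)]
        print(holding_durations(before, after))   # roughly 3.0 ms and 3.7 ms per packet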

  16. Status report of the end-to-end ASKAP software system: towards early science operations

    Science.gov (United States)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

    300 MHz bandwidth for Array Release 1, followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and has been integrated into the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system, from preparing observations to data acquisition, processing and archiving, and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.

  17. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about disease recurrences as soon as they occur, to expedite taking necessary actions and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against the designers' desired properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and of the data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  18. jade: An End-To-End Data Transfer and Catalog Tool

    Science.gov (United States)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  19. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    Science.gov (United States)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser-assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without a few supporting single stitches. We present an in vivo experimental study on an innovative end-to-end laser-assisted vascular anastomosis (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and the mechanical properties of ICG-infused chitosan films were measured in a preliminary step. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed and clamped proximally and distally. The artery was then interrupted by means of a full-thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamp release. The animals then underwent different follow-up periods in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no signs of bleeding were documented. Samples of welded vessels underwent histological examination. The results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelialization and an optimal vascular healing process.

  20. NCAR Earth Observing Laboratory - An End-to-End Observational Science Enterprise

    Science.gov (United States)

    Rockwell, A.; Baeuerle, B.; Grubišić, V.; Hock, T. F.; Lee, W. C.; Ranson, J.; Stith, J. L.; Stossmeister, G.

    2017-12-01

    Researchers who want to understand and describe the Earth System require high-quality observations of the atmosphere, ocean, and biosphere. Making these observations not only requires capable research platforms and state-of-the-art instrumentation but also benefits from comprehensive in-field project management and data services. NCAR's Earth Observing Laboratory (EOL) is an end-to-end observational science enterprise that provides leadership in observational research to scientists from universities, U.S. government agencies, and NCAR. Deployment: EOL manages the majority of the NSF Lower Atmosphere Observing Facilities, which includes research aircraft, radars, lidars, profilers, and surface and sounding systems. This suite is designed to address a wide range of Earth system science - from microscale to climate process studies and from the planet's surface into the Upper Troposphere/Lower Stratosphere. EOL offers scientific, technical, operational, and logistics support to small and large field campaigns across the globe. Development: By working closely with the scientific community, EOL's engineering and scientific staff actively develop the next generation of observing facilities, staying abreast of emerging trends, technologies, and applications in order to improve our measurement capabilities. Through our Design and Fabrication Services, we also offer high-level engineering and technical expertise, mechanical design, and fabrication to the atmospheric research community. Data Services: EOL's platforms and instruments collect unique datasets that must be validated, archived, and made available to the research community. EOL's Data Management and Services deliver high-quality datasets and metadata in ways that are transparent, secure, and easily accessible. We are committed to the highest standard of data stewardship from collection to validation to archival. Discovery: EOL promotes curiosity about Earth science, and fosters advanced understanding of the

  1. GOCE gravity field simulation based on actual mission scenario

    Science.gov (United States)

    Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.

    2009-04-01

    In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the processing of a spherical harmonic Earth gravity field model and the corresponding full variance-covariance matrix from the precise GOCE orbit and from calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach, a combination of several processing strategies for the optimum exploitation of the information content of the GOCE data has been set up: the Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data. In the Core Solver processing, a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before real GOCE data become available, the expected GOCE gravity field performance is evaluated by means of a realistic numerical case study based on the actual GOCE orbit and mission scenario and on simulation data stemming from the most recent ESA end-to-end simulation. Results from this simulation as well as recently developed features of the software system are presented. Additionally, some aspects of data combination with complementary data sources are addressed.

  2. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL, will be used. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
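
    A minimal sketch of the ODM-style autoloading step described above, assuming a simplified single-table schema rather than the actual DjangODM models; the table, site and variable codes are hypothetical.

        # Store incoming sensor readings in an ODM-like "data values" table.
        import sqlite3

        conn = sqlite3.connect(":memory:")       # stand-in for the PostgreSQL back end
        conn.execute("""CREATE TABLE datavalues (
                            site_code TEXT, variable_code TEXT,
                            value REAL, utc_datetime TEXT)""")

        def load_reading(site_code, variable_code, value, utc_datetime):
            """Insert one observation, mimicking the Django autoloading step."""
            conn.execute("INSERT INTO datavalues VALUES (?, ?, ?, ?)",
                         (site_code, variable_code, value, utc_datetime))
            conn.commit()

        load_reading("STATION_01", "Rainfall", 0.2, "2014-10-01T12:00:00Z")
        print(conn.execute("SELECT * FROM datavalues").fetchall())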

  3. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geoscience data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and with content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time, and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  4. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the
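
    A minimal sketch of the CF-compliant netCDF convention mentioned above; it requires the netCDF4 Python package, and the variable names and attribute choices are illustrative rather than NODC's actual ocean acidification content standard.

        # Write a small CF-style netCDF file holding one carbonate-chemistry variable.
        from netCDF4 import Dataset

        ds = Dataset("oa_sample.nc", "w")
        ds.Conventions = "CF-1.6"
        ds.title = "Example ocean acidification cast"

        ds.createDimension("time", None)                 # unlimited time dimension
        time = ds.createVariable("time", "f8", ("time",))
        time.units = "days since 1950-01-01 00:00:00"
        time.standard_name = "time"

        ph = ds.createVariable("ph_total", "f4", ("time",), fill_value=-999.0)
        ph.long_name = "seawater pH on the total scale"
        ph.units = "1"                                   # pH is dimensionless

        time[:] = [23741.5]
        ph[:] = [8.05]
        ds.close()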

  5. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): on-the-fly type mapping and out-of-band communication. Type conversion takes place both for data and for metadata, because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique gives users fine-grained control over the data request, a potentially useful capability, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access is integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C and Java libraries that provide functions for newly written software; the netCDF library, which enables existing applications to read from servers using an older interface; and simple
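
    A minimal sketch of the out-of-band constraint expression idea mentioned above: the client appends an index constraint to the dataset URL so the server returns only the requested slab. The server URL, variable name and index ranges are hypothetical.

        # Build an OPeNDAP-style URL with a constraint expression for array subsetting.
        def constrained_url(base_url, variable, index_ranges):
            """index_ranges: one (start, stride, stop) tuple per array dimension."""
            dims = "".join("[%d:%d:%d]" % r for r in index_ranges)
            return "%s.ascii?%s%s" % (base_url, variable, dims)

        url = constrained_url("http://example.org/opendap/sst_monthly.nc",
                              "sst", [(0, 1, 0), (100, 1, 120), (200, 1, 240)])
        print(url)   # the client fetches only the requested subset, not the whole variable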

  6. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly to the fields of (1) warning decisions and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness of and preparedness for tsunami threats. Additionally, examples will be given of the potential of an operational use of risk information in early warning systems, as first experience exists for the tsunami early warning center in Jakarta, Indonesia. Beyond this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with the respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experience gained in three pilot areas in Indonesia. The presentation seeks to provide new insights into the benefits of using risk information in early warning and will provide further evidence that the practical use of risk information is an important and indispensable component of end-to-end early warning.

  7. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with the NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning, but also those configuration changes made on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The subject of the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS payload complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process which has considerable rigidity, in order to provide reliable verification of on-board safety, and which likewise provides enough

  8. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  9. An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of ...

    African Journals Online (AJOL)

    An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of the southern Benguela foodweb: parameterisation, calibration and pattern-oriented validation. ... We also highlight the capacity of this model for tracking indicators at various hierarchical levels. Keywords: individual-based model, model validation, ...

  10. GROWTH OF THE HYPOPLASTIC AORTIC-ARCH AFTER SIMPLE COARCTATION RESECTION AND END-TO-END ANASTOMOSIS

    NARCIS (Netherlands)

    BROUWER, MHJ; CROMMEDIJKHUIS, AH; EBELS, T; EIJGELAAR, A

    Surgical treatment of a hypoplastic aortic arch associated with an aortic coarctation is controversial. The controversy concerns the claimed need to surgically enlarge the diameter of the hypoplastic arch, in addition to resection and end-to-end anastomosis. The purpose of this prospective study is

  11. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluated the resilience of a reference IMS-based network topology in operation through key reliability parameters via OPNET. The reliability behaviors of communication within similar and across registered home IMS domains were simulated and compared. In addition, the reliability effects...

  12. Exploring the requirements for multimodal interaction for mobile devices in an end-to-end journey context.

    Science.gov (United States)

    Krehl, Claudia; Sharples, Sarah

    2012-01-01

    The paper investigates the requirements for multimodal interaction on mobile devices in an end-to-end journey context. Traditional interfaces are deemed cumbersome and inefficient for exchanging information with the user. Multimodal interaction provides a different, user-centred approach allowing for more natural and intuitive interaction between humans and computers. It is especially suitable for mobile interaction as it can overcome additional constraints including small screens, awkward keypads, and continuously changing settings - an inherent property of mobility. This paper is based on end-to-end journeys, in which users encounter several contexts during their journeys. Interviews and focus groups explore the requirements for multimodal interaction design for mobile devices by examining journey stages and identifying the users' information needs and sources. Findings suggest that multimodal communication is crucial when users multitask. The choice of suitable modalities depends on user context, characteristics and tasks.

  13. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

    This thesis presents a uniquely designed, interference-minimizing, high-performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers to deliver seamless end-to-end performance. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  14. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    OpenAIRE

    Madani Sajjad; Nazir Babar; Hasbullah Halabi

    2011-01-01

    We present a sleep/wake scheduling protocol for minimizing end-to-end delay in event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from a connectivity perspective, and...

  15. Multi-institutional evaluation of end-to-end protocol for IMRT/VMAT treatment chains utilizing conventional linacs.

    Science.gov (United States)

    Loughery, Brian; Knill, Cory; Silverstein, Evan; Zakjevskii, Viatcheslav; Masi, Kathryn; Covington, Elizabeth; Snyder, Karen; Song, Kwang; Snyder, Michael

    2018-03-20

    We conducted a multi-institutional assessment of a recently developed end-to-end monthly quality assurance (QA) protocol for external beam radiation therapy treatment chains. This protocol validates the entire treatment chain against a baseline to detect the presence of complex errors not easily found in standard component-based QA methods. Participating physicists from 3 institutions ran the end-to-end protocol on treatment chains that include Imaging and Radiation Oncology Core (IROC)-credentialed linacs. Results were analyzed in the style of American Association of Physicists in Medicine (AAPM) Task Group (TG) 119 so that they may be referenced by future test participants. Optically stimulated luminescent dosimeter (OSLD), EBT3 radiochromic film, and A1SL ion chamber readings were accumulated across 10 test runs. Confidence limits were calculated to determine the range within which 95% of measurements should fall. From the calculated confidence limits, 95% of measurements should be within 5% error for OSLDs, 4% error for ionization chambers, and 4% error (96% relative gamma pass rate) for radiochromic film at 3% agreement/3 mm distance to agreement. Data were separated by institution, model of linac, and treatment protocol (intensity-modulated radiation therapy [IMRT] vs volumetric modulated arc therapy [VMAT]). A total of 97% of OSLDs, 98% of ion chambers, and 93% of films were within the confidence limits; measurements fell outside these limits by a maximum of 4%, consistent despite institutional differences in OSLD reading equipment and radiochromic film calibration techniques. Results from this test may be used by clinics for data comparison. Areas of improvement were identified in the end-to-end protocol that can be implemented in an updated version. The consistency of our data demonstrates the reproducibility and ease of use of such tests and suggests a potential role for their use in broad end-to-end QA initiatives.
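
    A minimal sketch of the TG-119-style confidence limit used above, assuming the usual definition CL = |mean error| + 1.96 x standard deviation, so roughly 95% of measurements are expected to fall within CL; the sample numbers are illustrative, not the audit's data.

        # Confidence limit within which ~95% of dose measurements are expected to fall.
        import statistics

        def confidence_limit(percent_errors):
            mean = statistics.mean(percent_errors)
            sd = statistics.stdev(percent_errors)
            return abs(mean) + 1.96 * sd

        osld_errors = [1.2, -0.8, 2.1, 0.4, -1.5, 1.0, 0.3, -0.6, 1.8, -0.2]  # % dose error
        print("OSLD confidence limit: %.1f%%" % confidence_limit(osld_errors))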

  16. Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media

    Science.gov (United States)

    Park, Ju-Won; Kim, JongWon

    2004-10-01

    As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, the applicability of a hybrid monitoring scheme that combines active and passive monitoring is investigated. The active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while the passive monitoring checks both application-layer metrics (i.e., user traffic condition, by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
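
    A minimal rule-based sketch of the hybrid diagnosis idea: compare an active-probe network metric with passive application- and system-level metrics to guess where degradation originates. The thresholds and labels are illustrative only, not the paper's decision logic.

        # Combine active (probe) and passive (RTCP, system) metrics into a rough diagnosis.
        def diagnose(probe_loss_pct, rtcp_loss_pct, cpu_util_pct):
            if probe_loss_pct > 1.0 and rtcp_loss_pct > 1.0:
                return "network congestion on the multicast path"
            if rtcp_loss_pct > 1.0 and cpu_util_pct > 90.0:
                return "end-system overload (receiver cannot keep up)"
            if rtcp_loss_pct > 1.0:
                return "application-level problem (network probes look clean)"
            return "no degradation detected"

        print(diagnose(probe_loss_pct=0.1, rtcp_loss_pct=2.5, cpu_util_pct=95.0))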

  17. Debris mitigation measures by satellite design and operational methods - Findings from the DLR space debris End-to-End Service

    Science.gov (United States)

    Sdunnus, H.; Beltrami, P.; Janovsky, R.; Koppenwallner, G.; Krag, H.; Reimerdes, H.; Schäfer, F.

    Debris mitigation has been recognised as an issue to be addressed by the space-faring nations around the world. Currently, various activities are going on, aiming at the establishment of debris mitigation guidelines on various levels, reaching from the UN down to national space agencies. Though guidelines established at the national level already provide concrete information on how things should be done (rather than specifying what should be done or providing fundamental principles), potential users of the guidelines will still need to explore the technical, management, and financial implications of the guidelines for their projects. Those questions are addressed by the so-called "Space Debris End-to-End Service" project, which has been initiated as a national initiative of the German Aerospace Centre (DLR). Based on a review of existing mitigation guidelines and guidelines under development, and following an identification of needs from a circle of industrial users, the "End-to-End Service Guidelines" have been established for designers and operators of spacecraft. The End-to-End Service Guidelines are based on requirements addressed by the mitigation guidelines and provide recommendations on how and when the technical consideration of the mitigation guidelines should take place. By referencing requirements from the mitigation guidelines, the End-to-End Service Guidelines address the consideration of debris mitigation measures in spacecraft design and in operational measures. This paper will give an introduction to the End-to-End Service Guidelines. It will focus on the proposals made for mitigation measures in the spacecraft system design, i.e. on protective design measures inside the spacecraft and on design measures such as innovative protective (shielding) systems. Furthermore, approaches to the analytical optimisation of protective systems will be presented, aiming at the minimisation of shield mass while conserving the protective effect. On the

  18. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing.

    Science.gov (United States)

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-08-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient's genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient's genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work.
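
    A minimal sketch of the privacy-budget trade-off discussed above, using the standard Laplace mechanism (one common DP mechanism, not necessarily the one evaluated in the paper); the cohort statistic and sensitivity values are illustrative.

        # Laplace mechanism: noise scale = sensitivity / epsilon, so a small privacy
        # budget (epsilon) yields noisier, less clinically useful released statistics.
        import numpy as np

        def laplace_mechanism(true_value, sensitivity, epsilon, rng):
            return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

        rng = np.random.default_rng(0)
        true_mean_dose = 5.1                      # hypothetical weekly warfarin dose (mg)
        for eps in (0.1, 1.0, 10.0):
            noisy = laplace_mechanism(true_mean_dose, sensitivity=0.5, epsilon=eps, rng=rng)
            print("epsilon=%.1f -> released mean %.2f mg" % (eps, noisy))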

  19. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on measurements of packet delivery latencies of voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method, which is able to auto-adapt the quality margin on the basis of network load and specific service level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations for guaranteeing Quality of Service to telephony applications, our goal being to evaluate the weight of edge router queue configuration in a complex and realistic Telephony over IP scenario. We compare several well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to place queue schedulers in a more general control scheme context in which different elements, such as DiffServ marking and admission control algorithms, contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we compare this solution with other call admission methods already described in the scientific literature, in order to locate the proposed method in a more general control scheme context. On the basis of the results we try to show the possible advantages of this QoS-Weighted solution in comparison with other similar CAC solutions (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) in terms of complexity, stability, management, tunability to service level requirements, and compatibility with actual network implementations.
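
    A minimal sketch of a measurement-based admission decision in the spirit of the scheme above (not the paper's exact algorithm); the delay target and quality margin are illustrative parameters.

        # Admit a new voice call only if the measured edge-to-edge delay, inflated by a
        # quality margin, stays within the service-level target.
        def admit_call(measured_delays_ms, target_delay_ms, quality_margin):
            """measured_delays_ms: recent one-way delays observed for ongoing VoIP packets."""
            avg = sum(measured_delays_ms) / len(measured_delays_ms)
            return avg * (1.0 + quality_margin) <= target_delay_ms

        recent = [62.0, 71.5, 68.2, 90.1]
        print(admit_call(recent, target_delay_ms=100.0, quality_margin=0.2))   # True -> admit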

  20. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    Science.gov (United States)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine

  1. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

    We propose and verify methods based on the slip-spring (SSp) model [Macromolecules 2005, 38, 14] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates, by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [Macromolecules 1991, 24, 3587] and refine the effective chain friction in the thin and fat tubes based on Read et al. [J. Rheol. 2012, 56, 823]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.
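
    The central observable here is the normalized autocorrelation of the chain end-to-end vector; a standard tube-model definition, written below in conventional notation rather than copied from the paper, has a terminal exponential decay whose time constant is the longest relaxation time discussed above.

        % Normalized end-to-end vector relaxation function and its terminal behaviour
        \Phi(t) = \frac{\langle \mathbf{R}(t)\cdot\mathbf{R}(0)\rangle}
                       {\langle \mathbf{R}^{2}\rangle},
        \qquad
        \Phi(t) \sim e^{-t/\tau_{d}} \quad \text{for } t \gg \tau_{e}

    Comparing \Phi(t) for the same probe chain in different constraint release environments then directly exposes how the competition between tube motion and CR motion shifts \tau_{d}.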

  2. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution that provides the best-fit web service to the service requester based on QoS.
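
    A minimal sketch of the QoS-based ranking step such a selection agent might perform; the services, metrics and weights below are invented for illustration and are not from the paper.

        # Rank candidate geospatial web services by a weighted QoS score.
        services = {
            "wms-a": {"response_ms": 120, "availability": 0.999, "cost": 0.0},
            "wms-b": {"response_ms": 60,  "availability": 0.990, "cost": 0.1},
        }

        def score(qos, weights):
            # higher availability is better; lower response time and cost are better
            return (weights["availability"] * qos["availability"]
                    - weights["response_ms"] * qos["response_ms"] / 1000.0
                    - weights["cost"] * qos["cost"])

        weights = {"availability": 0.5, "response_ms": 0.3, "cost": 0.2}
        best = max(services, key=lambda name: score(services[name], weights))
        print("best-fit service:", best)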

  3. Comparison of Direct Side-to-End and End-to-End Hypoglossal-Facial Anastomosis for Facial Nerve Repair.

    Science.gov (United States)

    Samii, Madjid; Alimohamadi, Maysam; Khouzani, Reza Karimi; Rashid, Masoud Rafizadeh; Gerganov, Venelin

    2015-08-01

    The hypoglossal-facial anastomosis (HFA) is the gold standard for facial reanimation in patients with severe facial nerve palsy. The major drawbacks of the classic HFA technique are lingual morbidities due to hypoglossal nerve transection. The side-to-end HFA is a modification of the classic technique with fewer tongue-related morbidities. In this study we compared the outcomes of the classic end-to-end and the direct side-to-end HFA surgeries performed at our center with regard to the facial reanimation success rate and tongue-related morbidities. Twenty-six successive cases of HFA were enrolled. In 9 of them end-to-end anastomoses were performed, and 17 had direct side-to-end anastomoses. The House-Brackmann (HB) and Pitty and Tator (PT) scales were used to document surgical outcome. Hemiglossal atrophy, swallowing, and hypoglossal nerve function were assessed at follow-up. The original pathology was vestibular schwannoma in 15, meningioma in 4, brain stem glioma in 4, and other pathologies in 3. The mean interval between facial palsy and HFA was 18 months (range: 0-60). The median follow-up period was 20 months. The PT grade at follow-up was worse in patients with a longer interval between facial palsy and HFA (P value: 0.041). The lesion type was the only other factor that affected PT grade (the best results in vestibular schwannoma and the worst in the other-pathologies group, P value: 0.038). The recovery period for facial tonicity was longer in patients with radiation therapy before HFA (13.5 vs. 8.5 months) and in those with a longer than 2-year interval from facial palsy to HFA (13.5 vs. 8.5 months). Although no significant difference between the side-to-end and the end-to-end groups was seen in terms of facial nerve functional recovery, patients from the side-to-end group had a significantly lower rate of lingual morbidities (tongue hemiatrophy: 100% vs. 5.8%, swallowing difficulty: 55% vs. 11.7%, speech disorder: 33% vs. 0%). With the side-to-end HFA

  4. Risk Factors for Dehiscence of Stapled Functional End-to-End Intestinal Anastomoses in Dogs: 53 Cases (2001-2012).

    Science.gov (United States)

    Snowdon, Kyle A; Smeak, Daniel D; Chiang, Sharon

    2016-01-01

    To identify risk factors for dehiscence in stapled functional end-to-end anastomoses (SFEEA) in dogs. Retrospective case series. Dogs (n = 53) requiring an enterectomy. Medical records from a single institution for all dogs undergoing an enterectomy (2001-2012) were reviewed. Surgeries were included when gastrointestinal (GIA) and thoracoabdominal (TA) stapling equipment was used to create a functional end-to-end anastomosis between segments of small intestine or small and large intestine in dogs. Information regarding preoperative, surgical, and postoperative factors was recorded. Anastomotic dehiscence was noted in 6 of 53 cases (11%), with a mortality rate of 83%. The only preoperative factor significantly associated with dehiscence was the presence of inflammatory bowel disease (IBD). Surgical factors significantly associated with dehiscence included the presence, duration, and number of intraoperative hypotensive periods, and location of anastomosis, with greater odds of dehiscence in anastomoses involving the large intestine. IBD, location of anastomosis, and intraoperative hypotension are risk factors for intestinal anastomotic dehiscence after SFEEA in dogs. Previously suggested risk factors (low serum albumin concentration, preoperative septic peritonitis, and intestinal foreign body) were not confirmed in this study. © Copyright 2015 by The American College of Veterinary Surgeons.

  5. A new technique for end-to-end ureterostomy in the rat, using an indwelling reabsorbable stent.

    Science.gov (United States)

    Carmignani, G; Farina, F P; De Stefani, S; Maffezzini, M

    1983-01-01

    The restoration of the continuity of the urinary tract represents one of the major problems in rat renal transplantation. End-to-end ureterostomy is the most physiologically effective technique; however, it involves noteworthy technical difficulties because of the extremely thin caliber of the ureter in the rat and the high incidence of postoperative hydronephrosis. We describe a new technique for end-to-end ureterostomy in the rat in which an indwelling absorbable ureteral stent is used. A 5-0 plain catgut thread is used as the stent. The anastomosis is performed under an operating microscope at 25-40x magnification with interrupted sutures of 11-0 Vicryl. The use of the indwelling stent facilitates the performance of the anastomosis and yields optimal results. The macroscopic, radiological, and histological controls in a group of rats operated on with this technique showed a very high percentage of success with no complications, a result undoubtedly superior to that obtained with conventional methods.

  6. A multicentre 'end to end' dosimetry audit of motion management (4DCT-defined motion envelope) in radiotherapy.

    Science.gov (United States)

    Palmer, Antony L; Nash, David; Kearton, John R; Jafari, Shakardokht M; Muscat, Sarah

    2017-12-01

    External dosimetry audit is valuable for the assurance of radiotherapy quality. However, motion management has not been rigorously audited, despite its complexity and importance for accuracy. We describe the first end-to-end dosimetry audit for non-SABR (stereotactic ablative body radiotherapy) lung treatments, measuring dose accumulation in a moving target and assessing the adequacy of target dose coverage. A respiratory-motion lung phantom with a custom-designed insert was used. Dose was measured with radiochromic film, employing triple-channel dosimetry and uncertainty reduction. The host centre's 4DCT scan, outlining and planning techniques were used. Measurements with the phantom static and then moving during treatment delivery separated inherent treatment uncertainties from motion effects. Calculated and measured dose distributions were compared by isodose overlay and gamma analysis, and we introduce the concept of 'dose plane histograms' for clinically relevant interpretation of film dosimetry. 12 radiotherapy centres and 19 plans were audited: conformal, IMRT (intensity modulated radiotherapy) and VMAT (volumetric modulated arc therapy). Excellent agreement between planned and static-phantom results was seen (mean gamma pass rate 98.7% at 3%/2 mm). Dose blurring was evident in the moving-phantom measurements (mean gamma pass rate 88.2% at 3%/2 mm). Planning techniques for motion management were adequate to deliver the intended moving-target dose coverage. A novel, clinically relevant, end-to-end dosimetry audit of motion management strategies in radiotherapy is reported. Copyright © 2017 Elsevier B.V. All rights reserved.
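
    A minimal 1-D sketch of the gamma criterion quoted above (3% dose difference, 3 mm distance to agreement); real film analysis is 2-D and uses dedicated software, so this only illustrates the pass/fail rule on an invented profile.

        # Global 1-D gamma analysis: a measured point passes if some reference point lies
        # within the combined dose-difference / distance-to-agreement tolerance.
        import numpy as np

        def gamma_pass_rate(positions_mm, measured, planned, dose_tol=0.03, dist_tol_mm=3.0):
            ref_dose = planned.max()
            passes = 0
            for x_m, d_m in zip(positions_mm, measured):
                gamma_sq = ((x_m - positions_mm) / dist_tol_mm) ** 2 \
                         + ((d_m - planned) / (dose_tol * ref_dose)) ** 2
                passes += np.sqrt(gamma_sq.min()) <= 1.0
            return 100.0 * passes / len(measured)

        x = np.linspace(-20, 20, 81)                    # positions in mm
        planned = np.exp(-(x / 12.0) ** 2)              # idealized planned profile
        measured = np.exp(-((x - 1.0) / 12.0) ** 2)     # 1 mm shift, e.g. from motion blur
        print("gamma pass rate: %.1f%%" % gamma_pass_rate(x, measured, planned))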

  7. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  8. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Full Text Available Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination with amplifying and forwarding at each relay stage. The subsets are chosen such that they maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain, versus the best-known distributed space-time coding strategies. A distributed compress and forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.

  9. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    International Nuclear Information System (INIS)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R.

    2014-01-01

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  10. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  11. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods in predictable structures...... on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene...... that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule is connecting the two rods...

  12. Increasing gas producer profitability with virtual well visibility via an end-to-end wireless Internet gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M. [Northrock Resources Ltd., Calgary, AB (Canada); Benterud, K. [Zed.i solutions, Calgary, AB (Canada)

    2003-07-01

    This PowerPoint presentation describes how Northrock Resources Ltd. increased profitability using Smart-Alek{sup TM} while avoiding high implementation costs. Smart-Alek is a new type of fully integrated end-to-end electronic gas flow measurement (GFM) system based on Field Intelligence Network and End User Interference (FINE). Smart-Alek can analyze gas production through public wireless communications and a web-browser delivery system. The system has enabled Northrock to increase gas volumes with more accurate measurement and reduced downtime. In addition, operating costs have decreased because the frequency of well visits has been reduced and the administrative procedures of data collection are more efficient. The real-time well visibility of the tool has proven to be very effective in optimizing business profitability. 7 figs.

  13. SU-E-T-282: Dose Measurements with An End-To-End Audit Phantom for Stereotactic Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R; Artschan, R [Calvary Mater Newcastle, Newcastle, NSW (Australia); Thwaites, D [University of Sydney, Sydney, NSW (Australia); Lehmann, J [Calvary Mater Newcastle, Newcastle, NSW (Australia); University of Sydney, Sydney, NSW (Australia)

    2015-06-15

    Purpose: Report on dose measurements as part of an end-to-end test for stereotactic radiotherapy, using a new audit tool, which allows audits to be performed efficiently either by an onsite team or as a postal audit. Methods: Film measurements have been performed with a new Stereotactic Cube Phantom. The phantom has been designed to perform Winston Lutz type position verification measurements and dose measurements in one setup. It comprises a plastic cube with a high density ball in its centre (used for MV imaging with film or EPID) and low density markers in the periphery (used for Cone Beam Computed Tomography, CBCT imaging). It also features strategically placed gold markers near the posterior and right surfaces, which can be used to calculate phantom rotations on MV images. Slit-like openings allow insertion of film or other detectors. The phantom was scanned and small field treatment plans were created. The fields do not traverse any inhomogeneities of the phantom on their paths to the measurement location. The phantom was set up at the delivery system using CBCT imaging. The calculated treatment fields were delivered, each with a piece of radiochromic film (EBT3) placed in the anterior film holder of the phantom. MU had been selected in planning to achieve similar exposures on all films. Calibration films were exposed in solid water for dose levels around the expected doses. Films were scanned and analysed following established procedures. Results: Setup of the cube showed excellent suitability for CBCT 3D alignment. MV imaging with EPID allowed for clear identification of all markers. Film based dose measurements showed good agreement for MLC created fields down to 0.5 mm × 0.5 mm. Conclusion: An end-to-end audit phantom for stereotactic radiotherapy has been developed and tested.

  14. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures are built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be looked at to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end-user does not usually have access. In our paper we propose new techniques to detect network performance problems proactively in close to real time and we do not rely on static thresholds and SNMP-MIB information. We describe and compare the use of several different algorithms that we have implemented to detect persistent network problems using anomalous variations analysis in real end-to-end Internet performance measurements. We also provide methods and/or guidance for how to set the user settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well behaved data (no missed measurements and no very large outliers) with small seasonal changes most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our
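
    The central idea above is to flag anomalous variations relative to each path's own recent history rather than against a static threshold. As a minimal illustration only (window length and sigma multiplier are placeholder parameters, not the values tuned in the study), a rolling mean/standard-deviation detector in Python:

        import numpy as np

        def detect_anomalies(series, window=20, n_sigma=3.0):
            """Flag points deviating from a trailing rolling mean by more than
            n_sigma rolling standard deviations (no static absolute threshold)."""
            series = np.asarray(series, dtype=float)
            flags = np.zeros(series.size, dtype=bool)
            for i in range(window, series.size):
                hist = series[i - window:i]
                mu, sigma = hist.mean(), hist.std()
                if sigma > 0:
                    flags[i] = abs(series[i] - mu) > n_sigma * sigma
            return flags

        # synthetic throughput trace (Mbit/s) with a step drop at sample 60
        rng = np.random.default_rng(1)
        trace = np.r_[rng.normal(90, 2, 60), rng.normal(60, 2, 40)]
        print(np.where(detect_anomalies(trace))[0][:5])   # first flagged samples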

  15. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    Full Text Available There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include need for predicting effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long-term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promise, and repackaging of poorly performing component submodels (“lipstick on a pig”. The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”. The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  16. An End-to-End Modeling and Simulation Testbed (EMAST) to Support Detailed Quantitative Evaluations of GIG Transport Services

    National Research Council Canada - National Science Library

    Comparetto, G; Schult, N; Mirhakkak, M; Chen, L; Wade, R; Duffalo, S

    2005-01-01

    .... A variety of services must be provided to the users including management of resources to support QoS, a transition path from IPv4 to IPv6, and efficient networking across heterogeneous networks (i.e...

  17. Double 90 Degrees Counterrotated End-to-End-Anastomosis: An Experimental Study of an Intestinal Anastomosis Technique.

    Science.gov (United States)

    Holzner, Philipp; Kulemann, Birte; Seifert, Gabriel; Glatz, Torben; Chikhladze, Sophia; Höppner, Jens; Hopt, Ulrich; Timme, Sylvia; Bronsert, Peter; Sick, Olivia; Zhou, Cheng; Marjanovic, Goran

    2015-06-01

    The aim of the article is to investigate a new anastomotic technique compared with standardized intestinal anastomotic procedures. A total of 32 male Wistar rats were randomized to three groups. In the Experimental Group (n = 10), the new double 90 degrees inversely rotated anastomosis was used, in the End Group (n = 10) a single-layer end-to-end anastomosis, and in the Side Group (n = 12) a single-layer side-to-side anastomosis. All anastomoses were done using interrupted sutures. On postoperative day 4, rats were relaparotomized. Bursting pressure, hydroxyproline concentration, a semiquantitative adhesion score and two histological anastomotic healing scores (mucosal healing according to Chiu and overall anastomotic healing according to Verhofstad) were collected. Most data are presented as median (range). p < 0.05 was considered significant. Anastomotic insufficiency occurred only in one rat of the Side Group. Median bursting pressure in the Experimental Group was 105 mm Hg (range = 72-161 mm Hg), significantly higher in the End Group (164 mm Hg; range = 99-210 mm Hg; p = 0.021) and lower in the Side Group by trend (81 mm Hg; range = 59-122 mm Hg; p = 0.093). Hydroxyproline concentration did not differ significantly in between the groups. The adhesion score was 2.5 (range = 1-3) in the Experimental Group, 2 (range = 1-2) in the End Group, but there were significantly more adhesions in the Side Group (range = 3-4); p = 0.020 versus Experimental Group, p < 0.001 versus End Group. The Chiu Score showed the worst mucosal healing in the Experimental Group. The overall Verhofstad Score was significantly worse (mean = 2.032; standard deviation [SD] = 0.842) p = 0.031 and p = 0.002 in the Experimental Group, compared with the Side Group (mean = 1.729; SD = 0.682) and the End Group (mean = 1.571; SD = 0.612). The new anastomotic technique is feasible and did not show any relevant complication. Even though it was superior to the side-to-side anastomosis by trend with

  18. A novel end-to-end classifier using domain transferred deep convolutional neural networks for biomedical images.

    Science.gov (United States)

    Pang, Shuchao; Yu, Zhezhou; Orgun, Mehmet A

    2017-03-01

    Highly accurate classification of biomedical images is an essential task in the clinical diagnosis of numerous medical diseases identified from those images. Traditional image classification methods combined with hand-crafted image feature descriptors and various classifiers are not able to effectively improve the accuracy rate and meet the high requirements of classification of biomedical images. The same also holds true for artificial neural network models directly trained with limited biomedical images used as training data or directly used as a black box to extract the deep features based on another distant dataset. In this study, we propose a highly reliable and accurate end-to-end classifier for all kinds of biomedical images via deep learning and transfer learning. We first apply domain transferred deep convolutional neural network for building a deep model; and then develop an overall deep learning architecture based on the raw pixels of original biomedical images using supervised training. In our model, we do not need the manual design of the feature space, seek an effective feature vector classifier or segment specific detection object and image patches, which are the main technological difficulties in the adoption of traditional image classification methods. Moreover, we do not need to be concerned with whether there are large training sets of annotated biomedical images, affordable parallel computing resources featuring GPUs or long times to wait for training a perfect deep model, which are the main problems to train deep neural networks for biomedical image classification as observed in recent works. With the utilization of a simple data augmentation method and fast convergence speed, our algorithm can achieve the best accuracy rate and outstanding classification ability for biomedical images. We have evaluated our classifier on several well-known public biomedical datasets and compared it with several state-of-the-art approaches. We propose a robust
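
    The recipe above amounts to reusing a convolutional feature extractor trained on a distant domain and training only a new classification head on the biomedical images. The following is a minimal PyTorch-style sketch of that idea, with a stand-in backbone and illustrative layer sizes rather than the authors' actual network:

        import torch
        import torch.nn as nn

        class TransferredClassifier(nn.Module):
            """Frozen, transferred convolutional feature extractor plus a small
            trainable classification head for the target biomedical task."""
            def __init__(self, backbone, feat_dim, n_classes):
                super().__init__()
                self.backbone = backbone
                for p in self.backbone.parameters():     # freeze transferred weights
                    p.requires_grad = False
                self.head = nn.Sequential(nn.Linear(feat_dim, 256), nn.ReLU(),
                                          nn.Linear(256, n_classes))

            def forward(self, x):
                feats = self.backbone(x).flatten(1)
                return self.head(feats)

        # stand-in backbone; in practice this would be a network pre-trained elsewhere
        backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                                 nn.AdaptiveAvgPool2d(1))
        model = TransferredClassifier(backbone, feat_dim=16, n_classes=5)
        print(model(torch.randn(4, 3, 224, 224)).shape)   # torch.Size([4, 5])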

  19. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    Science.gov (United States)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  20. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    Science.gov (United States)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  1. SU-F-J-177: A Novel Image Analysis Technique (center Pixel Method) to Quantify End-To-End Tests

    Energy Technology Data Exchange (ETDEWEB)

    Wen, N; Chetty, I [Henry Ford Health System, Detroit, MI (United States); Snyder, K [Henry Ford Hospital System, Detroit, MI (United States); Scheib, S [Varian Medical System, Barton (Switzerland); Qin, Y; Li, H [Henry Ford Health System, Detroit, Michigan (United States)

    2016-06-15

    Purpose: To implement a novel image analysis technique, “center pixel method”, to quantify end-to-end test accuracy of a frameless, image guided stereotactic radiosurgery system. Methods: The localization accuracy was determined by delivering radiation to an end-to-end prototype phantom. The phantom was scanned with 0.8 mm slice thickness. The treatment isocenter was placed at the center of the phantom. In the treatment room, CBCT images of the phantom (kVp=77, mAs=1022, slice thickness 1 mm) were acquired to register to the reference CT images. 6D couch corrections were applied based on the registration results. Electronic Portal Imaging Device (EPID)-based Winston Lutz (WL) tests were performed to quantify the errors of the targeting accuracy of the system at 15 combinations of gantry, collimator and couch positions. The images were analyzed using two different methods. a) The classic method. The deviation was calculated by measuring the radial distance between the center of the central BB and the full width at half maximum of the radiation field. b) The center pixel method. Since the imager projection offset from the treatment isocenter was known from the IsoCal calibration, the deviation was determined between the center of the BB and the central pixel of the imager panel. Results: Using the automatic registration method to localize the phantom and the classic method of measuring the deviation of the BB center, the mean and standard deviation of the radial distance was 0.44 ± 0.25, 0.47 ± 0.26, and 0.43 ± 0.13 mm for the jaw, MLC and cone defined field sizes respectively. When the center pixel method was used, the mean and standard deviation was 0.32 ± 0.18, 0.32 ± 0.17, and 0.32 ± 0.19 mm respectively. Conclusion: Our results demonstrated that the center pixel method accurately analyzes the WL images to evaluate the targeting accuracy of the radiosurgery system. The work was supported by a Research Scholar Grant, RSG-15-137-01-CCE from the American
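
    In the center pixel method the deviation is thus measured between the detected BB centre and the pixel onto which the isocentre is known to project (the panel's central pixel corrected by the calibration-derived offset), rather than against the imaged field edges. A minimal sketch of that distance calculation in Python, using an assumed, illustrative EPID pixel pitch:

        import numpy as np

        def radial_deviation_mm(bb_center_px, iso_pixel_px, pixel_pitch_mm=0.392):
            """Radial distance between the imaged BB centre and the pixel where
            the isocentre is known to project (panel centre + calibration offset)."""
            d = (np.asarray(bb_center_px) - np.asarray(iso_pixel_px)) * pixel_pitch_mm
            return float(np.hypot(*d))

        # illustrative numbers: BB found 1.2 px and 0.5 px away from the isocentre pixel
        print(round(radial_deviation_mm((641.2, 511.5), (640.0, 512.0)), 2))  # ~0.51 mm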

  2. End-to-end probability for an interacting center vortex world line in Yang-Mills theory

    International Nuclear Information System (INIS)

    Teixeira, Bruno F.I.; Lemos, Andre L.L. de; Oxman, Luis E.

    2011-01-01

    Full text: The understanding of quark confinement is a very important open problem in Yang-Mills theory. In this regard, nontrivial topological defects are expected to play a relevant role to achieve a solution. Here we are interested in how to deal with these structures, relying on the Cho-Faddeev-Niemi decomposition and the possibility it offers to describe defects in terms of a local color frame. In particular, the path integral for a single center vortex is a fundamental object to handle the ensemble integration. As is well-known, in three dimensions center vortices are string-like and the associated physics is closely related with that of polymers. Using recent techniques developed in the latter context, we present in this work a detailed derivation of the equation for the end-to-end probability for a center vortex world line, including the effects of interactions. Its solution can be associated with a Green function that depends on the position and orientation at the boundaries, where monopole-like instantons are placed. In the limit of semi-flexible polymers, an expansion only keeping the lower angular momenta for the final orientation leads to a reduced Green function for a complex vortex field minimally coupled to the dual Yang-Mills fields. This constitutes a key ingredient to propose an effective model for correlated monopoles, center vortices and the dual fields. (author)

  3. Minimizing Barriers in Learning for On-Call Radiology Residents-End-to-End Web-Based Resident Feedback System.

    Science.gov (United States)

    Choi, Hailey H; Clark, Jennifer; Jay, Ann K; Filice, Ross W

    2018-02-01

    Feedback is an essential part of medical training, where trainees are provided with information regarding their performance and further directions for improvement. In diagnostic radiology, feedback entails a detailed review of the differences between the residents' preliminary interpretation and the attendings' final interpretation of imaging studies. While the on-call experience of independently interpreting complex cases is important to resident education, the more traditional synchronous "read-out" or joint review is impossible due to multiple constraints. Without an efficient method to compare reports, grade discrepancies, convey salient teaching points, and view images, valuable lessons in image interpretation and report construction are lost. We developed a streamlined web-based system, including report comparison and image viewing, to minimize barriers in asynchronous communication between attending radiologists and on-call residents. Our system provides real-time, end-to-end delivery of case-specific and user-specific feedback in a streamlined, easy-to-view format. We assessed quality improvement subjectively through surveys and objectively through participation metrics. Our web-based feedback system improved user satisfaction for both attending and resident radiologists, and increased attending participation, particularly with regards to cases where substantive discrepancies were identified.

  4. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    International Nuclear Information System (INIS)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen; Jaekel, Oliver

    2015-01-01

    With the increasing complexity of external beam therapy "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient consisting of epoxy resin was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  5. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    Science.gov (United States)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

    In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained from end to end to map raw input images to a direction in supervised mode. The images of the data sets are collected in a wide variety of weather conditions and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments, which are line tracking and obstacle avoidance. The line tracking experiment is performed in order to track the desired path which is composed of straight and curved lines. The goal of the obstacle avoidance experiment is to avoid the obstacles indoors. Finally, we get a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and avoid the obstacle in the room accurately. The results confirm the effectiveness of the algorithm and of our improvements to the network structure and training parameters.
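
    As noted above, the training set is augmented with Gaussian and salt-and-pepper noise to reduce overfitting. A minimal sketch of such image augmentation in Python (noise levels are illustrative placeholders):

        import numpy as np

        def add_gaussian_noise(img, sigma=10.0):
            # additive zero-mean Gaussian noise, clipped back to the valid 8-bit range
            noisy = img.astype(float) + np.random.normal(0.0, sigma, img.shape)
            return np.clip(noisy, 0, 255).astype(np.uint8)

        def add_salt_and_pepper(img, amount=0.02):
            # set a random fraction of pixels to pure black ("pepper") or white ("salt")
            noisy = img.copy()
            mask = np.random.random(img.shape[:2])
            noisy[mask < amount / 2] = 0
            noisy[mask > 1 - amount / 2] = 255
            return noisy

        img = np.full((64, 64, 3), 128, dtype=np.uint8)    # dummy grey image
        augmented = [add_gaussian_noise(img), add_salt_and_pepper(img)]
        print([a.shape for a in augmented])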

  6. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

    Full Text Available Convolutional Neural Networks (CNN have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed comparable performance to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, considering the sample-level model requires much longer training time, we progressively downsample the input signals and examine how it affects the performance. Second, we extend the model using multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
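
    The architecture stacks many 1-D convolutions with very small (size-3) filters directly on raw waveform samples, interleaved with pooling. A minimal PyTorch-style sketch of a shallow variant (channel counts, depth and the 50-label output are illustrative, not the paper's exact configuration):

        import torch
        import torch.nn as nn

        def sample_block(in_ch, out_ch):
            # size-3 convolution followed by max-pooling of 3, the basic sample-level unit
            return nn.Sequential(nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
                                 nn.BatchNorm1d(out_ch), nn.ReLU(), nn.MaxPool1d(3))

        model = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=3, stride=3),      # strided front end on raw samples
            sample_block(64, 64), sample_block(64, 128),
            sample_block(128, 128), sample_block(128, 256),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(256, 50), nn.Sigmoid())               # e.g. 50 auto-tagging labels

        waveform = torch.randn(2, 1, 59049)                 # a raw-audio mini-batch
        print(model(waveform).shape)                        # torch.Size([2, 50])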

  7. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy ''end-to-end'' tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient consisting of epoxy resin was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  8. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

    Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated if it also improved the rate and speed of physician intervention to CLR and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and when manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the rate (100% vs 84.2%; pautomation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
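
    The pathway described combines an automated SMS, time-based auto-escalation, and a manual call-centre fallback for results that remain unacknowledged. A minimal sketch of such escalation logic in Python (time intervals and recipients are hypothetical, not the institution's actual configuration):

        from dataclasses import dataclass, field

        @dataclass
        class CriticalResult:
            patient_id: str
            test: str
            value: float
            acknowledged: bool = False
            escalation_log: list = field(default_factory=list)

        # hypothetical escalation ladder: minutes elapsed -> notification action
        ESCALATION_STEPS = ((0, "SMS to ordering physician"),
                            (15, "SMS to covering physician"),
                            (30, "manual call-centre telephone call"))

        def escalate(result, minutes_elapsed):
            """Return the notification actions due so far for an unacknowledged result."""
            if result.acknowledged:
                return []
            due = [action for t, action in ESCALATION_STEPS if minutes_elapsed >= t]
            for action in due:
                if action not in result.escalation_log:
                    result.escalation_log.append(action)
            return result.escalation_log

        clr = CriticalResult("P001", "potassium", 6.8)   # mmol/L, illustrative value
        print(escalate(clr, minutes_elapsed=20))
        # ['SMS to ordering physician', 'SMS to covering physician']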

  9. Delayed primary end-to-end anastomosis for traumatic long segment urethral stricture and its short-term outcomes

    Directory of Open Access Journals (Sweden)

    Rajarshi Kumar

    2017-01-01

    Full Text Available Background: The purpose of this study is to evaluate the aetiology of posterior urethral stricture in children and to analyse the results after delayed primary repair with extensive distal urethral mobilisation. Materials and Methods: This was a retrospective study carried out in a tertiary care centre from January 2009 to December 2013. Results: Eight children with a median age of 7.5 years (range 4–11 years) underwent delayed anastomotic urethroplasty: six through a perineal and two through a combined perineal and transpubic approach. All eight children had a long-segment (>2 cm) stricture: three posterior and five anterior urethral strictures. On a mean follow-up period of 33 months (range 24–48 months), all were passing urine with good flow and stream. Conclusion: End-to-end anastomosis in post-traumatic long segment posterior urethral stricture between prostatic and penile urethra in children is possible by perineal or combined perineal and transpubic approach with good results without any urethral replacement.

  10. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    Science.gov (United States)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  11. Increasing gas producer profitability with virtual well visibility via an end-to-end, wireless Internet gas monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, M.; Coleman, K.; Beck, R.; Lyon, R.; Potts, R. [Northrock Resources Ltd., Calgary, AB (Canada); Benterud, K. [Zed.i solutions, Calgary, AB (Canada)

    2003-07-01

    Most gas producing companies still use 100-year-old technology to measure gas volumes because of the prohibitive costs of implementing corporate wide electronic information systems to replace circular mechanical chart technology. This paper describes how Northrock Resources Ltd. increased profitability using Smart-Alek{sup TM} while avoiding high implementation costs. Smart-Alek is a new type of fully integrated end-to-end electronic gas flow measurement (GFM) system based on Field Intelligence Network and End User Interference (FINE). Smart-Alek can analyze gas production through public wireless communications and a web-browser delivery system. The system has enabled Northrock to increase gas volumes with more accurate measurement and reduced downtime. In addition, operating costs were also decreased because the frequency of well visits was reduced and the administrative procedures of data collection were more efficient. The real-time well visibility of the tool has proven to be very effective in optimizing business profitability. 9 refs., 1 tab., 9 figs.

  12. End-to-end gene fusions and their impact on the production of multifunctional biomass degrading enzymes

    International Nuclear Information System (INIS)

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2012-01-01

    Highlights: ► Multifunctional enzymes offer an interesting approach for biomass degradation. ► Size and conformation of separate constructs play a role in the effectiveness of chimeras. ► A connecting linker allows for maximal flexibility and increased thermostability. ► Genes with functional similarities are the best choice for fusion candidates. -- Abstract: The reduction of fossil fuels, coupled with its increase in price, has made the search for alternative energy resources more plausible. One of the topics gaining fast interest is the utilization of lignocellulose, the main component of plants. Its primary constituents, cellulose and hemicellulose, can be degraded by a series of enzymes present in microorganisms, into simple sugars, later used for bioethanol production. Thermophilic bacteria have proven to be an interesting source of enzymes required for hydrolysis since they can withstand high and denaturing temperatures, which are usually required for processes involving biomass degradation. However, the cost associated with the whole enzymatic process is staggering. A solution for cost effective and highly active production is through the construction of multifunctional enzyme complexes harboring the function of more than one enzyme needed for the hydrolysis process. There are various strategies for the degradation of complex biomass ranging from the regulation of the enzymes involved, to cellulosomes, and proteins harboring more than one enzymatic activity. In this review, the construction of multifunctional biomass degrading enzymes through end-to-end gene fusions, and its impact on production and activity by choosing the enzymes and linkers is assessed.

  13. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    Science.gov (United States)

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient consisting of epoxy resin was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. Copyright © 2015. Published by Elsevier GmbH.

  14. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    Full Text Available The aim of this work is to analyze the available possibilities for improving secure messaging with end-to-end connections in the presence of an external adversary and an untrusted service provider. We made a comparative analysis of cryptographic security mechanisms for two widely used messengers: Telegram and WhatsApp. It was found that Telegram is based on the MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific features of the messengers' implementation associated with random number generation on the most popular Android mobile platform. It was shown that Signal has better security properties. It is used in several other popular messengers such as TextSecure, RedPhone, Google Allo, Facebook Messenger, and Signal, along with WhatsApp. A number of possible attacks on both messengers were analyzed in detail. In particular, we demonstrate that the metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  15. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

    Under spectrum-sharing constraints, we consider the secondary link exploiting cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We obtain the optimal boundary points in closed form to choose the AMC transmission modes by taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, without any cost in the transmitter/receiver design or the end-to-end delay, the scheme with aggressive AMC outperforms that with conventional AMC. The main reason is that, with aggressive AMC, different transmission modes utilized in the initial packet transmission and the following retransmissions match the time-varying channel conditions better than the basic pattern. © 2012 IEEE.
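
    Conceptually, AMC partitions the received SNR range with boundary points and transmits with the highest-rate mode whose boundary is met; the paper derives those boundaries in closed form under interference and packet-loss constraints. A minimal sketch of threshold-based mode selection in Python, with placeholder thresholds and rates rather than the derived boundaries:

        # illustrative AMC mode table: (SNR boundary in dB, spectral efficiency in bit/s/Hz)
        AMC_MODES = [(5.0, 0.5),   # e.g. BPSK, rate 1/2   (placeholder values)
                     (8.0, 1.0),   # e.g. QPSK, rate 1/2
                     (12.0, 2.0),  # e.g. 16-QAM, rate 1/2
                     (18.0, 4.0)]  # e.g. 64-QAM, rate 2/3

        def select_mode(snr_db):
            """Return the highest-rate mode whose SNR boundary is met, or None (outage)."""
            chosen = None
            for boundary, rate in AMC_MODES:
                if snr_db >= boundary:
                    chosen = (boundary, rate)
            return chosen

        for snr in (3, 9, 20):
            print(snr, "dB ->", select_mode(snr))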

  16. Chinese Medical Question Answer Matching Using End-to-End Character-Level Multi-Scale CNNs

    Directory of Open Access Journals (Sweden)

    Sheng Zhang

    2017-07-01

    Full Text Available This paper focuses mainly on the problem of Chinese medical question answer matching, which is arguably more challenging than open-domain question answer matching in English due to the combination of its domain-restricted nature and the language-specific features of Chinese. We present an end-to-end character-level multi-scale convolutional neural framework in which character embeddings instead of word embeddings are used to avoid Chinese word segmentation in text preprocessing, and multi-scale convolutional neural networks (CNNs are then introduced to extract contextual information from either question or answer sentences over different scales. The proposed framework can be trained with minimal human supervision and does not require any handcrafted features, rule-based patterns, or external resources. To validate our framework, we create a new text corpus, named cMedQA, by harvesting questions and answers from an online Chinese health and wellness community. The experimental results on the cMedQA dataset show that our framework significantly outperforms several strong baselines, and achieves an improvement of top-1 accuracy by up to 19%.
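
    The sentence encoder embeds characters and applies parallel convolutions of different widths whose pooled outputs are concatenated, so no word segmentation is needed. A minimal PyTorch-style sketch with illustrative vocabulary size, embedding width and kernel scales (not the paper's exact configuration):

        import torch
        import torch.nn as nn

        class MultiScaleCharCNN(nn.Module):
            """Character embedding followed by parallel Conv1d branches of different widths."""
            def __init__(self, vocab=4000, emb=128, channels=100, scales=(2, 3, 4, 5)):
                super().__init__()
                self.embed = nn.Embedding(vocab, emb)
                self.branches = nn.ModuleList(
                    [nn.Conv1d(emb, channels, k, padding=k // 2) for k in scales])

            def forward(self, char_ids):                  # (batch, seq_len) of character indices
                x = self.embed(char_ids).transpose(1, 2)  # -> (batch, emb, seq_len)
                feats = [torch.relu(conv(x)).max(dim=2).values for conv in self.branches]
                return torch.cat(feats, dim=1)            # fixed-size sentence vector

        enc = MultiScaleCharCNN()
        q = enc(torch.randint(0, 4000, (8, 60)))          # 8 questions, 60 characters each
        a = enc(torch.randint(0, 4000, (8, 60)))          # 8 candidate answers
        print(torch.cosine_similarity(q, a).shape)        # matching scores, torch.Size([8])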

  17. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  18. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  19. West Coast fish, mammal, bird life history and abundance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  20. INTEGRITY -- Integrated Human Exploration Mission Simulation Facility

    Science.gov (United States)

    Henninger, D.; Tri, T.; Daues, K.

    It is proposed to develop a high-fidelity ground facility to carry out long-duration human exploration mission simulations. These would not be merely computer simulations - they would in fact comprise a series of actual missions that just happen to stay on earth. These missions would include all elements of an actual mission, using actual technologies that would be used for the real mission. These missions would also include such elements as extravehicular activities, robotic systems, telepresence and teleoperation, surface drilling technology--all using a simulated planetary landscape. A sequence of missions would be defined that get progressively longer and more robust, perhaps a series of five or six missions over a span of 10 to 15 years ranging in duration from 180 days up to 1000 days. This high-fidelity ground facility would operate hand-in-hand with a host of other terrestrial analog sites such as the Antarctic, Haughton Crater, and the Arizona desert. Of course, all of these analog mission simulations will be conducted here on earth in 1-g, and NASA will still need the Shuttle and ISS to carry out all the microgravity and hypogravity science experiments and technology validations. The proposed missions would have sufficient definition such that definitive requirements could be derived from them to serve as direction for all the program elements of the mission. Additionally, specific milestones would be established for the "launch" date of each mission so that R&D programs would have both good requirements and solid milestones from which to build their implementation plans. Mission aspects that could not be directly incorporated into the ground facility would be simulated via software. New management techniques would be developed for evaluation in this ground test facility program. These new techniques would have embedded metrics which would allow them to be continuously evaluated and adjusted so that by the time the sequence of missions is completed

  1. Adaptation and validation of a commercial head phantom for cranial radiosurgery dosimetry end-to-end audit.

    Science.gov (United States)

    Dimitriadis, Alexis; Palmer, Antony L; Thomas, Russell A S; Nisbet, Andrew; Clark, Catharine H

    2017-06-01

    To adapt and validate an anthropomorphic head phantom for use in a cranial radiosurgery audit. Two bespoke inserts were produced for the phantom: one for providing the target and organ at risk for delineation and the other for performing dose measurements. The inserts were tested to assess their positional accuracy. A basic treatment plan dose verification with an ionization chamber was performed to establish a baseline accuracy for the phantom and beam model. The phantom and inserts were then used to perform dose verification measurements of a radiosurgery plan. The dose was measured with alanine pellets, EBT extended dose film and a plastic scintillation detector (PSD). Both inserts showed reproducible positioning (±0.5 mm) and good positional agreement between them (±0.6 mm). The basic treatment plan measurements showed agreement to the treatment planning system (TPS) within 0.5%. Repeated film measurements showed consistent gamma passing rates with good agreement to the TPS. For 2%-2 mm global gamma, the mean passing rate was 96.7% and the variation in passing rates did not exceed 2.1%. The alanine pellets and PSD showed good agreement with the TPS (-0.1% and 0.3% dose difference in the target) and good agreement with each other (within 1%). The adaptations to the phantom showed acceptable accuracies. The presence of alanine and PSD do not affect film measurements significantly, enabling simultaneous measurements by all three detectors. Advances in knowledge: A novel method for thorough end-to-end test of radiosurgery, with capability to incorporate all steps of the clinical pathway in a time-efficient and reproducible manner, suitable for a national audit.

  2. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.
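
    PAM-4 carries two bits per symbol on four amplitude levels, which is how 10G-class optics can support a 40 Gb/s line rate. A minimal sketch of Gray-coded bit-to-symbol mapping in Python (normalized levels, illustrative only):

        import numpy as np

        # Gray-coded PAM-4: two bits map to one of four normalized amplitude levels
        GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

        def pam4_modulate(bits):
            """Map an even-length bit stream to PAM-4 symbols, two bits per symbol."""
            pairs = np.asarray(bits).reshape(-1, 2)
            return np.array([GRAY_PAM4[tuple(p)] for p in pairs]) / 3.0

        bits = np.random.randint(0, 2, 16)
        symbols = pam4_modulate(bits)
        print(len(bits), "bits ->", len(symbols), "symbols:", symbols)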

  3. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang,; Hao Ma,; Aissa, Sonia

    2014-01-01

    it with transmission constraints imposed on the transmit power budget and interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario

  4. The End-to-end Demonstrator for improved decision making in the water sector in Europe (EDgE)

    Science.gov (United States)

    Wood, Eric; Wanders, Niko; Pan, Ming; Sheffield, Justin; Samaniego, Luis; Thober, Stephan; Kumar, Rohinni; Prudhomme, Christel; Houghton-Carr, Helen

    2017-04-01

    High-resolution simulations of water resources from hydrological models are vital to supporting important climate services. Apart from a high level of detail, both spatially and temporally, it is important to provide simulations that consistently cover a range of timescales, from historical reanalysis to seasonal forecast and future projections. In the new EDgE project commissioned by the ECMWF (C3S) we try to fulfill these requirements. EDgE is a proof-of-concept project which combines climate data and state-of-the-art hydrological modelling to demonstrate a water-oriented information system implemented through a web application. EDgE is working with key European stakeholders representative of private and public sectors to jointly develop and tailor approaches and techniques. With these tools, stakeholders are assisted in using improved climate information in decision-making, and supported in the development of climate change adaptation and mitigation policies. Here, we present the first results of the EDgE modelling chain, which is divided into three main processes: 1) pre-processing and downscaling; 2) hydrological modelling; 3) post-processing. Consistent downscaling and bias corrections for historical simulations, seasonal forecasts and climate projections ensure that the results across scales are robust. The daily temporal resolution and 5km spatial resolution ensure locally relevant simulations. With the use of four hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), uncertainty between models is properly addressed, while consistency is guaranteed by using identical input data for static land surface parameterizations. The forecast results are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs) that have been created in collaboration with the end-user community of the EDgE project. The final product of this project is composed of 15 years of seasonal forecast and 10 climate change projections, all combined with four hydrological
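
    Consistent downscaling and bias correction across reanalysis, seasonal forecasts and projections is central to the chain described above. As one common approach (not necessarily the project's exact method), a minimal empirical quantile-mapping sketch in Python:

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_values):
            """Empirical quantile mapping: move each model value to the observed value
            at the same quantile of the historical model distribution."""
            q = np.linspace(0.0, 1.0, 101)
            model_q = np.quantile(model_hist, q)
            obs_q = np.quantile(obs_hist, q)
            ranks = np.interp(model_values, model_q, q)   # model value -> quantile
            return np.interp(ranks, q, obs_q)             # quantile -> observed value

        rng = np.random.default_rng(0)
        obs = rng.gamma(2.0, 2.0, 1000)      # "observed" daily forcing, e.g. precipitation
        mod = rng.gamma(2.0, 2.5, 1000)      # biased model climate for the same period
        print(np.round(quantile_map(mod, obs, mod[:5]), 2))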

  5. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  6. Influence of suture technique and suture material selection on the mechanics of end-to-end and end-to-side anastomoses.

    Science.gov (United States)

    Baumgartner, N; Dobrin, P B; Morasch, M; Dong, Q S; Mrkvicka, R

    1996-05-01

    Experiments were performed in dogs to evaluate the mechanics of 26 end-to-end and 42 end-to-side artery-vein graft anastomoses constructed with continuous polypropylene sutures (Surgilene; Davis & Geck, Division of American Cyanamid Co., Danbury, Conn.), continuous polybutester sutures (Novafil; Davis & Geck), and interrupted stitches with either suture material. After construction, the grafts and adjoining arteries were excised, mounted in vitro at in situ length, filled with a dilute barium sulfate suspension, and pressurized in 25 mm Hg steps up to 200 mm Hg. Radiographs were obtained at each pressure. The computed cross-sectional areas of the anastomoses were compared with those of the native arteries at corresponding pressures. Results showed that for the end-to-end anastomoses at 100 mm Hg the cross-sectional areas of the continuous Surgilene anastomoses were 70% of the native artery cross-sectional areas, the cross-sectional areas of the continuous Novafil anastomoses were 90% of the native artery cross-sectional areas, and the cross-sectional areas of the interrupted anastomoses were 107% of the native artery cross-sectional areas (differences among suture techniques statistically significant). The end-to-side anastomoses demonstrated no differences in cross-sectional areas or compliance for the three suture techniques. This suggests that, unlike with end-to-end anastomoses, when constructing an end-to-side anastomosis in patients any of the three suture techniques may be acceptable.

  7. One stage functional end-to-end stapled intestinal anastomosis and resection performed by nonexpert surgeons for the treatment of small intestinal obstruction in 30 dogs.

    Science.gov (United States)

    Jardel, Nicolas; Hidalgo, Antoine; Leperlier, Dimitri; Manassero, Mathieu; Gomes, Aymeric; Bedu, Anne Sophie; Moissonnier, Pierre; Fayolle, Pascal; Begon, Dominique; Riquois, Elisabeth; Viateau, Véronique

    2011-02-01

    To describe stapled 1-stage functional end-to-end intestinal anastomosis for treatment of small intestinal obstruction in dogs and evaluate outcome when the technique is performed by nonexpert surgeons after limited training in the technique. Case series. Dogs (n=30) with intestinal lesions requiring an enterectomy. Stapled 1-stage functional end-to-end anastomosis and resection using GIA-60 and TA-55 stapling devices were performed under supervision of senior residents and faculty surgeons by junior surgeons previously trained in the technique on pigs. Procedure duration and technical problems were recorded. Short-term results were collected during hospitalization and at suture removal. Long-term outcome was established by clinical and ultrasonographic examinations at least 2 months after surgery and from written questionnaires completed by owners. Mean±SD procedure duration was 15±12 minutes. Postoperative recovery was uneventful in 25 dogs. One dog had anastomotic leakage, 1 had a localized abscess at the transverse staple line, and 3 dogs developed an incisional abdominal wall abscess. No long-term complications occurred (follow-up, 2-32 months). Stapled 1-stage functional end-to-end anastomosis and resection is a fast and safe procedure in the hands of nonexpert but trained surgeons. © Copyright 2011 by The American College of Veterinary Surgeons.

  8. Architecture oriented modeling and simulation method for combat mission profile

    Directory of Open Access Journals (Sweden)

    CHEN Xia

    2017-05-01

    In order to effectively analyze the system behavior and system performance of a combat mission profile, an architecture-oriented modeling and simulation method is proposed. Starting from architecture modeling, this paper describes the mission profile based on the definitions from the National Military Standard of China and the US Department of Defense Architecture Framework (DoDAF) model, and constructs the architecture model of the mission profile. Then the transformation relationship between the architecture model and the agent simulation model is proposed to form an executable model of the mission profile. Finally, taking an air-defense mission profile as an example, the agent simulation model is established based on the architecture model, and the input and output relations of the simulation model are analyzed. This provides methodological guidance for combat mission profile design.

  9. Primary and secondary structure dependence of peptide flexibility assessed by fluorescence-based measurement of end-to-end collision rates.

    Science.gov (United States)

    Huang, Fang; Hudgins, Robert R; Nau, Werner M

    2004-12-22

    The intrachain fluorescence quenching of the fluorophore 2,3-diazabicyclo[2.2.2]oct-2-ene (DBO) is measured in short peptide fragments, namely the two strands and the turn of the N-terminal beta-hairpin of ubiquitin. The investigated peptides adopt a random-coil conformation in aqueous solution according to CD and NMR experiments. The combination of quenchers with different quenching efficiencies, namely tryptophan and tyrosine, allows the extrapolation of the end-to-end collision rates as well as the dissociation rate of the end-to-end encounter complex. The measured activation energies for fluorescence quenching demonstrate that the end-to-end collision process in peptides is partially controlled by internal friction within the backbone, while measurements in solvents of different viscosities (H2O, D2O, and 7.0 M guanidinium chloride) suggest that solvent friction is an additional important factor in determining the collision rate. The extrapolated end-to-end collision rates, which are only slightly larger than the experimental rates for the DBO/Trp probe/quencher system, provide a measure of the conformational flexibility of the peptide backbone. The chain flexibility is found to be strongly dependent on the type of secondary structure that the peptides represent. The collision rates for peptides derived from the beta-strand motifs (ca. 1 × 10^7 s^-1) are ca. 4 times slower than that derived from the beta-turn. The results provide further support for the hypothesis that chain flexibility is an important factor in the preorganization of protein fragments during protein folding. Mutations to the beta-turn peptide show that subtle sequence changes strongly affect the flexibility of peptides as well. The protonation and charge status of the peptides, however, are shown to have no significant effect on the flexibility of the investigated peptides. The meaning and definition of end-to-end collision rates in the context of protein folding are critically discussed.
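
    The extrapolation mentioned above is commonly rationalised with a pre-equilibrium scheme in which chain contact formation is followed by quenching inside the encounter complex; this specific formulation is an assumption made here for illustration, and the paper should be consulted for the exact model. With k+ the end-to-end collision rate, k- the dissociation rate of the encounter complex, and kq the intrinsic quenching rate, the observed rate is

        \[ k_{\mathrm{obs}} \;=\; \frac{k_{+}\,k_{q}}{k_{-} + k_{q}} \]

    so an efficient quencher (kq much larger than k-, e.g. tryptophan) gives k_obs approximately equal to k+, a nearly diffusion-controlled measure of the collision rate, while a less efficient quencher (e.g. tyrosine) probes k+ kq / k-; combining both allows k+ and k- to be extracted by extrapolation.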

  10. Next Generation Simulation Framework for Robotic and Human Space Missions

    Science.gov (United States)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  11. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    Science.gov (United States)

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since the patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic to handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based feature vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries on a dataset of 41 volumetric CT scans from the 2014 Image CLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without/with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather than on disease diagnosis, may help radiologists to better diagnose liver lesions.
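
    The retrieval step described above (histogram features over the segmented liver, weighted by learned importance, then ranking) can be sketched as follows. This is an illustrative reconstruction rather than the authors' code; the weights, distance measure, and parameter values are placeholders.

        import numpy as np

        def liver_histogram_features(ct_volume_hu, liver_mask, bins=32, hu_range=(-100, 300)):
            """Normalised intensity histogram of the segmented liver region."""
            voxels = ct_volume_hu[liver_mask > 0]
            hist, _ = np.histogram(voxels, bins=bins, range=hu_range)
            return hist / max(hist.sum(), 1)

        def rank_database(query_feat, db_feats, weights):
            """Return database indices ordered by weighted L1 distance to the query."""
            dists = [np.sum(weights * np.abs(query_feat - f)) for f in db_feats]
            return np.argsort(dists)

        # Hypothetical usage: `db_feats` precomputed for all database scans, `weights`
        # learned from annotated examples (uniform weights shown here).
        # order = rank_database(query_feat, db_feats, np.ones(32) / 32)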

  12. Experience of using MOSFET detectors for dose verification measurements in an end-to-end 192Ir brachytherapy quality assurance system.

    Science.gov (United States)

    Persson, Maria; Nilsson, Josef; Carlsson Tedgren, Åsa

    Establishment of an end-to-end system for the brachytherapy (BT) dosimetric chain could be valuable in clinical quality assurance. Here, the development of such a system using MOSFET (metal oxide semiconductor field effect transistor) detectors and experience gained during 2 years of use are reported with focus on the performance of the MOSFET detectors. A bolus phantom was constructed with two implants, mimicking prostate and head & neck treatments, using steel needles and plastic catheters to guide the 192Ir source and house the MOSFET detectors. The phantom was taken through the BT treatment chain from image acquisition to dose evaluation. During the 2-year evaluation-period, delivered doses were verified a total of 56 times using MOSFET detectors which had been calibrated in an external 60Co beam. An initial experimental investigation on beam quality differences between 192Ir and 60Co is reported. The standard deviation in repeated MOSFET measurements was below 3% in the six measurement points with dose levels above 2 Gy. MOSFET measurements overestimated treatment planning system doses by 2-7%. Distance-dependent experimental beam quality correction factors derived in a phantom of similar size as that used for end-to-end tests applied on a time-resolved measurement improved the agreement. MOSFET detectors provide values stable over time and function well for use as detectors for end-to-end quality assurance purposes in 192Ir BT. Beam quality correction factors should address not only distance from source but also phantom dimensions. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
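
    The dose evaluation implied above reduces to converting a MOSFET reading to dose with the 60Co calibration coefficient and then applying a beam-quality correction for 192Ir that, as the study argues, depends on distance and phantom size. The sketch below is a minimal illustration with hypothetical names and values, not the clinic's procedure.

        def mosfet_dose(delta_v_mV, n_cal_mV_per_cGy, k_quality):
            """Convert a MOSFET threshold-voltage shift to dose.

            delta_v_mV       : measured threshold-voltage shift (mV)
            n_cal_mV_per_cGy : sensitivity from the 60Co calibration (mV/cGy)
            k_quality        : beam-quality correction for 192Ir at the measurement
                               distance/phantom geometry (dimensionless, assumed known)
            """
            return delta_v_mV / n_cal_mV_per_cGy * k_quality

        # Example: 280 mV shift, 1.0 mV/cGy sensitivity, 5% beam-quality correction,
        # compared against a hypothetical TPS-predicted dose of 270 cGy.
        dose_cGy = mosfet_dose(280.0, 1.0, 1.05)
        deviation = (dose_cGy - 270.0) / 270.0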

  13. Apollo 16 astronauts in Apollo Command Module Mission Simulator

    Science.gov (United States)

    1972-01-01

    Astronaut Thomas K. Mattingly II, command module pilot of the Apollo 16 lunar landing mission, participates in extravehicular activity (EVA) training in bldg 5 at the Manned Spacecraft Center (MSC). In the right background is Astronaut Charles M. Duke Jr., lunar module pilot. They are inside the Apollo Command Module Mission Simulator (31046); Mattingly (right foreground) and Duke (right background) in the Apollo Command Module Mission Simulator for EVA simulation and training. Astronaut John W. Young, commander, can be seen in the left background (31047).

  14. Modeling and Simulation for Mission Operations Work System Design

    Science.gov (United States)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work System analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project-the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  15. Interoperable End-to-End Remote Patient Monitoring Platform Based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2018-05-01

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7, and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  16. User-oriented end-to-end transport protocols for the real-time distribution of telemetry data from NASA spacecraft

    Science.gov (United States)

    Hooke, A. J.

    1979-01-01

    A set of standard telemetry protocols for downlink data flow facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus organized on a message rather than a word basis, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol contains identification of both the sequence number of the packet as it is generated by the source and the total length of the packet, while the transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
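
    The packet structure described above, a source packet carrying its own sequence number and total length, wrapped in a transport frame carrying a frame sequence count, can be illustrated with a simple byte-level encoder. Field names and widths below are hypothetical and are not the NEEDS field layout.

        import struct

        def build_source_packet(apid, seq_number, payload):
            """Source packet: application ID, packet sequence number, total length, then data."""
            total_len = 8 + len(payload)                 # 8-byte header plus data
            header = struct.pack(">HHI", apid, seq_number & 0xFFFF, total_len)
            return header + payload

        def build_transport_frame(spacecraft_id, frame_count, packets):
            """Transport frame: spacecraft ID, frame sequence count, concatenated packets."""
            body = b"".join(packets)
            header = struct.pack(">HI", spacecraft_id, frame_count & 0xFFFFFFFF)
            return header + body

        pkt = build_source_packet(apid=0x1A, seq_number=42, payload=b"\x01\x02\x03")
        frame = build_transport_frame(spacecraft_id=0x55, frame_count=7, packets=[pkt])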

  17. Ferromagnetic interaction in an asymmetric end-to-end azido double-bridged copper(II) dinuclear complex: a combined structure, magnetic, polarized neutron diffraction and theoretical study.

    Science.gov (United States)

    Aronica, Christophe; Jeanneau, Erwann; El Moll, Hani; Luneau, Dominique; Gillon, Béatrice; Goujon, Antoine; Cousson, Alain; Carvajal, Maria Angels; Robert, Vincent

    2007-01-01

    A new end-to-end azido double-bridged copper(II) complex [Cu2L2(N3)2] (1) was synthesized and characterized (L=1,1,1-trifluoro-7-(dimethylamino)-4-methyl-5-aza-3-hepten-2-onato). Despite the rather long Cu-Cu distance (5.105(1) Å), the magnetic interaction is ferromagnetic with J = +16 cm^-1 (H = -J S1·S2), a value that has been confirmed by DFT and high-level correlated ab initio calculations. The spin distribution was studied by using the results from polarized neutron diffraction. This is the first such study on an end-to-end system. The experimental spin density was found to be localized mainly on the copper(II) ions, with a small degree of delocalization on the ligand (L) and terminal azido nitrogens. There was zero delocalization on the central nitrogen, in agreement with DFT calculations. Such a picture corresponds to an important contribution of the d(x2-y2) orbital and a small population of the d(z2) orbital, in agreement with our calculations. Based on a correlated wavefunction analysis, the ferromagnetic behavior results from a dominant double spin polarization contribution and vanishingly small ionic forms.
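
    For readers less familiar with the exchange convention quoted above: with H = -J S1·S2 for two S = 1/2 copper(II) ions, the singlet and triplet energies follow from S1·S2 = [S(S+1) - 3/2]/2, so

        \[ E_{T}(S=1) = -\tfrac{J}{4}, \qquad E_{S}(S=0) = +\tfrac{3J}{4}, \qquad E_{S} - E_{T} = J . \]

    A fitted J = +16 cm^-1 therefore places the spin triplet 16 cm^-1 below the singlet, i.e. a ferromagnetic ground state, consistent with the polarized neutron diffraction and ab initio analysis.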

  18. Simulation and debriefing in neonatology 2016: Mission incomplete.

    Science.gov (United States)

    Halamek, Louis P

    2016-11-01

    Simulation can be an effective tool to facilitate the acquisition and maintenance of the cognitive, technical and behavioral skills necessary to carry out our mission in neonatology: the delivery of safe, effective and efficient care to our patients. Prominent examples of successful implementation of simulation within neonatology include the Neonatal Resuscitation Program, the International Pediatric Simulation Society, and the International Network for Simulation-Based Pediatric Innovation, Research and Education. Despite these successes much remains to be accomplished. Expanding simulation beyond technical skill acquisition, using simulated environments to conduct research into human and system performance, incorporating simulation into high-stakes skill assessments, embracing the expertise of the more extensive modeling and simulation community and, in general, applying simulation to healthcare with the same degree of gravitas with which it is deployed in other high-risk industries are all tasks that must be completed in order to achieve our mission. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    Science.gov (United States)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

    The pricing for wireless networks is developed by considering linearity factors, price elasticity and price factors. A mixed integer nonlinear programming (MINLP) wireless pricing model is proposed and solved optimally using LINGO 13.0. The solutions are expected to give some information about the connections between the acceptance factor and the price. Previous models focused on bandwidth as the only QoS attribute and attempted to maximize the total price for a connection based on that QoS parameter. The QoS attributes used here are bandwidth and end-to-end delay, which affect the traffic. The maximum total price is achieved when the provider determines the required price increment or decrement in response to changes in QoS values.
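
    A generic form of the kind of pricing model described above, shown only for illustration since the paper's exact objective, constraints and notation differ, maximizes revenue over a base price and QoS-dependent premiums subject to bandwidth and end-to-end delay requirements:

        \[ \max_{p,\; x_{i}\in\{0,1\}} \;\; \sum_{i} x_{i}\bigl(p + \alpha\, b_{i} + \beta / d_{i}\bigr)
           \quad \text{s.t.} \quad \sum_{i} x_{i} b_{i} \le B_{\max}, \qquad
           d_{i} \le D_{\max}\ \ \forall i:\ x_{i}=1, \qquad
           p_{\min} \le p \le p_{\max}, \]

    where b_i and d_i are the bandwidth and end-to-end delay offered to user i, alpha and beta are price factors, and the binary x_i select admitted connections; the product of the continuous price p with the binary x_i is what makes the program mixed integer and nonlinear.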

  20. Crystal structure of Aquifex aeolicus gene product Aq1627: a putative phosphoglucosamine mutase reveals a unique C-terminal end-to-end disulfide linkage.

    Science.gov (United States)

    Sridharan, Upasana; Kuramitsu, Seiki; Yokoyama, Shigeyuki; Kumarevel, Thirumananseri; Ponnuraj, Karthe

    2017-06-27

    The Aq1627 gene from Aquifex aeolicus, a hyperthermophilic bacterium has been cloned and overexpressed in Escherichia coli. The protein was purified to homogeneity and its X-ray crystal structure was determined to 1.3 Å resolution using multiple wavelength anomalous dispersion phasing. The structural and sequence analysis of Aq1627 is suggestive of a putative phosphoglucosamine mutase. The structural features of Aq1627 further indicate that it could belong to a new subclass of the phosphoglucosamine mutase family. Aq1627 structure contains a unique C-terminal end-to-end disulfide bond, which links two monomers and this structural information can be used in protein engineering to make proteins more stable in different applications.

  1. Reconstruction after ureteral resection during HIPEC surgery: Re-implantation with uretero-neocystostomy seems safer than end-to-end anastomosis.

    Science.gov (United States)

    Pinar, U; Tremblay, J-F; Passot, G; Dazza, M; Glehen, O; Tuech, J-J; Pocard, M

    2017-09-01

    Resection of the pelvic ureter may be necessary in cytoreductive surgery for peritoneal carcinomatosis in combination with hyperthermic intraperitoneal chemotherapy (HIPEC). As the morbidity for cytoreductive surgery with HIPEC has decreased, expert teams have begun to perform increasingly complex surgical procedures associated with HIPEC, including pelvic reconstructions. After ureteral resection, two types of reconstruction are possible: uretero-ureteral end-to-end anastomosis and uretero-vesical re-implantation or uretero-neocystostomy (the so-called psoas hitch technique). By compiling the experience of three surgical teams that perform HIPEC surgeries, we have tried to compare the effectiveness of these two techniques. A retrospective comparative case-matched multicenter study was conducted for patients undergoing operation between 2005 and 2014. Patients included had undergone resection of the pelvic ureter during cytoreductive surgery with HIPEC for peritoneal carcinomatosis; ureteral reconstruction was by either end-to-end anastomosis (EEA group) or re-implantation uretero-neocystostomy (RUC group). The primary endpoint was the occurrence of urinary fistula in postoperative follow-up. There were 14 patients in the EEA group and 14 in the RUC group. The groups were comparable for age, extent of carcinomatosis (PCI index) and operative duration. Four urinary fistulas occurred in the EEA group (28.5%) versus zero fistulas in the RUC group (0%) (P=0.0308). Re-implantation with uretero-neocystostomy during cytoreductive surgery with HIPEC is the preferred technique for reconstruction after ureteral resection in case of renal conservation. Copyright © 2017. Published by Elsevier Masson SAS.

  2. Poster - 44: Development and implementation of a comprehensive end-to-end testing methodology for linac-based frameless SRS QA using a modified commercial stereotactic anthropomorphic phantom

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Derek; Mutanga, Theodore [University of Toronto, Carlo Fidani Peel Regional Cancer Center (Canada)

    2016-08-15

    Purpose: An end-to-end testing methodology was designed to evaluate the overall SRS treatment fidelity, incorporating all steps in the linac-based frameless radiosurgery treatment delivery process. The study details our commissioning experience of the Steev (CIRS, Norfolk, VA) stereotactic anthropomorphic head phantom including modification, test design, and baseline measurements. Methods: Repeated MR and CT scans were performed with interchanging inserts. MR-CT fusion accuracy was evaluated and the insert spatial coincidence was verified on CT. Five non-coplanar arcs delivered a prescription dose to a 15 mm spherical CTV with 2 mm PTV margin. Following setup, CBCT-based shifts were applied as per protocol. Sequential measurements were performed by interchanging inserts without disturbing the setup. Spatial and dosimetric accuracy was assessed by a combination of CBCT hidden target, radiochromic film, and ion chamber measurements. To facilitate film registration, the film insert was modified in-house by etching marks. Results: MR fusion error and insert spatial coincidences were within 0.3 mm. Both CBCT and film measurements showed spatial displacements of 1.0 mm in similar directions. Both coronal and sagittal films reported 2.3 % higher target dose relative to the treatment plan. The corrected ion chamber measurement was similarly greater by 1.0 %. The 3 %/2 mm gamma pass rate was 99% for both films. Conclusions: A comprehensive end-to-end testing methodology was implemented for our SRS QA program. The Steev phantom enabled realistic evaluation of the entire treatment process. Overall spatial and dosimetric accuracy of the delivery were 1 mm and 3 %, respectively.

  3. A Validation Approach of an End-to-End Whole Genome Sequencing Workflow for Source Tracking of Listeria monocytogenes and Salmonella enterica

    Directory of Open Access Journals (Sweden)

    Anne-Catherine Portmann

    2018-03-01

    Whole genome sequencing (WGS), using high throughput sequencing technology, reveals the complete sequence of the bacterial genome in a few days. WGS is increasingly being used for source tracking, pathogen surveillance and outbreak investigation due to its high discriminatory power. In the food industry, WGS used for source tracking is beneficial to support contamination investigations. Despite its increased use, no standards or guidelines are available today for the use of WGS in outbreak and/or trace-back investigations. Here we present a validation of our complete (end-to-end) WGS workflow for Listeria monocytogenes and Salmonella enterica including: subculture of isolates, DNA extraction, sequencing and bioinformatics analysis. This end-to-end WGS workflow was evaluated according to the following performance criteria: stability, repeatability, reproducibility, discriminatory power, and epidemiological concordance. The current study showed that few single nucleotide polymorphisms (SNPs) were observed for L. monocytogenes and S. enterica when comparing genome sequences from five independent colonies from the first subculture and five independent colonies after the tenth subculture. Consequently, the stability of the WGS workflow for L. monocytogenes and S. enterica was demonstrated despite the few genomic variations that can occur during subculturing steps. Repeatability and reproducibility were also demonstrated. The WGS workflow was shown to have a high discriminatory power and has the ability to show genetic relatedness. Additionally, the WGS workflow was able to reproduce published outbreak investigation results, illustrating its capability of showing epidemiological concordance. The current study proposes a validation approach comprising all steps of a WGS workflow and demonstrates that the workflow can be applied to L. monocytogenes or S. enterica.
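
    Computationally, the stability and repeatability criteria above reduce to counting SNP differences between genomes of repeated isolates. A toy pairwise SNP-distance count on aligned sequences is sketched below; this is purely illustrative, as real pipelines typically work on variant calls from a reference-based alignment.

        def snp_distance(seq_a, seq_b, ignore="N-"):
            """Count differing positions between two aligned sequences,
            skipping columns containing ambiguous bases or gaps."""
            if len(seq_a) != len(seq_b):
                raise ValueError("sequences must be aligned to the same length")
            return sum(
                1
                for a, b in zip(seq_a.upper(), seq_b.upper())
                if a != b and a not in ignore and b not in ignore
            )

        # E.g. comparing a colony from the first subculture with one from the tenth:
        # a distance of only a few SNPs would support the workflow-stability claim above.
        print(snp_distance("ACGTACGT", "ACGTACGA"))  # -> 1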

  4. SU-F-T-76: Total Skin Electron Therapy: An-End-To-End Examination of the Absolute Dosimetry with a Rando Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Cui, G; Ha, J; Zhou, S; Cui, J; Shiu, A [University Southern California, Los Angeles, CA (United States)

    2016-06-15

    Purpose: To examine and validate the absolute dose for total skin electron therapy (TSET) through an end-to-end test with a Rando phantom using optically stimulated luminescent dosimeters (OSLDs) and EBT3 radiochromic films. Methods: A Varian Trilogy linear accelerator equipped with the special procedure 6 MeV HDTSe- was used to perform TSET irradiations using a modified Stanford 6-dual-field technique. The absolute dose was calibrated using a Markus ion chamber at a reference depth of 1.3 cm at 100 cm SSD with a field size of 36 × 36 cm at the isocenter in solid water slabs. The absolute dose was cross validated by a Farmer ion chamber. Then the dose rate in units of cGy/MU was calibrated using the Markus chamber at the treatment position. OSLDs were used to independently verify the dose using the calibrated dose rate. Finally, a patient treatment plan (200 cGy/cycle) was delivered in the QA mode to a Rando phantom, which had 16 pairs of OSLDs and EBT3 films taped onto its surface at different anatomical positions. The doses recorded were read out to validate the absolute dosimetry for TSET. Results: The OSLD measurements were within 7% agreement with the planned dose except the shoulder areas, where the doses recorded were 23% lower on average than those of the planned. The EBT3 film measurements were within 10% agreement with the planned dose except the shoulder and the scalp vertex areas, where the respective doses recorded were 18% and 14% lower on average than those of the planned. The OSLDs gave more consistent dose measurements than those of the EBT3 films. Conclusion: The absolute dosimetry for TSET was validated by an end-to-end test with a Rando phantom using the OSLDs and EBT3 films. The beam calibration and monitor unit calculations were confirmed.

  5. STS-49 crew in JSC's FB Shuttle Mission Simulator (SMS) during simulation

    Science.gov (United States)

    1992-01-01

    STS-49 Endeavour, Orbiter Vehicle (OV) 105, crewmembers participate in a simulation in JSC's Fixed Base (FB) Shuttle Mission Simulator (SMS) located in the Mission Simulation and Training Facility Bldg 5. Wearing launch and entry suits (LESs) and launch and entry helmets (LEH) and seated on the FB-SMS middeck are (left to right) Mission Specialist (MS) Thomas D. Akers, MS Kathryn C. Thornton, and MS Pierre J. Thuot.

  6. Modeling and Simulation for Multi-Missions Space Exploration Vehicle

    Science.gov (United States)

    Chang, Max

    2011-01-01

    Asteroids and Near-Earth Objects [NEOs] are of great interest for future space missions. The Multi-Mission Space Exploration Vehicle [MMSEV] is being considered for future Near Earth Object missions and requires detailed planning and study of its Guidance, Navigation, and Control [GNC]. A possible mission of the MMSEV to a NEO would be to navigate the spacecraft to a stationary orbit with respect to the rotating asteroid and proceed to anchor into the surface of the asteroid with robotic arms. The Dynamics and Real-Time Simulation [DARTS] laboratory develops reusable models and simulations for the design and analysis of missions. In this paper, the development of guidance and anchoring models are presented together with their role in achieving mission objectives and relationships to other parts of the simulation. One important aspect of guidance is in developing methods to represent the evolution of kinematic frames related to the tasks to be achieved by the spacecraft and its robot arms. In this paper, we compare various types of mathematical interpolation methods for position and quaternion frames. Subsequent work will be on analyzing the spacecraft guidance system with different movements of the arms. With the analyzed data, the guidance system can be adjusted to minimize the errors in performing precision maneuvers.
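
    One of the comparisons mentioned above, interpolation of quaternion attitude frames, typically contrasts normalised linear interpolation with spherical linear interpolation (slerp); the latter sweeps the rotation at a constant angular rate. The sketch below is a minimal illustration, not the DARTS implementation.

        import numpy as np

        def nlerp(q0, q1, t):
            """Normalised linear interpolation between unit quaternions."""
            q = (1.0 - t) * q0 + t * q1
            return q / np.linalg.norm(q)

        def slerp(q0, q1, t):
            """Spherical linear interpolation; constant angular velocity."""
            dot = np.dot(q0, q1)
            if dot < 0.0:                      # take the short way around
                q1, dot = -q1, -dot
            if dot > 0.9995:                   # nearly parallel: fall back to nlerp
                return nlerp(q0, q1, t)
            theta = np.arccos(np.clip(dot, -1.0, 1.0))
            return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

        # Identity to a 90-degree rotation about z, sampled at the midpoint:
        q_a = np.array([1.0, 0.0, 0.0, 0.0])                          # (w, x, y, z)
        q_b = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
        q_mid = slerp(q_a, q_b, 0.5)                                  # 45-degree rotation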

  7. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study.

    Science.gov (United States)

    Bowen, S R; Nyflot, M J; Herrmann, C; Groh, C M; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-05-07

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the magnitude
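
    The target-volume step above, SUV-threshold segmentation of the PET lesion, can be sketched as follows; the threshold strategy shown (a fraction of SUVmax) is a common choice assumed here for illustration and is not necessarily the exact one used in the study.

        import numpy as np

        def suv_threshold_volume(suv_volume, voxel_volume_ml, frac_of_max=0.40):
            """Segment the lesion at a fixed fraction of SUVmax and return (mask, volume in ml)."""
            threshold = frac_of_max * suv_volume.max()
            mask = suv_volume >= threshold
            return mask, mask.sum() * voxel_volume_ml

        # Example: the imaging error would then be the relative difference in segmented
        # volume (or mean SUV inside the mask) between the motion-blurred (3D) scan and
        # the motionless ground-truth scan.
        # mask_gt, vol_gt = suv_threshold_volume(suv_static, voxel_ml)
        # mask_3d, vol_3d = suv_threshold_volume(suv_3d, voxel_ml)
        # volume_error = (vol_3d - vol_gt) / vol_gt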

  8. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    International Nuclear Information System (INIS)

    Bowen, S R; Nyflot, M J; Meyer, J; Sandison, G A; Herrmann, C; Groh, C M; Wollenweber, S D; Stearns, C W; Kinahan, P E

    2015-01-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow.

  9. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    Science.gov (United States)

    Bowen, S R; Nyflot, M J; Hermann, C; Groh, C; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by 6 different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy (VMAT) were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses (EUD), and 2%-2mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the

  10. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    Science.gov (United States)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed EDL trajectory modeling and simulation. This paper summarizes how the MER EDL sequence of events is modeled, verification of the methods used, and the inputs. This simulation is built upon a multibody parachute trajectory simulation tool developed in POST II that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 Degree-of-Freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several Entry, Descent, Landing (EDL) events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations will remain numerically stable during Monte-Carlo simulations. This paper also summarizes how the events have been modeled, the numerical issues, and modeling challenges.

  11. Presence of calcium in the vessel walls after end-to-end arterial anastomoses with polydioxanone and polypropylene sutures in growing dogs.

    Science.gov (United States)

    Gersak, B

    1993-10-01

    The presence of calcium in the vessel walls after end-to-end arterial anastomoses performed with polydioxanone and polypropylene interrupted sutures was studied in 140 anastomoses in 35 10-week-old German shepherd dogs. Histologic examination with hematoxylin and eosin, van Gieson, and von Kossa staining techniques was performed after the animals were killed 6 months after the operation. Ketamine hydrochloride was used as an anesthetic agent. At the start of the investigation the dogs weighed 14.5 +/- 2.6 kg (mean +/- standard deviation, n = 35), and after 6 months they weighed 45.3 +/- 3.1 kg (mean +/- standard deviation, n = 35). The diameter of the sutured arteries in the first operation was 2.6 +/- 0.5 mm (mean +/- standard deviation, n = 140). With each dog, both brachial and both femoral arteries were used--one artery for each different type of suture. In different dogs, different arteries were used for the same type of suture. The prevalence of calcifications after 6 months was determined from the numeric density of calcifications with standard stereologic techniques. The sutured and sutureless parts taken from longitudinal sections from each artery were studied, and t test values were calculated. In paired samples, statistically significant differences in the numerical density of calcifications were seen between sutured and sutureless arterial parts for both materials (sutureless parts versus parts with polydioxanone sutures and versus parts with polypropylene sutures), whereas no significant differences were found between parts sutured with the two materials (p > 0.05, n = 70) or between sutureless parts (p > 0.05, n = 70).

  12. Poly(ethyl glyoxylate)-Poly(ethylene oxide) Nanoparticles: Stimuli-Responsive Drug Release via End-to-End Polyglyoxylate Depolymerization.

    Science.gov (United States)

    Fan, Bo; Gillies, Elizabeth R

    2017-08-07

    The ability to disrupt polymer assemblies in response to specific stimuli provides the potential to release drugs selectively at certain sites or conditions in vivo. However, most stimuli-responsive delivery systems require many stimuli-initiated events to release drugs. "Self-immolative polymers" offer the potential to provide amplified responses to stimuli as they undergo complete end-to-end depolymerization following the cleavage of a single end-cap. Herein, linker end-caps were developed to conjugate self-immolative poly(ethyl glyoxylate) (PEtG) with poly(ethylene oxide) (PEO) to form amphiphilic block copolymers. These copolymers were self-assembled to form nanoparticles in aqueous solution. Cleavage of the linker end-caps was triggered by a thiol reducing agent, UV light, H2O2, and combinations of these stimuli, resulting in nanoparticle disintegration. Low stimuli concentrations were effective in rapidly disrupting the nanoparticles. Nile red, doxorubicin, and curcumin were encapsulated into the nanoparticles and were selectively released upon application of the appropriate stimulus. The ability to tune the stimuli-responsiveness simply by changing the linker end-cap makes this new platform highly attractive for applications in drug delivery.

  13. System for Informatics in the Molecular Pathology Laboratory: An Open-Source End-to-End Solution for Next-Generation Sequencing Clinical Data Management.

    Science.gov (United States)

    Kang, Wenjun; Kadri, Sabah; Puranik, Rutika; Wurst, Michelle N; Patil, Sushant A; Mujacic, Ibro; Benhamed, Sonia; Niu, Nifang; Zhen, Chao Jie; Ameti, Bekim; Long, Bradley C; Galbo, Filipo; Montes, David; Iracheta, Crystal; Gamboa, Venessa L; Lopez, Daisy; Yourshaw, Michael; Lawrence, Carolyn A; Aisner, Dara L; Fitzpatrick, Carrie; McNerney, Megan E; Wang, Y Lynn; Andrade, Jorge; Volchenboum, Samuel L; Furtado, Larissa V; Ritterhouse, Lauren L; Segal, Jeremy P

    2018-04-24

    Next-generation sequencing (NGS) diagnostic assays increasingly are becoming the standard of care in oncology practice. As the scale of an NGS laboratory grows, management of these assays requires organizing large amounts of information, including patient data, laboratory processes, genomic data, as well as variant interpretation and reporting. Although several Laboratory Information Systems and/or Laboratory Information Management Systems are commercially available, they may not meet all of the needs of a given laboratory, in addition to being frequently cost-prohibitive. Herein, we present the System for Informatics in the Molecular Pathology Laboratory, a free and open-source Laboratory Information System/Laboratory Information Management System for academic and nonprofit molecular pathology NGS laboratories, developed at the Genomic and Molecular Pathology Division at the University of Chicago Medicine. The System for Informatics in the Molecular Pathology Laboratory was designed as a modular end-to-end information system to handle all stages of the NGS laboratory workload from test order to reporting. We describe the features of the system, its clinical validation at the Genomic and Molecular Pathology Division at the University of Chicago Medicine, and its installation and testing within a different academic center laboratory (University of Colorado), and we propose a platform for future community co-development and interlaboratory data sharing. Copyright © 2018. Published by Elsevier Inc.

  14. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang,

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. For the purpose of evaluating the performance of the proposed scheme, we exploit it with transmission constraints imposed on the transmit power budget and interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario under study. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
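
    The selection rule described above can be written down directly: keep only relays whose first-hop packet loss rate meets the QoS target, then pick the one with the largest second-hop SNR. A minimal sketch with hypothetical data structures (the decode-and-forward relaying details are omitted):

        def select_relay(relays, plr_target):
            """Partial QoS-aware opportunistic relay selection.

            relays     : iterable of (relay_id, first_hop_plr, second_hop_snr_db)
            plr_target : maximum tolerated first-hop packet loss rate
            Returns the chosen relay_id, or None if no relay meets the QoS level.
            """
            qualified = [r for r in relays if r[1] <= plr_target]
            if not qualified:
                return None
            return max(qualified, key=lambda r: r[2])[0]

        # Only relays B and C meet the 1% first-hop PLR target; C wins on second-hop SNR.
        candidates = [("A", 0.05, 18.0), ("B", 0.008, 12.5), ("C", 0.004, 15.2)]
        assert select_relay(candidates, plr_target=0.01) == "C"

        # Only the qualified relays need to feed back their second-hop SNR,
        # which is the feedback reduction the scheme is after.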

  15. Albert-Lembert versus hybrid-layered suture in hand sewn end-to-end cervical esophagogastric anastomosis after esophageal squamous cell carcinoma resection.

    Science.gov (United States)

    Feng, Fan; Sun, Li; Xu, Guanghui; Hong, Liu; Yang, Jianjun; Cai, Lei; Li, Guocai; Guo, Man; Lian, Xiao; Zhang, Hongwei

    2015-11-01

    Hand sewn cervical esophagogastric anastomosis (CEGA) is regarded as the preferred technique by surgeons after esophagectomy. However, considering the anastomotic leakage and stricture, the optimal technique for performing this anastomosis is still under debate. Between November 2010 and September 2012, 230 patients who underwent esophagectomy with hand sewn end-to-end (ETE) CEGA for esophageal squamous cell carcinoma (ESCC) were analyzed retrospectively, including 111 patients who underwent Albert-Lembert suture anastomosis and 119 patients who underwent hybrid-layered suture anastomosis. Anastomosis construction time was recorded during operation. Anastomotic leakage was recorded through upper gastrointestinal water-soluble contrast examination. Anastomotic stricture was recorded during follow-up. The hybrid-layered suture was faster than the Albert-Lembert suture (29.40±1.24 min vs. 33.83±1.41 min, P=0.02). The overall anastomotic leak rate was 7.82%; the leak rate in the hybrid-layered suture group was significantly lower than that in the Albert-Lembert suture group (3.36% vs. 12.61%, P=0.01). The overall anastomotic stricture rate was 9.13%; the stricture rate in the hybrid-layered suture group was significantly lower than that in the Albert-Lembert suture group (5.04% vs. 13.51%, P=0.04). Hand sewn ETE CEGA with hybrid-layered suture is associated with lower anastomotic leakage and stricture rates compared to hand sewn ETE CEGA with Albert-Lembert suture.

  16. Stapled side-to-side anastomosis might be better than handsewn end-to-end anastomosis in ileocolic resection for Crohn's disease: a meta-analysis.

    Science.gov (United States)

    He, Xiaosheng; Chen, Zexian; Huang, Juanni; Lian, Lei; Rouniyar, Santosh; Wu, Xiaojian; Lan, Ping

    2014-07-01

    Ileocolic anastomosis is an essential step in the treatment to restore continuity of the gastrointestinal tract following ileocolic resection in patients with Crohn's disease (CD). However, the association between anastomotic type and surgical outcome is controversial. The aim of this meta-analysis is to compare surgical outcomes between stapled side-to-side anastomosis (SSSA) and handsewn end-to-end anastomosis (HEEA) after ileocolic resection in patients with CD. Studies comparing SSSA with HEEA after ileocolic resection in patients with CD were identified in PubMed and EMBASE. Outcomes such as complication, recurrence, and re-operation were evaluated. Eight studies (three randomized controlled trials, one prospective non-randomized trial, and four non-randomized retrospective trials) comparing SSSA (396 cases) and HEEA (425 cases) were included. As compared with HEEA, SSSA was superior in terms of overall postoperative complications [odds ratio (OR), 0.54; 95 % confidence interval (CI) 0.32-0.93], anastomotic leak (OR 0.45; 95 % CI 0.20-1.00), recurrence (OR 0.20; 95 % CI 0.07-0.55), and re-operation for recurrence (OR 0.18; 95 % CI 0.07-0.45). Postoperative hospital stay, mortality, and complications other than anastomotic leak were comparable. Based on the results of our meta-analysis, SSSA would appear to be the preferred procedure after ileocolic resection for CD, with reduced overall postoperative complications, especially anastomotic leak, and a decreased recurrence and re-operation rate.

  17. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed, ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
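
    The "network striped disk array" idea above amounts to a deterministic mapping from a logical block of a large data-object to a (server, local block) pair, so that clients can fetch stripes from many servers in parallel. The toy round-robin layout below is illustrative only; as noted in the abstract, the DPSS lets applications choose their own layout.

        def locate_block(logical_block, n_servers, stripe_blocks=1):
            """Map a logical block index to (server_index, block_index_on_server)."""
            stripe = logical_block // stripe_blocks           # which stripe unit
            server = stripe % n_servers                       # round-robin across servers
            local = (stripe // n_servers) * stripe_blocks + logical_block % stripe_blocks
            return server, local

        # A client reading blocks 0..7 of an object striped over 4 servers issues
        # two parallel requests per server instead of eight sequential ones.
        layout = [locate_block(b, n_servers=4) for b in range(8)]
        # -> [(0, 0), (1, 0), (2, 0), (3, 0), (0, 1), (1, 1), (2, 1), (3, 1)]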

  18. Mission Assurance Modeling and Simulation: A Cyber Security Roadmap

    Science.gov (United States)

    Gendron, Gerald; Roberts, David; Poole, Donold; Aquino, Anna

    2012-01-01

    This paper proposes a cyber security modeling and simulation roadmap to enhance mission assurance governance and establish risk reduction processes within constrained budgets. The term mission assurance stems from risk management work by Carnegie Mellon's Software Engineering Institute in the late 1990s. By 2010, the Defense Information Systems Agency revised its cyber strategy and established the Program Executive Officer-Mission Assurance. This highlights a shift from simply protecting data to balancing risk and begins a necessary dialogue to establish a cyber security roadmap. The Military Operations Research Society has recommended a cyber community of practice, recognizing there are too few professionals having both cyber and analytic experience. The authors characterize the limited body of knowledge in this symbiotic relationship. This paper identifies operational and research requirements for mission assurance M&S supporting defense and homeland security. M&S techniques are needed for enterprise oversight of cyber investments, test and evaluation, policy, training, and analysis.

  19. An end-to-end examination of geometric accuracy of IGRT using a new digital accelerator equipped with onboard imaging system.

    Science.gov (United States)

    Wang, Lei; Kielar, Kayla N; Mok, Ed; Hsu, Annie; Dieterich, Sonja; Xing, Lei

    2012-02-07

    Varian's new digital linear accelerator (LINAC), TrueBeam STx, is equipped with a high dose rate flattening filter free (FFF) mode (6 MV and 10 MV), a high definition multileaf collimator (2.5 mm leaf width), as well as onboard imaging capabilities. A series of end-to-end phantom tests of TrueBeam-based image guided radiation therapy (IGRT) were performed to determine the geometric accuracy of the image-guided setup and dose delivery process for all beam modalities delivered using intensity modulated radiation therapy (IMRT) and RapidArc. In these tests, an anthropomorphic phantom with a Ball Cube II insert and the analysis software (FilmQA (3cognition)) were used to evaluate the accuracy of TrueBeam image-guided setup and dose delivery. Laser cut EBT2 films with 0.15 mm accuracy were embedded into the phantom. The phantom with the film inserted was first scanned with a GE Discovery-ST CT scanner, and the images were then imported to the planning system. Plans with steep dose fall off surrounding hypothetical targets of different sizes were created using RapidArc and IMRT with FFF and WFF (with flattening filter) beams. Four RapidArc plans (6 MV and 10 MV FFF) and five IMRT plans (6 MV and 10 MV FFF; 6 MV, 10 MV and 15 MV WFF) were studied. The RapidArc plans with 6 MV FFF were planned with target diameters of 1 cm (0.52 cc), 2 cm (4.2 cc) and 3 cm (14.1 cc), and all other plans with a target diameter of 3 cm. Both onboard planar and volumetric imaging procedures were used for phantom setup and target localization. The IMRT and RapidArc plans were then delivered, and the film measurements were compared with the original treatment plans using gamma criteria of 3%/1 mm and 3%/2 mm. The shift required to align the film-measured dose with the calculated dose distribution was taken as the targeting error. Targeting accuracy of image-guided treatment using TrueBeam was found to be within 1 mm. For irradiation of the 3 cm target, the gammas (3%, 1

  20. Healing of esophageal anastomoses performed with the biofragmentable anastomosis ring versus the end-to-end anastomosis stapler: comparative experimental study in dogs.

    Science.gov (United States)

    Kovács, Tibor; Köves, István; Orosz, Zsolt; Németh, Tibor; Pandi, Erzsébet; Kralovanszky, Judit

    2003-04-01

    The biofragmentable anastomosis ring (BAR) has been used successfully for anastomoses from the stomach to the upper rectum. The healing of intrathoracic esophageal anastomoses performed with the BAR or an end-to-end anastomosis (EEA) stapler on an experimental model was compared. Parameters of tissue repair were evaluated: macroscopic examination, bursting strength (BS), collagen (hydroxyproline, or HP), histology (H&E and Picrosirius red staining for collagen). A series of 48 mongrel dogs were randomly separated into two groups (30 BAR, 18 stapler) and subgroups according to the time of autopsy (days 4, 7, 14, 28). Mortality was 13.3% (4 BAR cases) with two deaths not related to surgery (excluded). There were four leaks in the BAR group (14.3%) and no leaks or deaths but two strictures in the stapler group. BS was significantly higher in the BAR group during the first week, and values were almost equal from the second week with both methods. The HP rate was significantly reduced on days 4 and 7 in both groups compared to the reference values; the values were close to reference values from the second week (lower in the BAR group). Stapled anastomoses caused less pronounced inflammation and were associated with an earlier start of regeneration, but the difference was not significant compared to that in the BAR group. Accumulation of new collagen (green polarization) started on day 7 in both groups, but maturation (orange-red polarization) was significantly more advanced in the BAR group after the second week. A strong linear correlation between the BS and HP rate was found with both methods. There was no significant difference in the complication rate or healing of intrathoracic BAR and stapled anastomoses. The BAR method is simple, quick, and safe; and it seems to be a feasible procedure for creating intrathoracic esophageal anastomoses in dogs.

  1. End-to-end process of hollow spacecraft structures with high frequency and low mass obtained with in-house structural optimization tool and additive manufacturing

    Directory of Open Access Journals (Sweden)

    Alexandru-Mihai CISMILIANU

    2017-09-01

Full Text Available In the space sector the most decisive elements are mass reduction, cost saving and minimum lead time; here, structural optimization and additive layer manufacturing (ALM) fit best. The design must be driven by stiffness, because an important requirement for spacecraft (S/C) structures is to reduce the dynamic coupling between the S/C and the launch vehicle. The objective is to create an end-to-end process, from the input given by the customer to the manufacturing of an aluminum part that is as light as possible but at the same time considerably stiffer, while taking full advantage of the design flexibility given by ALM. To design and optimize the parts, a specialized in-house tool was used, guaranteeing a load-sufficient material distribution. Using topological optimization, the iterations between the design and the stress departments were diminished, thus greatly reducing the lead time. In order to improve and lighten the obtained structure, a design with internal cavities and hollow beams was considered. This implied developing a procedure for powder evacuation through iterations with the manufacturer while optimizing the design for ALM. The resulting part can then be manufactured via ALM with no need of further design adjustments. To achieve a high-quality part with maximum efficiency, it is essential to have a loop between the design team and the manufacturer. Topological optimization and ALM work hand in hand if used properly. The team achieved a more efficient structure using topology optimization and ALM than using conventional design and manufacturing methods.

  2. Automated Detection of Clinically Significant Prostate Cancer in mp-MRI Images Based on an End-to-End Deep Neural Network.

    Science.gov (United States)

    Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting

    2018-05-01

Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parametric magnetic resonance images (mp-MRI) are in high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of the other steps. As a result, they can either involve unnecessary computational cost or suffer from errors accumulated over steps. In this paper, we present an automated CS PCa detection system in which all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed neural network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed for optimizing all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors of the lesions' locations. Compared with most existing systems, which require supervised labels, e.g., manual delineation of PCa lesions, it is much more convenient for clinical usage. Comprehensive evaluation based on fivefold cross-validation using data from 360 patients demonstrates that our system achieves high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.
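
    As a rough illustration of the joint optimization described above, the sketch below combines three loss terms into a single training objective. The tensor names, weights, and the exact forms of the inconsistency and overlap penalties are hypothetical stand-ins, not the formulations used in the paper.

```python
import torch

def total_loss(cls_pred, cls_target, moved_feat, fixed_feat, pred_mask, prostate_mask,
               w_inconsistency=1.0, w_overlap=1.0):
    """Combine the three loss types named in the abstract (weights and exact
    penalty forms are hypothetical stand-ins, not the paper's definitions)."""
    # 1) classification loss: image-level presence/absence of CS PCa
    cls_loss = torch.nn.functional.binary_cross_entropy(cls_pred, cls_target)
    # 2) inconsistency loss: penalize disagreement between registered modalities
    inconsistency_loss = torch.nn.functional.mse_loss(moved_feat, fixed_feat)
    # 3) overlap loss: encourage the detected region to stay inside the prostate
    overlap_loss = 1.0 - (pred_mask * prostate_mask).sum() / (pred_mask.sum() + 1e-6)
    return cls_loss + w_inconsistency * inconsistency_loss + w_overlap * overlap_loss
```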

  3. Combined fishing and climate forcing in the southern Benguela upwelling ecosystem: an end-to-end modelling approach reveals dampened effects.

    Directory of Open Access Journals (Sweden)

    Morgane Travers-Trolet

Full Text Available The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem, built by coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress combined with scenarios of fishing pressure on top-predator fish were tested. Analyses of isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain, whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top-predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears to be both structurally important in the trophic functioning of the ecosystem and very sensitive to climate and fishing pressures. In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects

  4. STS-37 Mission Specialist (MS) Ross during simulation in JSC's FB-SMS

    Science.gov (United States)

    1991-01-01

    STS-37 Mission Specialist (MS) Jerry L. Ross 'borrows' the pilots station to rehearse some of his scheduled duties for his upcoming mission. He is on the flight deck of the fixed-based (FB) shuttle mission simulator (SMS) during this unsuited simulation. The SMS is part of JSC's Mission Simulation and Training Facility Bldg 5.

  5. A novel PON based UMTS broadband wireless access network architecture with an algorithm to guarantee end to end QoS

    Science.gov (United States)

    Sana, Ajaz; Hussain, Shahab; Ali, Mohammed A.; Ahmed, Samir

    2007-09-01

In this paper we propose a novel Passive Optical Network (PON) based broadband wireless access network architecture to provide multimedia services (video telephony, video streaming, mobile TV, mobile email, etc.) to mobile users. In conventional wireless access networks, the base stations (Node B) and Radio Network Controllers (RNC) are connected by point-to-point T1/E1 lines (the Iub interface). The T1/E1 lines are expensive and add to operating costs. Also, the resources (transceivers and T1/E1 lines) are dimensioned for peak-hour traffic, so most of the time the dedicated resources are idle and wasted. Furthermore, T1/E1 lines cannot support the bandwidth (BW) required by next-generation wireless multimedia services proposed by High Speed Packet Access (HSPA, Rel. 5) for the Universal Mobile Telecommunications System (UMTS) and Evolution Data Only (EV-DO) for Code Division Multiple Access 2000 (CDMA2000). The proposed PON-based backhaul can provide gigabit data rates, and the Iub interface can be dynamically shared by Node Bs. The BW is dynamically allocated, and unused BW from lightly loaded Node Bs is assigned to heavily loaded Node Bs. We also propose a novel algorithm to provide end-to-end Quality of Service (QoS) between the RNC and the user equipment. The algorithm provides QoS bounds in the wired domain as well as in the wireless domain, with compensation for wireless link errors. Because of the air interface, there can be periods when the user equipment (UE) is unable to communicate with the Node B (usually referred to as link errors), and these errors are bursty and location dependent. In the proposed approach, the scheduler at the Node B maps QoS priorities and weights into the wireless MAC. Compensation for errored links is provided by swapping service between active users; the user data is divided into flows, which are allowed to lag or lead. The algorithm guarantees (1) delay and throughput for error-free flows, (2) short-term fairness
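
    The lag/lead compensation idea described above can be sketched as a simple slot-based scheduler: every flow accrues its weighted share of service each slot, flows on errored links accumulate lag, and lagging flows are served first once their link recovers. The class and function below are a hypothetical illustration, not the authors' algorithm.

```python
# Hypothetical sketch of weighted scheduling with lag/lead compensation:
# each flow tracks the difference between the service it should have received
# (by weight) and the service it actually received; flows on errored links
# build up lag and are compensated once their link recovers.

class Flow:
    def __init__(self, name, weight):
        self.name = name
        self.weight = weight
        self.lag = 0.0          # >0: flow is owed service, <0: flow is leading

def schedule(flows, link_ok, slot_size=1.0):
    """Pick the flow to serve this slot (lag-first among flows with a good link)."""
    eligible = [f for f in flows if link_ok[f.name]]
    if not eligible:
        return None
    # serve the eligible flow with the largest normalized lag first,
    # falling back to plain weighted sharing when no one is lagging
    chosen = max(eligible, key=lambda f: (f.lag / f.weight, f.weight))
    total_w = sum(f.weight for f in flows)
    for f in flows:
        f.lag += slot_size * f.weight / total_w   # every flow "earns" its share
    chosen.lag -= slot_size                       # the served flow consumes it
    return chosen.name

flows = [Flow("voice", 3.0), Flow("video", 2.0), Flow("data", 1.0)]
print(schedule(flows, {"voice": True, "video": False, "data": True}))
```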

  6. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L [Clinica Luganese, Radiotherapy Center, Lugano (Switzerland)

    2015-06-15

Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, and a maximum difference of 0.3 mm and 0.4 deg was observed for the axes. E2E tests showed an absolute position difference of 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have submillimeter overall accuracy for intracranial radiosurgery.

  7. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    International Nuclear Information System (INIS)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-01-01

Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, and a maximum difference of 0.3 mm and 0.4 deg was observed for the axes. E2E tests showed an absolute position difference of 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have submillimeter overall accuracy for intracranial radiosurgery

  8. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    Directory of Open Access Journals (Sweden)

    Greg Finak

    2014-08-01

Full Text Available Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier to entry for automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory-efficient manner to the large flow data sets that arise in clinical trials, and by integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen-specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in

  9. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S; Nguyen, N; Liu, F; Huang, Y [Rhode Island Hospital / Warren Alpert Medical, Providence, RI (United States); Sio, T [Mayo Clinic, Rochester, MN (United States); Jung, J [East Carolina University, Greenville, North Carolina (United States); Pyakuryal, A [UniversityIllinois at Chicago, Chicago, IL (United States); Jang, S [Princeton Radiation Oncology Ctr., Jamesburg, NJ (United States)

    2014-06-01

Purpose: To combine and integrate quality assurance (QA) of target localization and the radiation isocenter end-to-end (E2E) test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation isocenter congruence, which is one step beyond the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for CyberKnife E2E tests was irradiated to a sphere target that was visible in CT-sim images. The CT-sim was performed using a 1 mm slice thickness with a helical scanning technique. The sphere was 3 cm in diameter and contoured as a target volume using iPlan V.4.5.2. An MLC-based conformal arc plan with 7 fields was generated, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were perpendicularly inserted into the cube that holds the sphere inside. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. In order to use ExacTrac, an infra-red head-array was used to correlate the orthogonal X-ray images. Results: Using the orthogonal X-rays of ExacTrac, the phantom was positioned. For each field, the phantom was checked again with X-rays and re-positioned if necessary. After each setup using ExacTrac, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was found to be 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac was implemented separately to test image localization and the radiation isocenter. This new method can be used for periodic QA procedures.
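
    A small sketch of how per-axis film deviations can be combined into a single targeting-error figure, assuming a root-sum-square combination; the abstract does not state how the reported 0.76 mm was derived, and the example values below are hypothetical.

```python
import math

def combined_targeting_error(shifts_mm):
    """Root-sum-square of per-axis deviations (SI, LR, AP) measured on film."""
    return math.sqrt(sum(s * s for s in shifts_mm))

# e.g. hypothetical per-axis deviations of 0.5, 0.4 and 0.3 mm give ~0.71 mm total
print(combined_targeting_error([0.5, 0.4, 0.3]))
```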

  10. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, with the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e., SensorML and SWE services) should be supported. Perhaps most important, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements for operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in any way that best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  11. WE-DE-BRA-11: A Study of Motion Tracking Accuracy of Robotic Radiosurgery Using a Novel CCD Camera Based End-To-End Test System

    Energy Technology Data Exchange (ETDEWEB)

    Wang, L; M Yang, Y [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States); Nelson, B [Logos Systems Intl, Scotts Valley, CA (United States)

    2016-06-15

Purpose: A novel end-to-end test system using a CCD camera and a scintillator-based phantom (XRV-124, Logos Systems Int'l), capable of measuring the beam-by-beam delivery accuracy of robotic radiosurgery (CyberKnife), was developed and reported in our previous work. This work investigates its application in assessing the motion tracking (Synchrony) accuracy of CyberKnife. Methods: A QA plan with anterior and lateral beams (with 4 different collimator sizes) was created (Multiplan v5.3) for the XRV-124 phantom. The phantom was placed on a motion platform (superior-inferior movement), and the plans were delivered on the CyberKnife M6 system using four motion patterns: static, sine wave, sine with a 15° phase shift, and a patient breathing pattern with 2 cm maximum motion and a 4-second breathing cycle. Under integral recording mode, the time-averaged beam vectors (X, Y, Z) were measured by the phantom and compared with static delivery. In dynamic recording mode, the beam spots were recorded at a rate of 10 frames/second. The beam vector deviation from the average position was evaluated against the various breathing patterns. Results: The average beam positions of the six deliveries with no motion and three deliveries with Synchrony tracking on ideal motion (sine wave without phase shift) all agree within −0.03±0.00 mm, 0.10±0.04 mm, and 0.04±0.03 mm in the X, Y, and Z directions. Radiation beam width (FWHM) variations are within ±0.03 mm. Dynamic video recording showed submillimeter tracking stability for both regular and irregular breathing patterns; however, tracking errors of up to 3.5 mm were observed when a 15-degree phase shift was introduced. Conclusion: The XRV-124 system is able to provide 3D and 4D targeting accuracy for CyberKnife delivery with Synchrony. The experimental results showed sub-millimeter delivery accuracy in phantom, with excellent correlation between target and breathing motion. The accuracy was degraded when irregular motion and a phase shift were introduced.
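
    The few-millimeter error for the phase-shifted case is of the same order as simple geometry suggests: a phase lag between target motion and tracked motion produces a position error of up to 2A·sin(Δφ/2). The sketch below illustrates this with the motion parameters quoted above; it is an idealized model, not the Synchrony tracking algorithm.

```python
import numpy as np

# Idealized illustration: positional error produced by a phase shift between
# sinusoidal target motion and the tracked motion.
A = 10.0                      # mm, half of the 2 cm maximum motion
T = 4.0                       # s, breathing period
phase = np.deg2rad(15.0)      # the 15-degree phase shift case

t = np.linspace(0.0, 2 * T, 1000)
target = A * np.sin(2 * np.pi * t / T)
tracked = A * np.sin(2 * np.pi * t / T + phase)
print("max tracking error [mm]:", np.max(np.abs(tracked - target)))  # ~2.6 mm
```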

  12. Using Small UAS for Mission Simulation, Science Validation, and Definition

    Science.gov (United States)

    Abakians, H.; Donnellan, A.; Chapman, B. D.; Williford, K. H.; Francis, R.; Ehlmann, B. L.; Smith, A. T.

    2017-12-01

Small Unmanned Aerial Systems (sUAS) are increasingly being used across JPL and NASA for science data collection, mission simulation, and mission validation. They can also be used as proof of concept for the development of autonomous capabilities for Earth and planetary exploration. sUAS are useful for reconstruction of topography and imagery for a variety of applications, including fault zone morphology, Mars analog studies, geologic mapping, photometry, and estimation of vegetation structure. Imagery, particularly multispectral imagery, can be used for identifying materials such as fault lithology or vegetation type. Reflectance maps can be produced for wetland or other studies. Topography and imagery observations are useful in radar studies, such as those from UAVSAR or the future NISAR mission, to validate 3D motions and to provide imagery in areas of disruption where the radar measurements decorrelate. Small UAS are inexpensive to operate, reconfigurable, and agile, making them a powerful platform for validating mission science measurements and for providing surrogate data for existing or future missions.

  13. SU-F-J-150: Development of An End-To-End Chain Test for the First-In-Man MR-Guided Treatments with the MRI Linear Accelerator by Using the Alderson Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Hoogcarspel, S; Kerkmeijer, L; Lagendijk, J; Van Vulpen, M; Raaymakers, B [University Medical Center Utrecht, Utrecht, Utrecht (Netherlands)

    2016-06-15

The Alderson phantom is a human-shaped quality assurance tool that has been used for over 30 years in radiotherapy. The phantom can provide integrated tests of the entire chain of treatment planning and delivery. The purpose of this research was to investigate whether this phantom can be used to chain test a treatment on the MRI linear accelerator (MRL), which is currently being developed at the UMC Utrecht in collaboration with Elekta and Philips. The latter was demonstrated by chain testing the future First-in-Man treatments with this system. An Alderson phantom was used to chain test an entire treatment with the MRL. First, a CT was acquired of the phantom with additional markers that are visible on both MR and CT. A treatment plan for treating bone metastases in the sacrum was made. The phantom was consecutively placed in the MRL. For MR imaging, a 3D volume was acquired. The initially developed treatment plan was then simulated on the new MRI dataset. For simulation, both the MR and CT data were used by registering them together. Before treatment delivery, an MV image was acquired and compared with a DRR that was calculated from the MR/CT registration data. Finally, the treatment was delivered. Figure 1 shows both the T1-weighted MR image of the phantom and the CT that was registered to the MR image. Figure 2 shows both the calculated and measured MV image that was acquired by the MV panel. Figure 3 shows the dose distribution that was simulated. The total elapsed time for the entire procedure, excluding irradiation, was 13:35 minutes. The Alderson phantom yields sufficient MR contrast and can be used for full MR-guided radiotherapy treatment chain testing. As a result, we are able to perform an end-to-end chain test of the future First-in-Man treatments.

  14. Numerical simulation support to the ESA/THOR mission

    Science.gov (United States)

    Valentini, F.; Servidio, S.; Perri, S.; Perrone, D.; De Marco, R.; Marcucci, M. F.; Daniele, B.; Bruno, R.; Camporeale, E.

    2016-12-01

THOR is a spacecraft concept currently undergoing a study phase as a candidate for the next ESA medium-size mission M4. THOR has been designed to solve the longstanding physical problems of particle heating and energization in turbulent plasmas. It will provide high-resolution measurements of electromagnetic fields and particle distribution functions with unprecedented resolution, with the aim of exploring the so-called kinetic scales. We present the numerical simulation framework which is supporting the THOR mission during the study phase. The THOR team includes many scientists developing and running different simulation codes (Eulerian-Vlasov, Particle-In-Cell, Gyrokinetics, Two-fluid, MHD, etc.), addressing the physics of plasma turbulence, shocks, magnetic reconnection and so on. These numerical codes are being used during the study phase mainly with the aim of addressing the following points: (i) to simulate the response of real particle instruments on board THOR, by employing an electrostatic analyser simulator which mimics the response of the CSW, IMS and TEA instruments to the particle velocity distributions of protons, alpha particles and electrons, as obtained from kinetic numerical simulations of plasma turbulence; (ii) to compare multi-spacecraft with single-spacecraft configurations in measuring current density, by making use of both numerical models of synthetic turbulence and real data from the MMS spacecraft; (iii) to investigate the validity of the Taylor hypothesis in different configurations of plasma turbulence
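
    A schematic of what such an instrument simulator does, reduced to its simplest form: evaluate a model particle distribution at the energies sampled by an electrostatic analyser and convert the resulting flux into expected counts per energy bin. All instrument parameters below are hypothetical, and the real CSW/IMS/TEA response models are far more detailed.

```python
import numpy as np

# Schematic sketch only (not the actual THOR instrument simulator): turn a model
# Maxwellian proton distribution into expected counts per energy bin of an
# idealized electrostatic analyser.
k_B = 1.380649e-23   # J/K
m_p = 1.672622e-27   # kg
q   = 1.602177e-19   # C (also J per eV)

def maxwellian_f(v, n=5.0e6, T=1.0e5):
    """Maxwellian phase-space density [s^3 m^-6] for density n [m^-3], temperature T [K]."""
    return n * (m_p / (2.0 * np.pi * k_B * T)) ** 1.5 * np.exp(-m_p * v**2 / (2.0 * k_B * T))

E_eV = np.logspace(0, 4, 32)              # bin centres, 1 eV - 10 keV (hypothetical)
v    = np.sqrt(2.0 * E_eV * q / m_p)      # corresponding proton speeds [m/s]
flux = (2.0 * E_eV * q / m_p**2) * maxwellian_f(v) * q   # [m^-2 s^-1 sr^-1 eV^-1]

G   = 1.0e-7                              # geometric factor [m^2 sr], hypothetical
dE  = 0.1 * E_eV                          # 10% energy resolution, hypothetical
dt  = 0.05                                # accumulation time per bin [s], hypothetical
counts = flux * G * dE * dt               # expected counts per bin
print(np.round(counts, 1))
```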

  15. STS-37 Mission Specialist (MS) Godwin during simulation in JSC's FB-SMS

    Science.gov (United States)

    1991-01-01

    STS-37 Mission Specialist (MS) Linda M. Godwin rehearses some phases of her scheduled duties on the middeck of the fixed-based (FB) shuttle mission simulator (SMS) located in JSC's Mission Simulation and Training Facility Bldg 5. Godwin is inspecting supplies stowed in the middeck lockers during this unsuited simulation.

  16. Private ground infrastructures for space exploration missions simulations

    Science.gov (United States)

    Souchier, Alain

    2010-06-01

The Mars Society, a private non-profit organisation devoted to promoting the exploration of the red planet, decided to implement simulated Mars habitats in two locations on Earth: in northern Canada on the rim of a meteoritic crater (2000), and in a Utah (US) desert, the location of a past Jurassic sea (2001). These habitats have been built with large similarities to the habitats actually planned for the first Mars exploration missions. Participation is open to everybody, either proposing experiments or wishing only to participate as a crew member. Participants come from different organizations: the Mars Society, universities, and experimenters working with NASA or ESA. The general philosophy of the work conducted is not to do innovative scientific work in the field but to learn how the scientific work is affected or modified by the simulation conditions. Outside activities are conducted with simulated spacesuits limiting the experimenter's abilities. Technology and procedure experiments are also conducted, as well as experiments on crew psychology and behaviour.

  17. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...
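
    For context, PAM-4 carries two bits per symbol, which is what allows a 40-Gb/s line rate to run at a symbol rate within reach of 10-Gb/s class optics. A minimal, illustrative Gray-coded mapping is sketched below; the coding and equalization used in the actual demonstration may differ.

```python
# Illustrative only: Gray-coded mapping of bit pairs to the four PAM-4 levels,
# i.e. two bits per symbol, halving the symbol rate relative to on-off keying
# at the same bit rate.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_symbols(bits):
    """Map an even-length bit sequence to PAM-4 amplitude levels."""
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(pam4_symbols([0, 0, 1, 1, 1, 0, 0, 1]))   # [-3, 1, 3, -1]
```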

  18. Mission at Mubasi - A Simulation for Leadership Development

    Science.gov (United States)

    Cummings, Pau; Aude, Steven; Fallesen, Jon

    2012-01-01

The United States Army is investing in simulations as a way of providing practice for leader decision making. Such simulations, grounded in lessons learned from deployment-experienced leaders, place less experienced and more junior leaders in challenging situations they might soon be confronted with. And given increased demands on the Army to become more efficient while maintaining acceptable levels of mission readiness, simulations offer a cost-effective complement to live field training. So too, the design parameters of such a simulation can be made to reinforce specific behavioral responses which teach leaders known theory and application of effective (and ineffective) decision making. With this in mind, the Center for Army Leadership (CAL) determined that decision making was of critical importance. Specifically, the following aspects of decision making were viewed as particularly important for today's Army leaders: 1) decision dilemmas, in the form of equally appealing or equally unappealing choices, such that there is no clear "right" or "wrong" choice; 2) making decisions with incomplete or ambiguous information; and 3) predicting and experiencing second- and third-order consequences of decisions. It is decision making in such an environment that Army leaders are increasingly confronted with, given the full spectrum of military operations they must be prepared for. This paper details the approach and development of this decision-making simulation.

  19. STS-26 MS Hilmers on fixed based (FB) shuttle mission simulator (SMS) middeck

    Science.gov (United States)

    1988-01-01

    STS-26 Discovery, Orbiter Vehicle (OV) 103, Mission Specialist (MS) David C. Hilmers prepares to ascend a ladder representing the interdeck access hatch from the shuttle middeck to the flight deck. The STS-26 crew is training in the fixed base (FB) shuttle mission simulator (SMS) located in JSC Mission Simulation and Training Facility Bldg 5.

  20. STS-26 MS Lounge in fixed based (FB) shuttle mission simulator (SMS)

    Science.gov (United States)

    1988-01-01

STS-26 Discovery, Orbiter Vehicle (OV) 103, Mission Specialist (MS) John M. Lounge, wearing communications kit assembly headset and crouched on the aft flight deck, performs checklist inspection during training session. The STS-26 crew is training in the fixed base (FB) shuttle mission simulator (SMS) located in JSC Mission Simulation and Training Facility Bldg 5.

  1. Performance Simulations for a Spaceborne Methane Lidar Mission

    Science.gov (United States)

    Kiemle, C.; Kawa, Stephan Randolph; Quatrevalet, Mathieu; Browell, Edward V.

    2014-01-01

Future spaceborne lidar measurements of key anthropogenic greenhouse gases are expected to close current observational gaps, particularly over remote, polar, and aerosol-contaminated regions, where current in situ and passive remote sensing observation techniques have difficulties. For methane, a "Methane Remote Lidar Mission" was proposed by Deutsches Zentrum fuer Luft- und Raumfahrt and Centre National d'Etudes Spatiales in the frame of a German-French climate monitoring initiative. Simulations assess the performance of this mission with the help of Moderate Resolution Imaging Spectroradiometer (MODIS) and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) data on the Earth's surface albedo and atmospheric optical depth. These are key environmental parameters for integrated path differential absorption lidar, which uses the surface backscatter to measure the total atmospheric methane column. Results show that a lidar with an average optical power of 0.45 W at 1.6 µm wavelength and a telescope diameter of 0.55 m, installed on a low Earth orbit platform (506 km), will measure methane columns at precisions of 1.2%, 1.7%, and 2.1% over land, water, and snow or ice surfaces, respectively, for monthly aggregated measurement samples within areas of 50 × 50 km2. Globally, the mean precision for the simulated year 2007 is 1.6%, with a standard deviation of 0.7%. At high latitudes, lower reflectance due to snow and ice is compensated by denser measurements, owing to the orbital pattern. Over key methane source regions such as densely populated areas, boreal and tropical wetlands, or permafrost, our simulations show that the measurement precision will be between 1 and 2%.
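
    A back-of-the-envelope view of how single-sounding noise aggregates to the quoted cell precisions, assuming uncorrelated random errors that average down as 1/sqrt(N). Both numbers in the example are hypothetical; the paper's full simulation also folds in the MODIS/CALIPSO albedo and optical-depth statistics.

```python
import math

# Assumption: uncorrelated random errors average down as 1/sqrt(N).
def aggregated_precision(single_sounding_error_pct, n_samples):
    return single_sounding_error_pct / math.sqrt(n_samples)

# e.g. a hypothetical 50% single-sounding relative error and ~1000 cloud-free
# soundings per 50 km x 50 km cell per month would give ~1.6%
print(round(aggregated_precision(50.0, 1000), 2))
```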

  2. STS-26 MS Nelson on fixed based (FB) shuttle mission simulator (SMS) middeck

    Science.gov (United States)

    1988-01-01

    STS-26 Discovery, Orbiter Vehicle (OV) 103, Mission Specialist (MS) George D. Nelson trains on the middeck of the fixed based (FB) shuttle mission simulator (SMS). Nelson, wearing communications assembly headset, adjusts camera mounting bracket.

  3. Treatment of a partially thrombosed giant aneurysm of the vertebral artery by aneurysm trapping and direct vertebral artery-posterior inferior cerebellar artery end-to-end anastomosis: technical case report.

    Science.gov (United States)

    Benes, Ludwig; Kappus, Christoph; Sure, Ulrich; Bertalanffy, Helmut

    2006-07-01

The purpose of this article is to focus for the first time on the operative management of a direct vertebral artery (VA)-posterior inferior cerebellar artery (PICA) end-to-end anastomosis in a partially thrombosed giant VA-PICA-complex aneurysm and to underline its usefulness as an additional treatment option. The operative technique of a direct VA-PICA end-to-end anastomosis is described in detail. The VA was entering the large aneurysm sac. Distally, the PICA originated from the aneurysm sac-VA complex. The donor and recipient vessels were cut close to the aneurysm. Whereas the VA was cut in a straight manner, the PICA was cut at an oblique 45-degree angle to enlarge the vascular end diameter. The vessel ends were flushed with heparinized saline and sutured. The thrombotic material inside the aneurysm sac was removed and the distal VA clipped, leaving the anterior spinal artery and brainstem perforators free. The patient regained consciousness without additional morbidity. Magnetic resonance imaging scans revealed a completely decompressed brainstem without infarction. The postoperative angiograms demonstrated good filling of the anastomosed PICA. Despite the caliber mismatch of these two vessels, the direct VA-PICA end-to-end anastomosis provides an accurate alternative in addition to other anastomosis and bypass techniques, when the donor and recipient vessels are suitable and medullary perforators do not have to be disrupted.

  4. STS-37 crewmembers train in JSC's FB shuttle mission simulator (SMS)

    Science.gov (United States)

    1991-01-01

    STS-37 Commander Steven R. Nagel (left) and Mission Specialist (MS) Jerry L. Ross rehearse some of their scheduled duties on the flight deck of JSC's fixed-based (FB) shuttle mission simulator (SMS) located in the Mission Simulation and Training Facility Bldg 5. During the unsuited simulation, Nagel reviews checklist while seated at the commanders station as Ross looks on from the pilots station.

  5. STS-26 crew trains in JSC fixed-based (FB) shuttle mission simulator (SMS)

    Science.gov (United States)

    1987-01-01

    STS-26 Discovery, Orbiter Vehicle (OV) 103, crewmembers (left to right) Commander Frederick H. Hauck, Pilot Richard O. Covey, Mission Specialist (MS) George D. Nelson, MS David C. Hilmers, and MS John M. Lounge pose on the middeck in fixed-based (FB) shuttle mission simulator (SMS) located in JSC Mission Simulation and Training Facility Bldg 5. A simulation for their anticipated June 1988 flight began 10-20-87.

  6. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Hudgins, Andrew P. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carrillo, Ismael M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jin, Xin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Simmins, John [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States)

    2018-02-21

This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices with high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open-source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the reporting resources into a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.
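
    A hypothetical, minimal illustration of the transactive idea mentioned above: a homeowner agent offers load curtailment only when the cleared price reaches the price it has chosen. This is not the Volttron agent code used in the project.

```python
# Hypothetical price-threshold demand-response agent (illustration only).
class HomeownerAgent:
    def __init__(self, name, curtail_price, curtailable_kw):
        self.name = name
        self.curtail_price = curtail_price      # $/kWh at which the owner will curtail
        self.curtailable_kw = curtailable_kw    # load the home is willing to shed

    def respond(self, cleared_price):
        """Return the kW of load this home offers to shed at the cleared price."""
        return self.curtailable_kw if cleared_price >= self.curtail_price else 0.0

homes = [HomeownerAgent("h1", 0.15, 1.5), HomeownerAgent("h2", 0.25, 2.0)]
print(sum(h.respond(0.20) for h in homes))   # 1.5 kW shed at a $0.20/kWh price
```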

  7. Airborne Instrument Simulator for the Lidar Surface Topography (LIST) Mission

    Science.gov (United States)

    Yu, Anthony W.; Krainak, Michael A.; Harding, David J.; Abshire, James B.; Sun, Xiaoli; Cavanaugh, John; Valett, Susan; Ramos-Izquierdo, Luis

    2010-01-01

In 2007, the National Research Council (NRC) completed its first decadal survey for Earth science at the request of NASA, NOAA, and USGS. The Lidar Surface Topography (LIST) mission is one of fifteen missions recommended by the NRC, whose primary objectives are to map global topography and vegetation structure at 5 m spatial resolution, and to acquire global coverage within a few years. NASA Goddard conducted an initial mission concept study for the LIST mission in 2007, and developed the initial measurement requirements for the mission.

  8. Airborne Lidar Simulator for the Lidar Surface Topography (LIST) Mission

    Science.gov (United States)

    Yu, Anthony W.; Krainak, Michael A.; Abshire, James B.; Cavanaugh, John; Valett, Susan; Ramos-Izquierdo, Luis

    2010-01-01

    In 2007, the National Research Council (NRC) completed its first decadal survey for Earth science at the request of NASA, NOAA, and USGS. The Lidar Surface Topography (LIST) mission is one of fifteen missions recommended by NRC, whose primary objectives are to map global topography and vegetation structure at 5 m spatial resolution, and to acquire global surface height mapping within a few years. NASA Goddard conducted an initial mission concept study for the LIST mission in 2007, and developed the initial measurement requirements for the mission.

  9. STS-36 crewmembers train in JSC's FB shuttle mission simulator (SMS)

    Science.gov (United States)

    1989-01-01

    STS-36 Mission Specialist (MS) David C. Hilmers, seated on the aft flight deck, discusses procedures with Commander John O. Creighton (left) and Pilot John H. Casper during a simulation in JSC's Fixed Based (FB) Shuttle Mission Simulator (SMS). Casper reviews a checklist at the pilots station on the forward flight deck. The crewmembers are rehearsing crew cabin activities for their upcoming Department of Defense (DOD) mission aboard Atlantis, Orbiter Vehicle (OV) 104.

  10. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    Energy Technology Data Exchange (ETDEWEB)

    Ferreyra, M; Salinas Aranda, F; Dodat, D; Sansogne, R; Arbiser, S [Vidt Centro Medico, Ciudad Autonoma De Buenos Aires, Ciudad Autonoma de Buenos Aire (Argentina)

    2016-06-15

Purpose: To use end-to-end testing to validate a 6 MV high-dose-rate photon beam, configured for the Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data were configured for the Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm2 were measured on a Varian Trilogy linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance against VMAT accuracy requirements. An end-to-end test was implemented by planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung-equivalent inserts. TPS-calculated planar dose distributions were compared to those measured using the EPID and MapCheck as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests show that results are independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst-case scenario in lung. Over 97% of the points evaluated in dose distributions passed gamma index analysis. Conclusion: The Eclipse AAA configuration of the 6 MV high-dose-rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.
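
    The pass/fail metric used throughout this record is the gamma index. A minimal 1-D version of the calculation (global normalization, no interpolation) is sketched below with toy profiles; clinical tools such as the ones named above add interpolation, dose thresholds, and local/global normalization options.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dist_tol_mm=2.0):
    """Minimal 1-D gamma index (global 2%/2 mm by default, no interpolation)."""
    d_norm = dose_tol * np.max(d_ref)                 # global dose criterion
    gammas = np.empty_like(d_ref, dtype=float)
    for i, (x, d) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - x) / dist_tol_mm) ** 2     # distance-to-agreement term
        dose2 = ((d_eval - d) / d_norm) ** 2          # dose-difference term
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.linspace(-50, 50, 201)                         # mm
ref = np.exp(-x**2 / (2 * 15.0**2))                   # toy reference profile
meas = np.exp(-(x - 0.5)**2 / (2 * 15.0**2)) * 1.01   # shifted, scaled copy
g = gamma_1d(x, ref, x, meas)
print("pass rate:", np.mean(g <= 1.0))                # fraction of points with gamma <= 1
```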

  11. STS-26 crew on fixed based (FB) shuttle mission simulator (SMS) flight deck

    Science.gov (United States)

    1988-01-01

STS-26 Discovery, Orbiter Vehicle (OV) 103, Commander Frederick H. Hauck (left) and Pilot Richard O. Covey review checklists in their respective stations on the forward flight deck. The STS-26 crew is training in the fixed base (FB) shuttle mission simulator (SMS) located in JSC Mission Simulation and Training Facility Bldg 5.

  12. SIP end to end performance metrics

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2012-01-01

The paper deals with a SIP performance testing methodology. The main contribution to the field of performance testing of SIP infrastructure consists in the possibility of performing standardized stress tests with the developed SIP TesterApp without deeper knowledge of SIP communication. The developed tool exploits several open-source applications such as jQuery, Python, JSON and the cornerstone SIP generator SIPp; the result is highly modifiable and the ...

  13. CASTOR end-to-end monitoring

    International Nuclear Information System (INIS)

    Rekatsinas, Theodoros; Duellmann, Dirk; Pokorski, Witold; Ponce, Sebastien; Rabacal, Bartolomeu; Waldron, Dennis; Wojcieszuk, Jacek

    2010-01-01

With the start of the Large Hadron Collider approaching, storage and management of raw event data, as well as reconstruction and analysis data, is of crucial importance for the researchers. The CERN Advanced STORage system (CASTOR) is a hierarchical system developed at CERN, used to store physics production files and user files. CASTOR, as one of the essential software tools used by the LHC experiments, has to provide reliable services for storing and managing data. Monitoring of this complicated system is mandatory in order to assure its stable operation and improve its future performance. This paper presents the new monitoring system of CASTOR, which provides operation- and user-request-specific metrics. This system is built around a dedicated, optimized database schema. The schema is populated by PL/SQL procedures, which process a stream of incoming raw metadata from different CASTOR components, initially collected by the Distributed Logging Facility (DLF). A web interface has been developed for the visualization of the monitoring data. The different histograms and plots are created using PHP scripts which query the monitoring database.

  14. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum, as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significantly more electrical power. This paper stresses the importance

  15. Sleep and cognitive function of crewmembers and mission controllers working 24-h shifts during a simulated 105-day spaceflight mission

    Science.gov (United States)

    Barger, Laura K.; Wright, Kenneth P.; Burke, Tina M.; Chinoy, Evan D.; Ronda, Joseph M.; Lockley, Steven W.; Czeisler, Charles A.

    2014-01-01

The success of long-duration space missions depends on the ability of crewmembers and mission support specialists to be alert and maintain high levels of cognitive function while operating complex, technical equipment. We examined sleep, nocturnal melatonin levels and cognitive function of crewmembers and the sleep and cognitive function of mission controllers who participated in a high-fidelity 105-day simulated spaceflight mission at the Institute of Biomedical Problems (Moscow). Crewmembers were required to perform daily mission duties and work one 24-h extended-duration work shift every sixth day. Mission controllers nominally worked 24-h extended-duration shifts. Supplemental lighting was provided to crewmembers and mission controllers. Participants' sleep was estimated by wrist-actigraphy recordings. Overall, the results show that crewmembers and mission controllers obtained inadequate sleep and exhibited impaired cognitive function, despite countermeasure use, while working extended-duration shifts. Crewmembers averaged 7.04±0.92 h (mean±SD) and 6.94±1.08 h (mean±SD) of sleep on the two workdays prior to the extended-duration shifts, 1.88±0.40 h (mean±SD) during the 24-h work shift, and then slept 10.18±0.96 h (mean±SD) the day after the night shift. Although supplemental light was provided, crewmembers' average nocturnal melatonin levels remained elevated during extended 24-h work shifts. Naps and caffeine use were reported by crewmembers during ˜86% and 45% of extended night work shifts, respectively. Even with reported use of wake-promoting countermeasures, significant impairments in cognitive function were observed. Mission controllers slept 5.63±0.95 h (mean±SD) the night prior to their extended-duration work shift. On average, 89% of night shifts included naps, with mission controllers sleeping an average of 3.4±1.0 h (mean±SD) during the 24-h extended-duration work shift. Mission controllers also showed impaired cognitive function during extended

  16. Electronic remote blood issue: a combination of remote blood issue with a system for end-to-end electronic control of transfusion to provide a "total solution" for a safe and timely hospital blood transfusion service.

    Science.gov (United States)

    Staves, Julie; Davies, Amanda; Kay, Jonathan; Pearson, Oliver; Johnson, Tony; Murphy, Michael F

    2008-03-01

The rapid provision of red cell (RBC) units to patients needing blood urgently is an issue of major importance in transfusion medicine. The development of electronic issue (sometimes termed "electronic crossmatch") has facilitated the rapid provision of RBC units by avoiding the serologic crossmatch in eligible patients. A further development is the issue of blood under electronic control at blood refrigerators remote from the blood bank. This study evaluated a system for electronic remote blood issue (ERBI) developed as an enhancement of a system for end-to-end electronic control of hospital transfusion. Practice was evaluated before and after its introduction in cardiac surgery. Before the implementation of ERBI, the median time to deliver urgently required RBC units to the patient was 24 minutes. After its implementation, RBC units were obtained from the nearby blood refrigerator in a median time of 59 seconds (range, 30 sec to 2 min). The study also found that unused requests were reduced significantly from 42 to 20 percent, the number of RBC units issued was reduced by 52 percent, the proportion of issued units that were transfused increased from 40 to 62 percent, and there was a significant reduction in the workload of both blood bank and clinical staff. This study evaluated the combination of remote blood issue with an end-to-end electronically controlled hospital transfusion process, ERBI. ERBI reduced the time to make blood available for surgical patients and improved the efficiency of hospital transfusion.

  17. Phobos Environment Model and Regolith Simulant for MMX Mission

    Science.gov (United States)

Miyamoto, H.; Niihara, T.; Wada, K.; Ogawa, K.; Baresi, N.; Abell, Paul A.; Asphaug, E.; Britt, D.; Dodbiba, G.; Fujita, T.

    2018-01-01

Phobos and Deimos, the two moons of Mars, are considered to be scientifically important and potential targets for human missions. Martian Moons eXplorer (MMX) is JAXA's mission to explore Phobos (and/or Deimos), which is scheduled to be launched in 2024. The main spacecraft of MMX will perform in-situ observations of both Phobos and Deimos, land on one of them (most likely Phobos), and bring samples back to Earth. Small landing modules may be included in the mission, as for the Hayabusa-2 mission. The designs of both the landing and sampling devices depend largely on the surface conditions of the target body and on how this surface reacts to an external action in the low-gravity conditions of the target. Thus, the Landing Operation Working Team (LOWT) of MMX, which is composed of both scientists and engineers, is studying Phobos' surface based on previous observations and theoretical/experimental considerations. Though engineering motivation initiated this activity, the results will be extremely useful for scientific purposes.

  18. Safety and efficacy of the NiTi Shape Memory Compression Anastomosis Ring (CAR/ColonRing) for end-to-end compression anastomosis in anterior resection or low anterior resection.

    Science.gov (United States)

    Kang, Jeonghyun; Park, Min Geun; Hur, Hyuk; Min, Byung Soh; Lee, Kang Young; Kim, Nam Kyu

    2013-04-01

    Compression anastomoses may represent an improvement over traditional hand-sewn or stapled techniques. This prospective exploratory study aimed to assess the efficacy and complication rates in patients undergoing anterior resection (AR) or low anterior resection (LAR) anastomosed with a novel end-to-end compression anastomosis ring, the ColonRing. In all, 20 patients (13 male) undergoing AR or LAR were enrolled to be anastomosed using the NiTi Shape Memory End-to-End Compression Anastomosis Ring (NiTi Medical Technologies Ltd, Netanya, Israel). Demographic, intraoperative, and postoperative data were collected. Patients underwent AR (11/20) or LAR using laparoscopy (75%), robotic (10%) surgery, or an open laparotomy (15%) approach, with a median anastomotic level of 14.5 cm (range, 4-25 cm). Defunctioning loop ileostomies were formed in 6 patients for low anastomoses. Surgeons rated the ColonRing device as either easy or very easy to use. One patient developed an anastomotic leakage in the early postoperative period; there were no late postoperative complications. Mean time to passage of first flatus and commencement of oral fluids was 2.5 days and 3.2 days, respectively. Average hospital stay was 12.6 days (range, 8-23 days). Finally, the device was expelled on average 15.3 days postoperatively without difficulty. This is the first study reporting results in a significant number of LAR patients and the first reported experience from South Korea; it shows that the compression technique is surgically feasible, easy to use, and without significant complication rates. A large randomized controlled trial is warranted to investigate the benefits of the ColonRing over traditional stapling techniques.

  19. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    Science.gov (United States)

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staplers. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with an anastomotic stricture on colonoscopy 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis, and its performance is equivalent to that of conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative for colorectal anastomosis in Chinese colorectal cancer patients. PMID:27276395

  20. STS-29 Commander Coats in JSC fixed base (FB) shuttle mission simulator (SMS)

    Science.gov (United States)

    1986-01-01

    STS-29 Discovery, Orbiter Vehicle (OV) 103, Commander Michael L. Coats sits at the commander's station forward flight deck controls in JSC's fixed base (FB) shuttle mission simulator (SMS). Coats, wearing a communications kit assembly headset and flight coveralls, looks away from the forward control panels toward the aft flight deck. The pilot's station seat back appears in the foreground. The FB-SMS is located in JSC's Mission Simulation and Training Facility Bldg 5.

  1. STS-26 Commander Hauck in fixed based (FB) shuttle mission simulator (SMS)

    Science.gov (United States)

    1988-01-01

    STS-26 Discovery, Orbiter Vehicle (OV) 103, Commander Frederick H. Hauck, wearing a communications kit assembly headset and seated in the commander's seat on the forward flight deck, looks over his shoulder toward the aft flight deck. A flight data file (FDF) notebook rests on his lap. The STS-26 crew is training in the fixed base (FB) shuttle mission simulator (SMS) located in JSC's Mission Simulation and Training Facility Bldg 5.

  2. STS-57 crewmembers train in JSC's FB Shuttle Mission Simulator (SMS)

    Science.gov (United States)

    1993-01-01

    STS-57 Endeavour, Orbiter Vehicle (OV) 105, Mission Specialist 2 (MS2) Nancy J. Sherlock, holding computer diskettes and a procedural checklist, discusses equipment operation with Commander Ronald J. Grabe on the middeck of JSC's fixed base (FB) shuttle mission simulator (SMS). Payload Commander (PLC) G. David Low points to a forward locker location as MS3 Peter J.K. Wisoff switches controls on overhead panels MO42F and MO58F, and MS4 Janice E. Voss looks on. The FB-SMS is located in the Mission Simulation and Training Facility Bldg 5.

  3. Internet Technology for Future Space Missions

    Science.gov (United States)

    Hennessy, Joseph F. (Technical Monitor); Rash, James; Casasanta, Ralph; Hogie, Keith

    2002-01-01

    Ongoing work at the National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC) seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth-orbiting spacecraft with downlink data rates from 300 kbps to 4 Mbps. Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.
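
    As a loose illustration of the commodity-protocol idea described above, the sketch below moves a file over a plain TCP/IP socket. It is not the OMNI project's implementation; the host name, port, and chunk size are arbitrary assumptions.

```python
# Minimal sketch (not the OMNI project's code): moving an instrument file over a
# standard TCP/IP socket as a stand-in for the commodity protocols discussed above.
# Host, port, and file names are illustrative assumptions.
import socket

CHUNK = 4096

def send_file(path: str, host: str = "ground.example.org", port: int = 5000) -> None:
    """Push a telemetry file to a ground endpoint over plain TCP."""
    with socket.create_connection((host, port)) as sock, open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            sock.sendall(chunk)

def receive_file(path: str, port: int = 5000) -> None:
    """Accept one connection and write the incoming byte stream to disk."""
    with socket.create_server(("", port)) as server:
        conn, _addr = server.accept()
        with conn, open(path, "wb") as f:
            while data := conn.recv(CHUNK):
                f.write(data)
```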

  4. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  5. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    Science.gov (United States)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  6. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    International Nuclear Information System (INIS)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R

    2016-01-01

    Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment in order to identify and correct possible sources of deviation, and to establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy imaging, planning and delivery process was designed. The test includes analysis of point dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber was used with a Keithley 35617EBS electrometer for point dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and a Varian aS1000 EPID. Gamma analysis was performed to evaluate 2D dose distribution agreement using the MapCheck software and the Varian Portal Dosimetry Application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% (SD 0.93%), 0.507% (SD 0.82%), 0.246% (SD 1.39%) and 0.012% (SD 0.01%) for LoX-3DCRT, HiX-3DCRT, IMRT and RapidArc plans, respectively. Planar doses passed gamma 3%/3 mm in all cases and 2%/2 mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose and evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical
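
    For orientation, the snippet below sketches a brute-force global gamma evaluation of the kind summarized by the 3%/3 mm criterion above. It is a simplified illustration with assumed grid spacing and thresholds, not the MapCheck or Portal Dosimetry implementation.

```python
# Simplified global gamma-index sketch (brute force) illustrating a 3%/3 mm
# pass-rate evaluation; grid spacing and low-dose threshold are assumptions.
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm=1.0, dd_pct=3.0, dta_mm=3.0, threshold=0.1):
    """Return the percentage of evaluated points with gamma <= 1 (global normalization)."""
    dd = dd_pct / 100.0 * ref.max()              # absolute dose-difference criterion
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * spacing_mm   # physical coordinates of the grid
    gammas = []
    for iy in range(ny):
        for ix in range(nx):
            if evl[iy, ix] < threshold * ref.max():   # skip the low-dose region
                continue
            dist2 = (yy - iy * spacing_mm) ** 2 + (xx - ix * spacing_mm) ** 2
            dose2 = (ref - evl[iy, ix]) ** 2
            gammas.append(np.sqrt(dist2 / dta_mm**2 + dose2 / dd**2).min())
    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)
```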

  7. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Directory of Open Access Journals (Sweden)

    Lefeuvre Stéphane

    2017-01-01

    The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part of the paper introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis simulations. The third part reflects some MCA results for a VES16 battery module. Finally, the reader will see some other simulations that were performed by re-using the battery model for another Saft battery cell type (MP XTD) for a specific space application at high temperature.
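
    To make the Monte Carlo Analysis idea concrete, the toy sketch below samples cell-level parameters and pushes them through a deliberately crude constant-current discharge model to estimate a pessimistic runtime percentile. The parameter spreads and the discharge model are illustrative assumptions, not the VES16 PSpice model described above.

```python
# Toy Monte Carlo sketch: sample cell parameters and propagate them through a
# very crude constant-current discharge model. All numbers are assumed.
import random

def discharge_time_h(capacity_ah, internal_res_ohm, load_a=2.0, cutoff_v=2.7, ocv=3.6):
    """Hours until the loaded terminal voltage drops below the cutoff (very crude)."""
    usable = max(0.0, (ocv - internal_res_ohm * load_a - cutoff_v) / (ocv - cutoff_v))
    return capacity_ah * usable / load_a

def monte_carlo(n_runs=10_000, seed=1):
    rng = random.Random(seed)
    times = []
    for _ in range(n_runs):
        cap = rng.gauss(4.5, 0.1)            # Ah, assumed spread
        res = rng.uniform(0.015, 0.025)      # ohm, assumed spread
        times.append(discharge_time_h(cap, res))
    times.sort()
    return times[int(0.01 * n_runs)]         # 1st-percentile (pessimistic) runtime

print(f"Pessimistic discharge time: {monte_carlo():.2f} h")
```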

  8. 3D Printed Surgical Instruments Evaluated by a Simulated Crew of a Mars Mission.

    Science.gov (United States)

    Wong, Julielynn Y; Pfahnl, Andreas C

    2016-09-01

    The first space-based fused deposition modeling (FDM) 3D printer became operational in 2014. This study evaluated whether Mars simulation crewmembers of the Hawai'i Space Exploration Analog and Simulation (HI-SEAS) II mission with no prior surgical experience could utilize acrylonitrile butadiene styrene (ABS) thermoplastic surgical instruments FDM 3D printed on Earth to complete simulated surgical tasks. This study sought to examine the feasibility of using 3D printed surgical tools when the primary crew medical officer is incapacitated and the back-up crew medical officer must conduct a surgical procedure during a simulated extended space mission. During a 4 mo duration ground-based analog mission, five simulation crewmembers with no prior surgical experience completed 16 timed sets of simulated prepping, draping, incising, and suturing tasks to evaluate the relative speed of using four ABS thermoplastic instruments printed on Earth compared to conventional instruments. All four simulated surgical tasks were successfully performed using 3D printed instruments by Mars simulation crewmembers with no prior surgical experience. There was no substantial difference in time to completion of simulated tasks with control vs. 3D printed sponge stick, towel clamp, scalpel handle, and toothed forceps. These limited findings support further investigation into the creation of an onboard digital catalog of validated 3D printable surgical instrument design files to support autonomous, crew-administered healthcare on Mars missions. Future work could include addressing sterility, biocompatibility, and having astronaut crew medical officers test a wider range of surgical instruments printed in microgravity during actual surgical procedures. Wong JY, Pfahnl AC. 3D printed surgical instruments evaluated by a simulated crew of a Mars mission. Aerosp Med Hum Perform. 2016; 87(9):806-810.

  9. Psychological and behavioral changes during confinement in a 520-day simulated interplanetary mission to Mars.

    Directory of Open Access Journals (Sweden)

    Mathias Basner

    Full Text Available Behavioral health risks are among the most serious and difficult to mitigate risks of confinement in spacecraft during long-duration space exploration missions. We report on behavioral and psychological reactions of a multinational crew of 6 healthy males confined in a 550 m³ chamber for 520 days during the first Earth-based, high-fidelity simulated mission to Mars. Rest-activity of crewmembers was objectively measured throughout the mission with wrist-worn actigraphs. Once weekly throughout the mission, crewmembers completed the Beck Depression Inventory-II (BDI-II), the Profile of Mood States short form (POMS), a conflict questionnaire, the Psychomotor Vigilance Test (PVT-B), and a series of visual analogue scales on stress and fatigue. We observed substantial inter-individual differences in the behavioral responses of crewmembers to the prolonged mission confinement and isolation. The crewmember with the highest average POMS total mood disturbance score throughout the mission also reported symptoms of depression in 93% of mission weeks, which reached mild-to-moderate levels in >10% of mission weeks. Conflicts with mission control were reported five times more often than conflicts among crewmembers. Two crewmembers who had the highest ratings of stress and physical exhaustion accounted for 85% of the perceived conflicts. One of them developed a persistent sleep onset insomnia with ratings of poor sleep quality, which resulted in chronic partial sleep deprivation, elevated ratings of daytime tiredness, and frequent deficits in behavioral alertness. Sleep-wake timing was altered in two other crewmembers, beginning in the first few months of the mission and persisting throughout. Two crewmembers showed neither behavioral disturbances nor reports of psychological distress during the 17-month period of mission confinement. These results highlight the importance of identifying behavioral, psychological, and biological markers of characteristics that

  10. Rearrangement of potassium ions and Kv1.1/Kv1.2 potassium channels in regenerating axons following end-to-end neurorrhaphy: ionic images from TOF-SIMS.

    Science.gov (United States)

    Liu, Chiung-Hui; Chang, Hung-Ming; Wu, Tsung-Huan; Chen, Li-You; Yang, Yin-Shuo; Tseng, To-Jung; Liao, Wen-Chieh

    2017-10-01

    The voltage-gated potassium channels Kv1.1 and Kv1.2 that cluster at juxtaparanodal (JXP) regions are essential in the regulation of nerve excitability and play a critical role in axonal conduction. When demyelination occurs, Kv1.1/Kv1.2 activity increases, suppressing the membrane potential nearly to the equilibrium potential of K+, which results in an axonal conduction blockade. The recovery of K+-dependent communication signals and proper clustering of Kv1.1/Kv1.2 channels at JXP regions may directly reflect nerve regeneration following peripheral nerve injury. However, little is known about potassium channel expression and its relationship with the dynamic potassium ion distribution at the node of Ranvier during the regenerative process of peripheral nerve injury (PNI). In the present study, end-to-end neurorrhaphy (EEN) was performed using an in vivo model of PNI. The distribution of K+ at regenerating axons following EEN was detected by time-of-flight secondary-ion mass spectrometry. The specific localization and expression of Kv1.1/Kv1.2 channels were examined by confocal microscopy and western blotting. Our data showed that the re-establishment of K+ distribution and intensity was correlated with the functional recovery of compound muscle action potential morphology in EEN rats. Furthermore, the re-clustering of Kv1.1/1.2 channels 1 and 3 months after EEN at the nodal region of the regenerating nerve corresponded to changes in the K+ distribution. This study provided direct evidence of K+ distribution in regenerating axons for the first time. We proposed that the Kv1.1/Kv1.2 channels re-clustered at the JXP regions of regenerating axons are essential for modulating the proper patterns of K+ distribution in axons for maintaining membrane potential stability after EEN.

  11. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar-coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and to inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as on single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems

  12. Phased mission modelling of systems with maintenance-free operating periods using simulated Petri nets

    Energy Technology Data Exchange (ETDEWEB)

    Chew, S.P.; Dunnett, S.J. [Department of Aeronautical and Automotive Engineering, Loughborough University, Loughborough, Leics (United Kingdom); Andrews, J.D. [Department of Aeronautical and Automotive Engineering, Loughborough University, Loughborough, Leics (United Kingdom)], E-mail: j.d.andrews@lboro.ac.uk

    2008-07-15

    A common scenario in engineering is that of a system which operates throughout several sequential and distinct periods of time, during which the modes and consequences of failure differ from one another. This type of operation is known as a phased mission, and for the mission to be a success the system must successfully operate throughout all of the phases. Examples include a rocket launch and an aeroplane flight. Component or sub-system failures may occur at any time during the mission, yet not affect the system performance until the phase in which their condition is critical. This may mean that the transition from one phase to the next is a critical event that leads to phase and mission failure, with the root cause being a component failure in a previous phase. A series of phased missions with no maintenance may be considered as a maintenance-free operating period (MFOP). This paper describes the use of a Petri net (PN) to model the reliability of the MFOP and phased missions scenario. The model uses Monte-Carlo simulation to obtain its results, and due to the modelling power of PNs, can consider complexities such as component failure rate interdependencies and mission abandonment. The model operates three different types of PN which interact to provide the overall system reliability modelling. The model is demonstrated and validated by considering two simple examples that can be solved analytically.

  13. Phased mission modelling of systems with maintenance-free operating periods using simulated Petri nets

    International Nuclear Information System (INIS)

    Chew, S.P.; Dunnett, S.J.; Andrews, J.D.

    2008-01-01

    A common scenario in engineering is that of a system which operates throughout several sequential and distinct periods of time, during which the modes and consequences of failure differ from one another. This type of operation is known as a phased mission, and for the mission to be a success the system must successfully operate throughout all of the phases. Examples include a rocket launch and an aeroplane flight. Component or sub-system failures may occur at any time during the mission, yet not affect the system performance until the phase in which their condition is critical. This may mean that the transition from one phase to the next is a critical event that leads to phase and mission failure, with the root cause being a component failure in a previous phase. A series of phased missions with no maintenance may be considered as a maintenance-free operating period (MFOP). This paper describes the use of a Petri net (PN) to model the reliability of the MFOP and phased missions scenario. The model uses Monte-Carlo simulation to obtain its results, and due to the modelling power of PNs, can consider complexities such as component failure rate interdependencies and mission abandonment. The model operates three different types of PN which interact to provide the overall system reliability modelling. The model is demonstrated and validated by considering two simple examples that can be solved analytically
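
    As a rough illustration of the Monte Carlo aspect of the approach described above, the sketch below estimates the reliability of a toy phased mission with exponentially distributed component failure times. The phase structure and failure rates are invented for illustration, and the Petri-net machinery (failure-rate interdependencies, mission abandonment) is not reproduced.

```python
# Monte Carlo sketch of phased-mission reliability with exponential failure times.
# Phase durations, critical-component sets, and failure rates are assumed.
import random

# (phase duration in hours, components whose survival is critical in that phase)
PHASES = [(2.0, {"engine", "nav"}), (10.0, {"nav", "radio"}), (1.0, {"engine"})]
FAILURE_RATE = {"engine": 1e-3, "nav": 5e-4, "radio": 2e-3}   # per hour, assumed

def mission_succeeds(rng: random.Random) -> bool:
    fail_time = {c: rng.expovariate(lam) for c, lam in FAILURE_RATE.items()}
    elapsed = 0.0
    for duration, critical in PHASES:
        elapsed += duration
        # A component that failed earlier only matters once a phase needs it,
        # capturing the "root cause in a previous phase" effect noted above.
        if any(fail_time[c] <= elapsed for c in critical):
            return False
    return True

def estimate_reliability(n=100_000, seed=42) -> float:
    rng = random.Random(seed)
    return sum(mission_succeeds(rng) for _ in range(n)) / n

print(f"Estimated mission reliability: {estimate_reliability():.4f}")
```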

  14. Constellations of Next Generation Gravity Missions: Simulations regarding optimal orbits and mitigation of aliasing errors

    Science.gov (United States)

    Hauk, M.; Pail, R.; Gruber, T.; Purkhauser, A.

    2017-12-01

    The CHAMP and GRACE missions have demonstrated the tremendous potential for observing mass changes in the Earth system from space. In order to fulfil future user needs, monitoring of mass distribution and mass transport with higher spatial and temporal resolution is required. This can be achieved by a Bender-type Next Generation Gravity Mission (NGGM) consisting of a constellation of satellite pairs flying in (near-)polar and inclined orbits, respectively. For these satellite pairs the observation concept of the GRACE Follow-On mission, with a laser-based low-low satellite-to-satellite tracking (ll-SST) system, more precise accelerometers and state-of-the-art star trackers, is adopted. By choosing optimal orbit constellations for these satellite pairs, high-frequency mass variations will be observable and temporal aliasing errors from under-sampling will no longer be the limiting factor. As part of the European Space Agency (ESA) study "ADDCON" (ADDitional CONstellation and Scientific Analysis Studies of the Next Generation Gravity Mission), a variety of mission design parameters for such constellations are investigated by full numerical simulations. These simulations aim at investigating the impact of several orbit design choices and at the mitigation of aliasing errors in the gravity field retrieval by co-parametrization for various constellations of Bender-type NGGMs. Choices for orbit design parameters such as altitude profiles during mission lifetime, length of retrieval period, value of sub-cycles and choice of prograde versus retrograde orbits are investigated as well. Results of these simulations are presented and optimal constellations for NGGMs are identified. Finally, a short outlook towards new geophysical applications, such as a near real time service for hydrology, is given.

  15. Behavioral and biological effects of autonomous versus scheduled mission management in simulated space-dwelling groups

    Science.gov (United States)

    Roma, Peter G.; Hursh, Steven R.; Hienz, Robert D.; Emurian, Henry H.; Gasior, Eric D.; Brinson, Zabecca S.; Brady, Joseph V.

    2011-05-01

    Logistical constraints during long-duration space expeditions will limit the ability of Earth-based mission control personnel to manage their astronaut crews and will thus increase the prevalence of autonomous operations. Despite this inevitability, little research exists regarding crew performance and psychosocial adaptation under such autonomous conditions. To this end, a newly-initiated study on crew management systems was conducted to assess crew performance effectiveness under rigid schedule-based management of crew activities by Mission Control versus more flexible, autonomous management of activities by the crews themselves. Nine volunteers formed three long-term crews and were extensively trained in a simulated planetary geological exploration task over the course of several months. Each crew then embarked on two separate 3-4 h missions in a counterbalanced sequence: Scheduled, in which the crews were directed by Mission Control according to a strict topographic and temporal region-searching sequence, and Autonomous, in which the well-trained crews received equivalent baseline support from Mission Control but were free to explore the planetary surface as they saw fit. Under the autonomous missions, performance in all three crews improved (more high-valued geologic samples were retrieved), subjective self-reports of negative emotional states decreased, unstructured debriefing logs contained fewer references to negative emotions and greater use of socially-referent language, and salivary cortisol output across the missions was attenuated. The present study provides evidence that crew autonomy may improve performance and help sustain if not enhance psychosocial adaptation and biobehavioral health. These controlled experimental data contribute to an emerging empirical database on crew autonomy which the international astronautics community may build upon for future research and ultimately draw upon when designing and managing missions.

  16. Multiagent Modeling and Simulation in Human-Robot Mission Operations Work System Design

    Science.gov (United States)

    Sierhuis, Maarten; Clancey, William J.; Sims, Michael H.; Shafto, Michael (Technical Monitor)

    2001-01-01

    This paper describes a collaborative multiagent modeling and simulation approach for designing work systems. The Brahms environment is used to model mission operations for a semi-autonomous robot mission to the Moon at the work practice level. It shows the impact of human-decision making on the activities and energy consumption of a robot. A collaborative work systems design methodology is described that allows informal models, created with users and stakeholders, to be used as input to the development of formal computational models.

  17. Scalable Integrated Multi-Mission Support System Simulator Release 3.0

    Science.gov (United States)

    Kim, John; Velamuri, Sarma; Casey, Taylor; Bemann, Travis

    2012-01-01

    The Scalable Integrated Multi-mission Support System (SIMSS) is a tool that performs a variety of test activities related to spacecraft simulations and ground segment checks. SIMSS is a distributed, component-based, plug-and-play client-server system useful for performing real-time monitoring and communications testing. SIMSS runs on one or more workstations and is designed to be user-configurable or to use predefined configurations for routine operations. SIMSS consists of more than 100 modules that can be configured to create, receive, process, and/or transmit data. The SIMSS/GMSEC innovation is intended to provide missions with a low-cost solution for implementing their ground systems, as well as significantly reducing a mission's integration time and risk.

  18. Effective teamwork and communication mitigate task saturation in simulated critical care air transport team missions.

    Science.gov (United States)

    Davis, Bradley; Welch, Katherine; Walsh-Hart, Sharon; Hanseman, Dennis; Petro, Michael; Gerlach, Travis; Dorlac, Warren; Collins, Jocelyn; Pritts, Timothy

    2014-08-01

    Critical Care Air Transport Teams (CCATTs) are a critical component of the United States Air Force evacuation paradigm. This study was conducted to assess the incidence of task saturation in simulated CCATT missions and to determine if there are predictable performance domains. Sixteen CCATTs were studied over a 6-month period. Performance was scored using a tool assessing eight domains of performance. Teams were also assessed during critical events to determine the presence or absence of task saturation and its impact on patient care. Sixteen simulated missions were reviewed and 45 crisis events identified. Task saturation was present in 22/45 (49%) of crisis events. Scoring demonstrated that task saturation was associated with poor performance in teamwork (odds ratio [OR] = 1.96), communication (OR = 2.08), and mutual performance monitoring (OR = 1.9), but not maintenance of guidelines, task management, procedural skill, and equipment management. We analyzed the effect of task saturation on adverse patient outcomes during crisis events. Adverse outcomes occurred more often when teams were task saturated as compared to non-task-saturated teams (91% vs. 23%; RR 4.1, p < 0.0001). Task saturation is observed in simulated CCATT missions. Nontechnical skills correlate with task saturation. Task saturation is associated with worsening physiologic derangements in simulated patients. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  19. Simulation of the Chang'E-5 mission contribution in lunar long wavelength gravity field improvement

    Science.gov (United States)

    Yan, Jianguo; Yang, Xuan; Ping, Jinsong; Ye, Mao; Liu, Shanhong; Jin, Weitong; Li, Fei; Barriot, Jean-Pierre

    2018-06-01

    The precision of lunar gravity field estimation has improved by three to five orders of magnitude since the successful GRAIL lunar mission. There are still discrepancies, however, in the low-degree coefficients and long-wavelength components of the solutions developed by two space research centers (JPL and GSFC). These discrepancies hint at the possibilities for improving the accuracy in the long-wavelength part of the lunar gravity field. In the near future, China will launch the Chang'E-5 lunar mission. In this sample-return mission, there will be a chance to perform K-band range-rate (KBRR) measurements between an ascending module and an orbiting module. These two modules will fly around the Moon at an inclination of ~49 degrees, with an orbital height of 100 km and an inter-satellite distance of 200 km. In our research, we simulated the contribution of the KBRR tracking mode for different GRAIL orbital geometries. This analysis indicated possible deficiencies in the low-degree coefficient solutions for the polar satellite-to-satellite tracking mode at various orbital heights. We also investigated the potential contributions of the KBRR to the Chang'E-5 mission goal of lunar gravity field recovery, especially in the long-wavelength component. Potential improvements were assessed using various power spectra of the lunar gravity field models. In addition, we also investigated possible improvements in solving the lunar tidal Love number k2. These results may assist the implementation of the Chang'E-5 mission.

  20. Desert Rats 2011 Mission Simulation: Effects of Microgravity Operational Modes on Fields Geology Capabilities

    Science.gov (United States)

    Bleacher, Jacob E.; Hurtado, J. M., Jr.; Meyer, J. A.

    2012-01-01

    Desert Research and Technology Studies (DRATS) is a multi-year series of NASA tests that deploy planetary surface hardware and exercise mission and science operations in difficult conditions to advance human and robotic exploration capabilities. DRATS 2011 (Aug. 30-Sept. 9, 2011) tested strategies for human exploration of microgravity targets such as near-Earth asteroids (NEAs). Here we report the crew perspective on the impact of simulated microgravity operations on our capability to conduct field geology.

  1. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    Science.gov (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Lewandowski, Beth; Platts, Steven H.

    2011-01-01

    Mars and NEO missions will expose astronauts to extended durations of reduced gravity, isolation and higher radiation. These new operating conditions pose health risks that are not well understood and perhaps unanticipated. Advanced computational simulation environments can beneficially augment research to predict, assess and mitigate potential hazards to astronaut health. The NASA Digital Astronaut Project (DAP), within the NASA Human Research Program, strives to achieve this goal.

  2. Mission operations management

    Science.gov (United States)

    Rocco, David A.

    1994-01-01

    Redefining the approach and philosophy that operations management uses to define, develop, and implement space missions will be a central element in achieving high-efficiency mission operations for the future. The goal of a cost-effective space operations program cannot be realized if the attitudes and methodologies we currently employ to plan, develop, and manage space missions do not change. A management philosophy that is in sync with the environment in terms of budget, technology, and science objectives must be developed. Changing our basic perception of mission operations will require a shift in the way we view the mission. This requires a transition from current practices of viewing the mission as a unique end product to a 'mission development concept' built on the visualization of the end-to-end mission. To achieve this change we must define realistic mission success criteria and develop pragmatic approaches to achieve our goals. Custom mission development for all but the largest and most unique programs is not practical in the current budget environment, and we simply do not have the resources to implement all of our planned science programs. We need to shift our management focus to allow us the opportunity to make use of methodologies and approaches which are based on common building blocks that can be utilized in the space, ground, and mission-unique segments of all missions.

  3. Operation and evaluation of the Terminal Configured Vehicle Mission Simulator in an automated terminal area metering and spacing ATC environment

    Science.gov (United States)

    Houck, J. A.

    1980-01-01

    This paper describes the work being done at the National Aeronautics and Space Administration's Langley Research Center on the development of a mission simulator for use in the Terminal Configured Vehicle Program. A brief description of the goals and objectives of the Terminal Configured Vehicle Program is presented. A more detailed description of the Mission Simulator, in its present configuration, and its components is provided. Finally, a description of the first research study conducted in the Mission Simulator is presented along with a discussion of some preliminary results from this study.

  4. Development of a multi-media crew-training program for the terminal configured vehicle mission simulator

    Science.gov (United States)

    Houck, J. A.; Markos, A. T.

    1980-01-01

    This paper describes the work being done at the National Aeronautics and Space Administration's (NASA) Langley Research Center on the development of a multi-media crew-training program for the Terminal Configured Vehicle (TCV) Mission Simulator. Brief descriptions of the goals and objectives of the TCV Program and of the TCV Mission Simulator are presented. A detailed description of the training program is provided along with a description of the performance of the first group of four commercial pilots to be qualified in the TCV Mission Simulator.

  5. MDP: Reliable File Transfer for Space Missions

    Science.gov (United States)

    Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.
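
    To illustrate the NACK-oriented repair idea behind MDP/NORM, the sketch below simulates block delivery over a lossy channel in memory, with the receiver requesting retransmission of only the blocks it missed. The block size, loss probability, and repair loop are assumptions for illustration; this is not the MDP or NORM protocol itself.

```python
# Toy illustration of NACK-based reliable delivery over a lossy channel,
# simulated in memory rather than over a real space link. All parameters assumed.
import random

def lossy_send(blocks, loss_prob, rng):
    """Deliver each block independently with probability (1 - loss_prob)."""
    return {seq: data for seq, data in blocks.items() if rng.random() >= loss_prob}

def nack_transfer(payload: bytes, block_size=1024, loss_prob=0.2, seed=0):
    rng = random.Random(seed)
    blocks = {i: payload[i * block_size:(i + 1) * block_size]
              for i in range((len(payload) + block_size - 1) // block_size)}
    received = lossy_send(blocks, loss_prob, rng)
    rounds = 0
    while len(received) < len(blocks):     # receiver NACKs the gaps it detects
        missing = {seq: blocks[seq] for seq in blocks if seq not in received}
        received.update(lossy_send(missing, loss_prob, rng))
        rounds += 1
    return b"".join(received[seq] for seq in sorted(received)), rounds

data, repair_rounds = nack_transfer(b"x" * 100_000)
print(len(data), "bytes reassembled after", repair_rounds, "repair rounds")
```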

  6. A simulated avalanche search and rescue mission induces temporary physiological and behavioural changes in military dogs.

    Science.gov (United States)

    Diverio, Silvana; Barbato, Olimpia; Cavallina, Roberta; Guelfi, Gabriella; Iaboni, Martina; Zasso, Renato; Di Mari, Walter; Santoro, Michele Matteo; Knowles, Toby G

    2016-09-01

    Saving human lives is of paramount importance in avalanche rescue missions. Avalanche military dogs represent an invaluable resource in these operations. However, their performance can be influenced by several environmental, social and transport challenges. If too severe, these are likely to activate a range of responses to stress, which might put the dogs' welfare at risk. The aim of this study was to assess the physiological and behavioural responses of a group of military dogs to a Simulated Avalanche Search and Rescue mission (SASR). Seventeen avalanche dogs from the Italian Military Force Guardia di Finanza (SAGF dogs) were monitored during a simulated search for a buried operator in an artificial avalanche area. Heart rate (HR), rectal body temperature (RBT) and blood samples were collected at rest the day before the trial (T0), immediately after helicopter transport at the onset of the SASR (T1), after the discovery of the buried operator (T2) and 2 h later (T3). Heart rate, rectal body temperature, cortisol, aspartate aminotransferase (AST), creatine kinase (CK), non-esterified fatty acids (NEFA) and lactate dehydrogenase (LDH) were measured. During the search mission the behaviour of each SAGF dog was measured by focal animal sampling and qualitatively assessed by its handler and two observers. Inter-rater agreement was evaluated. Snow and environmental variables were also measured. All dogs successfully completed their search for the buried, simulated victim within 10 min. The SASR was shown to exert significant increases on RBT, NEFA and cortisol. An association with the dog's search mission ability was found only for motivation, signalling behaviour, signs of stress and possessive reward playing. More time signalling was related to shorter search time. In conclusion, despite extreme environmental and training conditions, only temporary physiological and behavioural changes were recorded in the avalanche dogs. Their excellent performance in successful simulated SASR

  7. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    Science.gov (United States)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization
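
    As a loose illustration of the detector effects listed above (shot noise, read noise, saturation, quantization), the sketch below applies a simple noise chain to a synthetic point-source scene. The gain, full-well, and bit-depth values are generic assumptions rather than PROXOR™ parameters, and the optical PSF stage is omitted.

```python
# Minimal detector-noise sketch in the spirit of the sensor effects listed above.
# All detector parameters are illustrative assumptions.
import numpy as np

def apply_detector(photon_image, read_noise_e=5.0, full_well_e=80_000,
                   bit_depth=12, gain_e_per_dn=20.0, rng=None):
    rng = rng or np.random.default_rng(0)
    electrons = rng.poisson(photon_image).astype(float)           # shot noise
    electrons += rng.normal(0.0, read_noise_e, electrons.shape)   # read noise
    electrons = np.clip(electrons, 0, full_well_e)                # saturation
    dn = np.round(electrons / gain_e_per_dn)                      # quantization
    return np.clip(dn, 0, 2 ** bit_depth - 1).astype(np.uint16)

# Unresolved (point-source) RSO on a faint sky background.
scene = np.full((64, 64), 50.0)      # background photons per pixel
scene[32, 32] += 20_000.0            # satellite signal concentrated in one pixel
frame = apply_detector(scene)
print(frame.max(), frame.mean())
```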

  8. Use of Web 2.0 Technologies for Public Outreach on a Simulated Mars Mission

    Science.gov (United States)

    Shiro, B.; Palaia, J.; Ferrone, K.

    2009-12-01

    Recent advances in social media and internet communications have revolutionized the ways people interact and disseminate information. Astronauts are already starting to take advantage of these tools by blogging and tweeting from space, and almost all NASA missions now have presences on the major social networking sites. One priority for future human explorers on Mars will be communicating their experiences to the people back on Earth. During July 2009, a six-member crew of volunteers carried out a simulated Mars mission at the Flashline Mars Arctic Research Station (FMARS) on Devon Island in the Canadian Arctic. Living in a habitat, conducting EVAs wearing spacesuits, and observing communication delays with "Earth," the crew endured restrictions similar to those that will be faced by future human Mars explorers. Throughout the expedition, crewmembers posted regular blog entries, reports, photos, videos, and updates to their website and social media outlets Twitter, Facebook, YouTube, and Picasa Web Albums. During the sixteen EVAs of their field science research campaign, FMARS crewmembers collected GPS track information and took geotagged photos using GPS-enabled cameras. They combined their traverse GPS tracks with photo location information into KML/KMZ files that website visitors can view in Google Maps or Google Earth. Although the crew observed a strict 20-minute communication delay with "Earth" to simulate a real Mars mission, they broke this rule to conduct four very successful live webcasts with student groups using Skype, since education and public outreach were important objectives of the endeavor. This presentation will highlight the use of Web 2.0 technologies for public outreach during the simulated Mars expedition and the implications for other remote scientific journeys. [Figure caption: The author embarks on a "rover" to carry out an EVA near the FMARS Habitat; the satellite dish to the right of the structure was used for all communications with the remote ...]

  9. Euso-Balloon: A pathfinder mission for the JEM-EUSO experiment

    Energy Technology Data Exchange (ETDEWEB)

    Osteria, Giuseppe, E-mail: osteria@na.infn.it [Istituto Nazionale di Fisica Nucleare Sezione di Napoli, Naples (Italy); Scotti, Valentina [Istituto Nazionale di Fisica Nucleare Sezione di Napoli, Naples (Italy); Università di Napoli Federico II, Dipartimento di Fisica, Naples (Italy)

    2013-12-21

    EUSO-Balloon is a pathfinder mission for JEM-EUSO, the near-UV telescope proposed to be installed on board the ISS in 2017. The main objective of this pathfinder mission is to perform a full scale end-to-end test of all the key technologies and instrumentation of JEM-EUSO detectors and to prove the entire detection chain. EUSO-Balloon will measure the atmospheric and terrestrial UV background components, in different observational modes, fundamental for the development of the simulations. Through a series of flights performed by the French Space Agency CNES, EUSO-Balloon also has the potential to detect Extensive Air Showers (EAS) from above. EUSO-Balloon will be mounted in an unpressurized gondola of a stratospheric balloon. We will describe the instrument and the electronic system which performs instrument control and data management in such a critical environment.

  10. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    Science.gov (United States)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to a varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: if the physical act of sensing has a cost, how does the system determine whether the science value of a measurement is worth the cost, and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data
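
    The skeleton below sketches one way the three modules named above (ADAPT, MANAGE, COLLABORATE) could be wired together in code. The class interfaces, budget bookkeeping, and value-per-cost heuristic are assumptions made for illustration; the abstract does not specify them.

```python
# Skeleton sketch of the ADAPT / MANAGE / COLLABORATE structure named above.
# Interfaces, budgets, and the selection heuristic are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    sensor_id: str
    science_value: float   # expected information gain (assumed metric)
    power_cost: float      # J, assumed
    data_cost: float       # MB, assumed

class Adapt:
    """Propose a candidate sensor configuration for the current scene."""
    def propose(self, sensor_id: str, scene: dict) -> Observation:
        value = 2.0 if scene.get("interesting") else 0.5
        return Observation(sensor_id, value, power_cost=10.0, data_cost=1.0)

class Manage:
    """Accept an observation only if the platform budgets allow it."""
    def __init__(self, power_budget: float, data_budget: float):
        self.power, self.data = power_budget, data_budget
    def approve(self, obs: Observation) -> bool:
        ok = obs.power_cost <= self.power and obs.data_cost <= self.data
        if ok:
            self.power -= obs.power_cost
            self.data -= obs.data_cost
        return ok

class Collaborate:
    """Pick the highest value-per-cost proposal across the constellation."""
    def select(self, proposals: list[Observation]) -> Observation:
        return max(proposals, key=lambda o: o.science_value / (o.power_cost + o.data_cost))

adapt, manage, collab = Adapt(), Manage(100.0, 10.0), Collaborate()
proposals = [adapt.propose(s, {"interesting": s == "sat-2"}) for s in ("sat-1", "sat-2")]
best = collab.select(proposals)
print(best.sensor_id, manage.approve(best))
```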

  11. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    Science.gov (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.

    2010-01-01

    With the ambitious goals to send manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation and reduced gravity. These new operating conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operation conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing the physiology, biomechanics and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operation conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing, experiment design, and to reduce animal/human testing. A practical application of the DA to assess pre- and post-flight responses to exercise is illustrated, and the difficulty in matching true physiological responses is discussed.

  12. A simulation based optimization approach to model and design life support systems for manned space missions

    Science.gov (United States)

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainties in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help making design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.

  13. An IP-Based Software System for Real-time, Closed Loop, Multi-Spacecraft Mission Simulations

    Science.gov (United States)

    Cary, Everett; Davis, George; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    This viewgraph presentation provides information on the architecture of a computerized testbed for simulating Distributed Space Systems (DSS) for controlling spacecraft flying in formation. The presentation also discusses and diagrams the Distributed Synthesis Environment (DSE) for simulating and planning DSS missions.

  14. Identification of the main processes underlying ecosystem functioning in the Eastern English Channel, with a focus on flatfish species, as revealed through the application of the Atlantis end-to-end model

    Science.gov (United States)

    Girardin, Raphaël; Fulton, Elizabeth A.; Lehuta, Sigrid; Rolland, Marie; Thébaud, Olivier; Travers-Trolet, Morgane; Vermard, Youen; Marchal, Paul

    2018-02-01

    The ecosystem model Atlantis was used to investigate the key dynamics and processes that structure the Eastern English Channel ecosystem, with a particular focus on two commercial flatfish species, sole (Solea solea) and plaice (Pleuronectes platessa). This complex model was parameterized with data collected from diverse sources (a literature review, survey data, as well as landings and stock assessment information) and tuned so both simulated biomass and catch fit 2002-2011 observations. Here, the outputs are mainly presented for the two focus species and for some other vertebrates found to be important in the trophic network. The calibration process revealed the importance of coastal areas in the Eastern English Channel and of nutrient inputs from estuaries: a lack of river nutrients decreases the productivity of nursery grounds and adversely affects the production of sole and plaice. The role of discards in the trophic network is also highlighted. While sole and plaice did not have a strong influence on the trophic network of vertebrates, they are important predators for benthic invertebrates and compete for food with crustaceans, whiting (Merlangius merlangus) and other demersal fish. We also found that two key species, cod (Gadus morhua) and whiting, thoroughly structured the Eastern English Channel trophic network.

  15. CH4 IPDA Lidar mission data simulator and processor for MERLIN: prototype development at LMD/CNRS/Ecole Polytechnique

    Science.gov (United States)

    Olivier, Chomette; Armante, Raymond; Crevoisier, Cyril; Delahaye, Thibault; Edouart, Dimitri; Gibert, Fabien; Nahan, Frédéric; Tellier, Yoann

    2018-04-01

    The MEthane Remote sensing Lidar missioN (MERLIN), currently in phase C, is a joint cooperation between France and Germany on the development of a spaceborne Integrated Path Differential Absorption (IPDA) LIDAR (LIght Detection And Ranging) to conduct global observations of atmospheric methane. This presentation will focus on the status of a LIDAR mission data simulator and processor developed at LMD (Laboratoire de Météorologie Dynamique), Ecole Polytechnique, France, for MERLIN to assess the performances in realistic observational situations.
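
    For context, the core observable that an IPDA processor of this kind typically retrieves can be written in the commonly quoted form below (a textbook relation, assumed here rather than taken from the MERLIN processor specification), where P and E denote the received and emitted pulse energies at the on-line and off-line wavelengths and WF is the CH4 weighting function:

```latex
% Commonly quoted IPDA lidar relations (assumed textbook form, not the MERLIN
% processor specification): the differential absorption optical depth and the
% column-averaged dry-air mole fraction of methane derived from it.
\mathrm{DAOD} \;=\; \tfrac{1}{2}\,
  \ln\!\left(\frac{P_{\mathrm{off}}\,E_{\mathrm{on}}}{P_{\mathrm{on}}\,E_{\mathrm{off}}}\right),
\qquad
\mathrm{XCH_4} \;=\; \frac{\mathrm{DAOD}}{\displaystyle\int_{0}^{p_{\mathrm{surf}}} \mathrm{WF}(p)\,\mathrm{d}p}
```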

  16. Use of Web 2.0 Technologies for Public Outreach on a Simulated Mars Mission

    Science.gov (United States)

    Ferrone, Kristine; Shiro, Brian; Palaia, Joseph E., IV

    2009-01-01

    Recent advances in social media and internet communications have revolutionized the ways people interact and disseminate information. Astronauts are already taking advantage of these tools by blogging and tweeting from space, and almost all NASA missions now have presences on the major social networking sites. One priority for future human explorers on Mars will be communicating their experiences to the people back on Earth. During July 2009, a 6-member crew of volunteers carried out a simulated Mars mission at the Flashline Mars Arctic Research Station (FMARS). The Mars Society built the mock Mars habitat in 2000-01 to help develop key knowledge and inspire the public for human Mars exploration. It is located on Devon Island about 1600 km from the North Pole within the Arctic Circle. The structure is situated on the rim of Haughton Crater in an environment geologically and biologically analogous to Mars. Living in a habitat, conducting EVAs wearing spacesuits, and observing communication delays with "Earth," the crew endured restrictions similar to those that will be faced by future human Mars explorers. Throughout the expedition, crewmembers posted daily blog entries, reports, photos, videos, and updates to their website and social media outlets Twitter, Facebook, YouTube, and Picasa Web Albums. During the sixteen EVAs of their field science research campaign, FMARS crewmembers collected GPS track information and took geotagged photos using GPS-enabled cameras. They combined their traverse GPS tracks with photo location information into KML/KMZ files that website visitors can view in Google Earth.

  17. Wireless Monitoring of Changes in Crew Relations during Long-Duration Mission Simulation.

    Directory of Open Access Journals (Sweden)

    Bernd Johannes

    Full Text Available Group structure and cohesion along with their changes over time play an important role in the success of missions where crew members spend prolonged periods of time under conditions of isolation and confinement. Therefore, an objective system for unobtrusive monitoring of crew cohesion and possible individual stress reactions is of high interest. For this purpose, an experimental wireless group structure (WLGS) monitoring system integrated into a mobile psychophysiological system was developed. In the presented study the WLGS module was evaluated separately in six male subjects (27-38 years old) participating in a 520-day simulated mission to Mars. Two days per week, each crew member wore a small sensor that registered the presence and distance of the sensors either worn by the other subjects or strategically placed throughout the isolation facility. The registrations between pairs of sensors were in accordance 91.0% of the time on average. A correspondence of 95.7% with the survey video on day 475 confirmed external reliability. An integrated score of the "crew relation time index" was calculated and analyzed over time. Correlation analyses of a sociometric questionnaire (r = .35-.55, p < .05) and an ethological group approach (r = .45-.66, p < .05) provided initial evidence of the method's validity as a measure of cohesion when taking behavioral and activity patterns into account (e.g. only including activity phases in the afternoon). This confirms our assumption that the registered amount of time spent together during free time is associated with the intensity of personal relationships.

  18. Simulation and Control Lab Development for Power and Energy Management for NASA Manned Deep Space Missions

    Science.gov (United States)

    McNelis, Anne M.; Beach, Raymond F.; Soeder, James F.; McNelis, Nancy B.; May, Ryan; Dever, Timothy P.; Trase, Larry

    2014-01-01

    The development of distributed hierarchical and agent-based control systems will allow for reliable autonomous energy management and power distribution for on-orbit missions. Power is one of the most critical systems on board a space vehicle, requiring quick response time when a fault or emergency is identified. As NASA's missions with human presence extend beyond low Earth orbit, autonomous control of vehicle power systems will be necessary and will need to function reliably for long periods of time. In the design of autonomous electrical power control systems there is a need to dynamically simulate and verify the electrical power system (EPS) controller functionality prior to use on orbit. This paper presents the work at NASA Glenn Research Center in Cleveland, Ohio, where the development of a controls laboratory is being completed that will be utilized to demonstrate advanced prototype EPS controllers for space, aeronautical and terrestrial applications. The control laboratory hardware and software, and the application of an autonomous controller for demonstration with the ISS electrical power system, are the subject of this paper.

  19. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  20. Space Geodetic Technique Co-location in Space: Simulation Results for the GRASP Mission

    Science.gov (United States)

    Kuzmicz-Cieslak, M.; Pavlis, E. C.

    2011-12-01

    The Global Geodetic Observing System (GGOS) places very stringent requirements on the accuracy and stability of future realizations of the International Terrestrial Reference Frame (ITRF): an origin definition at 1 mm or better at epoch and a temporal stability on the order of 0.1 mm/y, with similar numbers for the scale (0.1 ppb) and orientation components. These goals were derived from the requirements of Earth science problems that are currently the international community's highest priority. None of the geodetic positioning techniques can achieve this goal alone. This is due in part to the non-observability of certain attributes from a single technique. Another limitation is imposed by the extent and uniformity of the tracking network and the schedule of observational availability and number of suitable targets. The final limitation derives from the difficulty of "tying" the reference points of each technique at the same site to an accuracy that will support the GGOS goals. The future GGOS network will address decisively the ground segment and, to a certain extent, the space segment requirements. The JPL-proposed multi-technique mission GRASP (Geodetic Reference Antenna in Space) attempts to resolve the accurate tie between techniques, using their co-location in space, onboard a well-designed spacecraft equipped with GNSS receivers, a SLR retroreflector array, a VLBI beacon and a DORIS system. Using the anticipated system performance for all four techniques at the time the GGOS network is completed (ca. 2020), we generated a number of simulated data sets for the development of a TRF. Our simulation studies examine the degree to which GRASP can improve the inter-technique "tie" issue compared to the classical approach, and the likely modus operandi for such a mission. The success of the examined scenarios is judged by the quality of the origin and scale definition of the resulting TRF.

  1. In-Vessel Composting of Simulated Long-Term Missions Space-Related Solid Wastes

    Science.gov (United States)

    Rodriguez-Carias, Abner A.; Sager, John; Krumins, Valdis; Strayer, Richard; Hummerick, Mary; Roberts, Michael S.

    2002-01-01

    Reduction and stabilization of solid wastes generated during space missions is a major concern for the Advanced Life Support - Resource Recovery program at the NASA Kennedy Space Center. Solid wastes provide substrates for pathogen proliferation, produce strong odor, and increase storage requirements during space missions. A five-period experiment was conducted to evaluate the Space Operation Bioconverter (SOB), an in-vessel composting system, as a biological processing technology to reduce and stabilize simulated long-term mission space-related solid wastes (SRSW). For all periods, SRSW were sorted into components with fast (FBD) and slow (SBD) biodegradability. Uneaten food and plastic were used as the major FBD and SBD components, respectively. Compost temperature (°C), CO2 production (%), mass reduction (%), and final pH were utilized as criteria to determine compost quality. In period 1, the SOB was loaded with a 55% FBD: 45% SBD mixture and was allowed to compost for 7 days. An eleven-day second composting period was conducted loading the SOB with 45% pre-composted SRSW and 55% FBD. Periods 3 and 4 evaluated the use of styrofoam as a bulking agent and the substitution of regular plastic with degradable plastic on the composting characteristics of SRSW, respectively. The use of ceramic as a bulking agent and the relationship between initial FBD mass and heat production were investigated in period 5. Composting SRSW resulted in an acidic fermentation with a minor increase in compost temperature, low CO2 production, and slight mass reduction. Addition of styrofoam as a bulking agent and substitution of regular plastic with biodegradable plastic improved the composting characteristics of SRSW, as evidenced by higher pH, CO2 production, compost temperature and mass reduction. Using ceramic as a bulking agent and increasing the initial FBD mass (4.4 kg) did not improve the composting process. In summary, the SOB is a potential biological technology for reduction and stabilization of mission space

  2. Mercury Conditions for the MESSENGER Mission Simulated in High- Solar-Radiation Vacuum Tests

    Science.gov (United States)

    Wong, Wayne A.

    2003-01-01

    The MESSENGER (Mercury Surface, Space Environment, Geochemistry, and Ranging) spacecraft, planned for launch in March 2004, will perform two flybys of Mercury before entering a year-long orbit of the planet in September 2009. The mission will provide opportunities for detailed characterization of the surface, interior, atmosphere, and magnetosphere of the closest planet to the Sun. The NASA Glenn Research Center and the MESSENGER spacecraft integrator, the Johns Hopkins University Applied Physics Laboratory, have partnered under a Space Act Agreement to characterize a variety of critical components and materials under simulated conditions expected near Mercury. Glenn's Vacuum Facility 6, which is equipped with a solar simulator, can simulate the vacuum and high solar radiation anticipated in Mercury orbit. The MESSENGER test hardware includes a variety of materials and components that are being characterized during the Tank 6 vacuum tests, where the hardware will be exposed to up to 11 suns of insolation, simulating conditions expected in Mercury orbit. In 2002, ten solar vacuum tests were conducted, including beginning of life, end of life, backside exposure, and solar panel thermal shock cycling tests. Components tested include candidate solar array panels, sensors, thermal shielding materials, and communication devices. As an example, for the solar panel thermal shock cycling test, two candidate solar array panels were suspended on a lift mechanism that lowered the panels into a liquid-nitrogen-cooled box. After reaching -140 °C, the panels were then lifted out of the box and exposed to the equivalent of 6 suns (8.1 kilowatts per square meter). After five cold soak/heating cycles were completed successfully, there was no apparent degradation in panel performance. An anticipated 100-hr thermal shield life test is planned for autumn, followed by solar panel flight qualification tests in winter. Glenn's ongoing support to the MESSENGER program has been instrumental in
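
    As a quick back-of-the-envelope check (an editorial note, not part of the original record), the quoted irradiances are consistent with scaling a solar constant of roughly 1.35 kW/m², the value implied by the 6-sun figure:

      6 \times 1.35\ \mathrm{kW\,m^{-2}} \approx 8.1\ \mathrm{kW\,m^{-2}}, \qquad 11 \times 1.35\ \mathrm{kW\,m^{-2}} \approx 14.9\ \mathrm{kW\,m^{-2}}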

  3. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  4. Exploration Mission Particulate Matter Filtration Technology Performance Testing in a Simulated Spacecraft Cabin Ventilation System

    Science.gov (United States)

    Agui, Juan H.; Vijayakumar, R.; Perry, Jay L.; Frederick, Kenneth R.; Mccormick, Robert M.

    2017-01-01

    Human deep space exploration missions will require advances in long-life, low maintenance airborne particulate matter filtration technology. As one of the National Aeronautics and Space Administration's (NASA) developments in this area, a prototype of a new regenerable, multi-stage particulate matter filtration technology was tested in an International Space Station (ISS) module simulation facility. As previously reported, the key features of the filter system include inertial and media filtration with regeneration and in-place media replacement techniques. The testing facility can simulate aspects of the cabin environment aboard the ISS and contains flight-like cabin ventilation system components. The filtration technology test article was installed at the inlet of the central ventilation system duct and instrumented to provide performance data under nominal flow conditions. In-place regeneration operations were also evaluated. The real-time data included pressure drop across the filter stages, process air flow rate, ambient pressure, humidity and temperature. In addition, two video cameras positioned at the filtration technology test article's inlet and outlet were used to capture the mechanical performance of the filter media indexing operation under varying air flow rates. Recent test results are presented and future design recommendations are discussed.

  5. Data-driven simulations of the Landsat Data Continuity Mission (LDCM) platform

    Science.gov (United States)

    Gerace, Aaron; Gartley, Mike; Schott, John; Raqueño, Nina; Raqueño, Rolando

    2011-06-01

    The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are two new sensors being developed by the Landsat Data Continuity Mission (LDCM) that will extend over 35 years of archived Landsat data. In a departure from the whiskbroom design used by all previous generations of Landsat, the LDCM system will employ a pushbroom technology. Although the newly adopted modular array, pushbroom architecture has several advantages over the previous whiskbroom design, registration of the multi-spectral data products is a concern. In this paper, the Digital Imaging and Remote Sensing Image Generation (DIRSIG) tool was used to simulate an LDCM collection, which gives the team access to data that would not otherwise be available prior to launch. The DIRSIG model was used to simulate the two-instrument LDCM payload in order to study the geometric and radiometric impacts of the sensor design on the proposed processing chain. The Lake Tahoe area located in eastern California was chosen for this work because of its dramatic change in elevation, which was ideal for studying the geometric effects of the new Landsat sensor design. Multi-modal datasets were used to create the Lake Tahoe site model for use in DIRSIG. National Elevation Dataset (NED) data were used to create the digital elevation map (DEM) required by DIRSIG, QuickBird data were used to identify different material classes in the scene, and ASTER and Hyperion spectral data were used to assign radiometric properties to those classes. In order to model a realistic Landsat orbit in these simulations, orbital parameters were obtained from a Landsat 7 two-line element set and propagated with the SGP4 orbital position model. Line-of-sight vectors defining how the individual detector elements of the OLI and TIRS instruments project through the optics were measured and provided by NASA. Additionally, the relative spectral response functions for the 9 bands of OLI and the 2 bands of TIRS were measured and provided by NASA
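
    The orbit-modelling step mentioned above (propagating a Landsat 7 two-line element set with the SGP4 model) follows a standard pattern; the sketch below shows that pattern using the open-source python-sgp4 package. The element set is a placeholder with illustrative values, not the actual elements used in the DIRSIG study.

      # Minimal SGP4 propagation sketch using the open-source "sgp4" Python package.
      # The two-line element set below is a placeholder with illustrative values only.
      from sgp4.api import Satrec, jday

      line1 = "1 25682U 99020A   11152.50000000  .00000136  00000-0  38016-4 0  9990"
      line2 = "2 25682  98.2147 218.8445 0001262  90.0123 270.1234 14.57110345641230"

      sat = Satrec.twoline2rv(line1, line2)

      # Evaluate the orbit at a chosen UTC epoch; SGP4 returns position/velocity in the TEME frame.
      jd, fr = jday(2011, 6, 1, 18, 30, 0.0)
      err, r_km, v_km_s = sat.sgp4(jd, fr)
      if err == 0:
          print("TEME position (km):   ", r_km)
          print("TEME velocity (km/s): ", v_km_s)
      else:
          print("SGP4 propagation error code:", err)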

  6. End-to-End Multi-View Lipreading

    NARCIS (Netherlands)

    Petridis, Stavros; Wang, Yujiang; Li, Zuwei; Pantic, Maja

    2017-01-01

    Non-frontal lip views contain useful information which can be used to enhance the performance of frontal view lipreading. However, the vast majority of recent lipreading works, including the deep learning approaches which significantly outperform traditional approaches, have focused on frontal mouth

  7. End-to-end visual speech recognition with LSTMS

    NARCIS (Netherlands)

    Petridis, Stavros; Li, Zuwei; Pantic, Maja

    2017-01-01

    Traditional visual speech recognition systems consist of two stages, feature extraction and classification. Recently, several deep learning approaches have been presented which automatically extract features from the mouth images and aim to replace the feature extraction stage. However, research on

  8. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  9. End-to-End Service Oriented Architectures (SOA) Security Project

    Science.gov (United States)

    2012-02-01

    Java 6.0 (javax.ws) platform and deployed on boston.cs.purdue.edu. TB stores all data regarding sessions and services in a MySQL database, setup on... pointcut designators. JBoss AOP [JBO2] and AspectJ [ASP1] are powerful frameworks that implement AOP for Java programs. Its pointcut designators...

  10. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically integrate in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experimental management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
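
    LEM's actual interface is not shown in the record above, but the idea it describes (stating experimental goals and letting the framework expand and dispatch the individual subexperiments) can be illustrated with a generic sketch. All names and parameters below are hypothetical and are not LEM's API.

      # Generic parameter-sweep sketch (hypothetical, not the LEM interface): describe the goal as
      # ranges of independent variables and let the tool expand them into individual subexperiments.
      import itertools

      experiment = {
          "nodes":      [64, 128, 256],
          "stripe_kb":  [64, 1024],
          "io_pattern": ["n_to_1", "n_to_n"],
      }

      def dispatch(params):
          # A real framework would hand this to the site scheduler (Condor, Moab, SGE, ...);
          # here we only print the command line that would be generated for each subexperiment.
          args = " ".join(f"--{name}={value}" for name, value in params.items())
          print("would submit: run_io_benchmark " + args)

      names = list(experiment)
      for combo in itertools.product(*experiment.values()):
          dispatch(dict(zip(names, combo)))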

  11. Using SIM for strong end-to-end Application Authentication

    OpenAIRE

    Lunde, Lars; Wangensteen, Audun

    2006-01-01

    Today the Internet is mostly used for services that require little or no security. The commercial and governmental applications have started to emerge but have met problems since they require strong authentication, which is both difficult and costly to realize. The SIM card used in mobile phones is a tamper resistant device that contains strong authentication mechanisms. It would be very convenient and cost-efficient if Internet services could use authentication methods based on the SIM. This mast...

  12. Network analysis on skype end-to-end video quality

    NARCIS (Netherlands)

    Exarchakos, Georgios; Druda, Luca; Menkovski, Vlado; Liotta, Antonio

    2015-01-01

    Purpose – This paper aims to argue on the efficiency of Quality of Service (QoS)-based adaptive streaming with regard to perceived quality, i.e. Quality of Experience (QoE). Although QoS parameters are extensively used even by high-end adaptive streaming algorithms, achieved QoE fails to justify their use

  13. End to End Beam Dynamics of the ESS Linac

    DEFF Research Database (Denmark)

    Thomsen, Heine Dølrath

    2012-01-01

    The European Spallation Source, ESS, uses a linear accelerator to deliver a high intensity proton beam to the target station. The nominal beam power on target will be 5 MW at an energy of 2.5 GeV. We briefly describe the individual accelerating structures and transport lines through which we have...

  14. MISSION PROFILE AND DESIGN CHALLENGES FOR MARS LANDING EXPLORATION

    Directory of Open Access Journals (Sweden)

    J. Dong

    2017-07-01

    Full Text Available An orbiter and a descent module will be delivered to Mars in the first Chinese Mars exploration mission. The descent module is composed of a landing platform and a rover. The module will be released into the atmosphere by the orbiter and make a controlled landing on the Martian surface. After landing, the rover will egress from the platform to start its science mission. The rover payloads mainly include the subsurface radar, terrain camera, multispectral camera, magnetometer and anemometer, to achieve the scientific investigation of the terrain, soil characteristics, material composition, magnetic field, atmosphere, etc. The landing process is divided into three phases (entry phase, parachute descent phase and powered descent phase), which are full of risks. There exist many uncertain parameters and design constraints that affect the selection of the landing sites and the phase-switch events (mortar deployment of the parachute, separation of the heat shield and cut-off of the parachute). A number of new technologies (disk-gap-band parachute, guidance and navigation, etc.) need to be developed. Mars and Earth have gravity and atmosphere conditions that are significantly different from one another, and meaningful Martian environmental conditions cannot be recreated on Earth, so a full-scale flight validation on Earth is difficult. Therefore end-to-end simulation and some critical subsystem tests must be considered instead. The challenges above and the corresponding design solutions are introduced in this paper, which can provide a reference for the Mars exploration mission.

  15. A noise simulator for eLISA: Migrating LISA Pathfinder knowledge to the eLISA mission

    OpenAIRE

    Armano, M.; Audley, H.; Auger, G.; Baird, J.; Binetruy, P.; Born, Michael; Bortoluzzi, D.; Brandt, N.; Bursi, A.; Caleno, M.; Cavalleri, A.; Cesarini, A.; Cruise, M.; Danzmann, Karsten; Diepholz, I.

    2015-01-01

    We present a new technical simulator for the eLISA mission, based on state space modeling techniques and developed in MATLAB. This simulator computes the coordinate and velocity over time of each body involved in the constellation, i.e. the spacecraft and its test masses, taking into account the different disturbances and actuations. This allows studying the contribution of instrumental noises and system imperfections on the residual acceleration applied on the TMs, the latter reflecting the ...
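
    The state-space approach mentioned above can be illustrated in general terms. The sketch below is a toy discrete-time propagation of a single test-mass state with process noise; the matrices, noise level, and time step are invented for illustration and are unrelated to the actual eLISA simulator models.

      # Discrete-time state-space propagation sketch: x[k+1] = A x[k] + B u[k] + w[k].
      # The matrices describe a toy 1-D test mass (position, velocity) and are illustrative only.
      import numpy as np

      dt = 0.1                                   # time step [s]
      A = np.array([[1.0, dt], [0.0, 1.0]])      # free-drift dynamics
      B = np.array([[0.5 * dt**2], [dt]])        # acceleration (actuation) input
      noise_std = 1e-9                           # crude stand-in for a disturbance level

      rng = np.random.default_rng(0)
      x = np.zeros(2)                            # initial position and velocity
      history = []
      for k in range(1000):
          u = np.array([0.0])                    # commanded acceleration (none in this toy run)
          w = rng.normal(0.0, noise_std, size=2) # process noise per step
          x = A @ x + (B @ u) + w
          history.append(x.copy())

      print("final state (position, velocity):", history[-1])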

  16. Crew Transportation System Design Reference Missions

    Science.gov (United States)

    Mango, Edward J.

    2015-01-01

    Contains summaries of potential design reference mission goals for systems to transport humans to and from low Earth orbit (LEO) for the Commercial Crew Program. The purpose of this document is to describe Design Reference Missions (DRMs) representative of the end-to-end Crew Transportation System (CTS) framework envisioned to successfully execute commercial crew transportation to orbital destinations. The initial CTS architecture will likely be optimized to support NASA crew and NASA-sponsored crew rotation missions to the ISS, but consideration may be given in this design phase to allow for modifications in order to accomplish other commercial missions in the future. With the exception of NASA’s mission to the ISS, the remaining commercial DRMs are notional. Any decision to design or scar the CTS for these additional non-NASA missions is completely up to the Commercial Provider. As NASA’s mission needs evolve over time, this document will be periodically updated to reflect those needs.

  17. CH4 IPDA Lidar mission data simulator and processor for MERLIN: prototype development at LMD/CNRS/Ecole Polytechnique

    Directory of Open Access Journals (Sweden)

    Olivier Chomette

    2018-01-01

    Full Text Available The MEthane Remote sensing Lidar missioN (MERLIN), currently in phase C, is a joint cooperation between France and Germany on the development of a spatial Integrated Path Differential Absorption (IPDA) LIDAR (LIght Detecting And Ranging) to conduct global observations of atmospheric methane. This presentation will focus on the status of a LIDAR mission data simulator and processor developed at LMD (Laboratoire de Météorologie Dynamique), Ecole Polytechnique, France, for MERLIN to assess the performances in realistic observational situations.

  18. Changes in stress hormones and metabolism during a 105-day simulated Mars mission.

    Science.gov (United States)

    Strollo, Felice; Vassilieva, Galina; Ruscica, Massimiliano; Masini, Mariangela; Santucci, Daniela; Borgia, Luisa; Magni, Paolo; Celotti, Fabio; Nikiporuc, Igor

    2014-08-01

    The Mars-105 project was aimed at simulating crew's activities, workload, and communication during a mission to Mars, evaluating the homeostatic adaptations to prolonged confinement and cohabitation. Fasting plasma glucose (FPG) and insulin, C-peptide, leptin, cortisol, and NGF and BDNF plasma levels were monitored in six healthy nonsmoking male subjects taking part in a 105-d Mars mission simulation. Samples were collected from each subject before (0 wk), during (2.5 wk; 5 wk; 10 wk; 15 wk), and after confinement (+1 wk). Confinement resulted in impaired glucometabolic parameters, since FPG increased during the first 5 wk (baseline: 85.2 ± 10.8 mg · dl⁻¹; 2.5 wk: 98.4 ± 4.7 mg · dl⁻¹; 5 wk: 92.5 ± 6.0 mg · dl⁻¹) and insulin dropped at 2.5 wk (baseline: 14.4 ± 4.8 mU · L⁻¹; 2.5 wk: 7.7 ± 2.1 mU · L⁻¹), subsequently returning to baseline values. HOMA-IR paralleled plasma insulin, dropping to 1.8 ± 0.5 at 2.5 wk (baseline: 3.0 ± 1.2). At all time-points tested, plasma leptin levels were decreased (baseline: 4.4 ± 3.3 ng · dl⁻¹; 2.5 wk: 1.6 ± 1.2 ng · dl⁻¹; 5 wk: 1.3 ± 0.8 ng · dl⁻¹; 10 wk: 1.5 ± 1.1 ng · dl⁻¹; 15 wk:1.7 ± 0.8 ng · dl⁻¹), whereas cortisol levels were increased (baseline: 10.8 ± 4.9 ng · dl⁻¹; 2.5 wk: 16.8 ± 3.5 ng · dl⁻¹; 5 wk: 18.1 ± 7.6 ng · dl⁻¹; 10 wk: 18.1 ± 8.3 ng · dl⁻¹; 15 wk:14.2 ± 4.4 ng · dl⁻¹), resulting in a negative correlation between these hormones. BDNF levels increased only at 5 and 10 wk (baseline: 67.1 ± 36.0 pg · ml⁻¹; 5 wk: 164 ± 54 pg · ml⁻¹; and 10 wk: 110.2 ± 28.9 pg · ml⁻¹). The data obtained with the Mars-105 experiment suggest that environmental stress has a strong impact upon metabolic and stress response, indicating the need for further studies and the implementation of specific countermeasures.
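
    For readers unfamiliar with HOMA-IR, the values quoted above are consistent with the standard homeostasis model assessment formula (this check is an editorial addition, not part of the original abstract; group means need not reproduce the reported per-subject averages exactly):

      \mathrm{HOMA\text{-}IR} = \frac{\text{fasting glucose}\,[\mathrm{mg\,dl^{-1}}] \times \text{fasting insulin}\,[\mathrm{mU\,L^{-1}}]}{405}, \qquad \frac{85.2 \times 14.4}{405} \approx 3.0, \qquad \frac{98.4 \times 7.7}{405} \approx 1.9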

  19. Damage assessment of mission essential buildings based on simulation studies of low yield explosives

    Science.gov (United States)

    Allen, Thomas G. L.

    2006-04-01

    There has been a lack of investigations related to low yield explosives used by terrorists against small but high occupancy buildings. Also, mitigating the threat of terrorist attacks against high occupancy buildings with network equipment essential to the mission of an organization is a challenging task. At the same time, it is difficult to predict how, why, and when terrorists may attack these assets. Many factors must be considered in creating a safe building environment. Although it is possible that the dominant threat mode may change in the future, bombings have historically been a favorite tactic of terrorists. Ingredients for homemade bombs are easily obtained on the open market, as are the techniques for making bombs. Bombings are easy and quick to execute. This paper discusses the problems with, and provides insights from, experience gained in analyzing small scale explosions on older military base buildings. In this study, we examine the placement of various bombs on buildings using the shock wave simulation code CTH and examine the damage effects on the interior of the building, particularly the damage incurred on a computer center. These simulation experiments provide data on the effectiveness of a building's security and an understanding of the phenomenology of shocks as they propagate through rooms and corridors. Its purpose is to motivate researchers to take seriously the threat of small yield explosives to moderately sized buildings. Visualizations from this analysis are used to understand the complex flow of the air blasts around corridors and hallways. Finally, we make suggestions for improving the mitigation of such terrorist attacks. The intent of this study is not to provide breakthrough technology, but to provide a tool and a means for analyzing the material hardness of a building and to eventually provide the incentive for more security. The information mentioned in this paper is public domain information and easily available via the

  20. An Optical Lightning Simulator in an Electrified Cloud-Resolving Model to Prepare the Future Space Lightning Missions

    Science.gov (United States)

    Bovalo, Christophe; Defer, Eric; Pinty, Jean-Pierre

    2016-04-01

    The coming decade will see the launch of several space missions designed to monitor the total lightning activity. Among these missions, the American (Geostationary Lightning Mapper - GLM) and European (Lightning Imager - LI) optical detectors will be onboard geostationary satellites (GOES-R and MTG, respectively). For the first time, the total lightning activity will be monitored over the full Earth disk and at a very high temporal resolution (2 and 1 ms, respectively). Missions like the French Tool for the Analysis of Radiation from lightNIng and Sprites (TARANIS) and ISS-LIS will bring complementary information in order to better understand the lightning physics and to improve the weather prediction (nowcasting and forecasting). Such missions will generate a huge volume of new and original observations for the scientific community and weather prediction centers that have to be prepared. Moreover, before the launch of these missions, fundamental questions regarding the interpretation of the optical signal property and its relation to cloud optical thickness and lightning discharge processes need to be further investigated. An innovative approach proposed here is to use the synergy existing in the French MesoNH Cloud-Resolving Model (CRM). Indeed, MesoNH is one of the only CRMs able to simulate the lifecycle of electrical charges generated within clouds through a non-inductive charging process (dependent on the 1-moment microphysical scheme). The lightning flash geometry is based on a fractal law while the electric field is diagnosed using Gauss's law. The lightning optical simulator is linked to the electrical scheme as the lightning radiance at 777.4 nm is a function of the lightning current, approximated by the charges neutralized along the lightning path. The scattering of this signal by hydrometeors (mainly ice particles) is also taken into account. Simulations at 1-km resolution are done over the Langmuir Laboratory (New

  1. Discrete event simulation and the resultant data storage system response in the operational mission environment of Jupiter-Saturn /Voyager/ spacecraft

    Science.gov (United States)

    Mukhopadhyay, A. K.

    1978-01-01

    The Data Storage Subsystem Simulator (DSSSIM), which simulates (in ground software) the occurrence of discrete events in the Voyager mission, is described. Functional requirements for Data Storage Subsystems (DSS) simulation are discussed, and discrete event simulation/DSSSIM processing is covered. Four types of outputs associated with a typical DSSSIM run are presented, and DSSSIM limitations and constraints are outlined.
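
    As general background on the technique named in the record (this is a generic discrete-event skeleton, not the Voyager DSSSIM ground software), a discrete event simulator can be reduced to a time-ordered event queue whose handlers may schedule further events:

      # Generic discrete-event simulation skeleton (illustrative; not the actual DSSSIM software).
      # Events are processed in time order from a priority queue; handlers may push new events.
      import heapq
      import itertools

      _counter = itertools.count()

      def schedule(queue, time, name, payload):
          # The counter breaks ties so the heap never needs to compare payload dictionaries.
          heapq.heappush(queue, (time, next(_counter), name, payload))

      def run(initial_events, horizon):
          queue = []
          for time, name, payload in initial_events:
              schedule(queue, time, name, payload)
          while queue and queue[0][0] <= horizon:
              time, _, name, payload = heapq.heappop(queue)
              print(f"t={time:8.1f} s  event={name:<15} payload={payload}")
              # Example handler: stop a recording after its stated duration.
              if name == "start_record":
                  schedule(queue, time + payload["duration"], "stop_record", payload)

      run([(0.0, "start_record", {"track": 1, "duration": 600.0}),
           (1200.0, "start_playback", {"track": 1})],
          horizon=3600.0)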

  2. The magnetic shield design and simulation of an X-ray spectrometer for Chang'E mission

    International Nuclear Information System (INIS)

    Zhang Jiayu; Wang Huanyu; Zhang Chengmo; Yang Jiawei; Liang Xiaohua; Wang Jinzhou; Cao Xuelei; Gao Min; Cui Xingzhu; Peng Wenxi

    2008-01-01

    Basic design methods for the magnetic shield of an X-ray spectrometer for the Chang'E Mission are introduced in this paper. The real magnetic field distribution was obtained through a measurement experiment, and according to the measurement results, a simulation to evaluate the magnetic shielding effect was carried out. The results showed that the collimator plays an effective role in magnetically shielding against electrons. (authors)

  3. Software Environment for Mission Design, Simulation, and Engineering Data Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA designs and develops the next generation of scientific and space exploration vehicles and missions, there is a growing need for a robust, flexible, and...

  4. Discrete Event Simulation of a Suppression of Enemy Air Defenses (SEAD) Mission

    National Research Council Canada - National Science Library

    Candir, Ahmet A

    2008-01-01

    ...) operations have been a crucial element of military air power for 50 years. Several developments and evolution in both air defense and attack systems suggest that SEAD missions will continue to have growing importance to air forces...

  5. Extended object-oriented Petri net model for mission reliability simulation of repairable PMS with common cause failures

    International Nuclear Information System (INIS)

    Wu, Xin-yang; Wu, Xiao-Yue

    2015-01-01

    Phased Mission Systems (PMS) have several phases with different success criteria. Generally, traditional analytical methods need to make some assumptions when they are applied for reliability evaluation and analysis of complex PMS, for example, that the components are non-repairable or are not subject to common cause failures (CCF). However, the evaluation and analysis results may be inapplicable when the assumptions do not agree with the practical situation. In this article, we propose an extended object-oriented Petri net (EOOPN) model for mission reliability simulation of repairable PMS with CCFs. Based on the object-oriented Petri net (OOPN), EOOPN defines four reusable sub-models to depict PMS at the system, phase, or component levels respectively, logic transitions to depict complex component reliability logic in a more readable form, and a broadcast place to transmit shared information among components synchronously. After extension, EOOPN can conveniently deal with repairable PMS with both external and internal CCFs. The mission reliability modelling, simulation and analysis using EOOPN are illustrated by a PMS example. The results demonstrate that the proposed EOOPN model is effective. - Highlights: • The EOOPN model was effective in reliability simulation for repairable PMS with CCFs. • EOOPN has a modular and hierarchical structure. • New elements of EOOPN make the modelling process more convenient and friendlier. • EOOPN has better model reusability and readability than other PNs
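
    To make the phased-mission idea concrete (this is a crude Monte Carlo sketch for illustration only; it is not the EOOPN model and it ignores repair and common cause failures), mission reliability can be estimated by sampling component lifetimes against per-phase success criteria:

      # Crude Monte Carlo sketch of phased-mission reliability. Phase durations, failure rates,
      # and success criteria are invented; repair and CCFs (handled by EOOPN) are not modelled.
      import random

      # Each phase: (duration in hours, set of components that must all be working in that phase)
      phases = [(10.0, {"A", "B"}), (50.0, {"A", "C"}), (5.0, {"B", "C"})]
      failure_rate = {"A": 1e-3, "B": 5e-4, "C": 2e-3}   # per-hour exponential failure rates

      def mission_succeeds(rng):
          alive = set(failure_rate)
          for duration, required in phases:
              for comp in list(alive):
                  # Sample an exponential time-to-failure within this phase
                  if rng.expovariate(failure_rate[comp]) < duration:
                      alive.discard(comp)
              if not required <= alive:
                  return False
          return True

      rng = random.Random(42)
      n = 100_000
      successes = sum(mission_succeeds(rng) for _ in range(n))
      print(f"estimated mission reliability: {successes / n:.4f}")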

  6. Investigation of Bio-Regenerative Life Support and Trash-to-Gas Experiment on a 4-Month Mars Simulation Mission

    Science.gov (United States)

    Caraccio, Anne; Poulet, Lucie; Hintze, Paul E.; Miles, John D.

    2014-01-01

    Future crewed missions to other planets or deep space locations will require regenerative Life Support Systems (LSS) as well as recycling processes for mission waste. Constant resupply of many commodity materials will not be a sustainable option for deep space missions, nor will stowing trash on board a vehicle or at a lunar or Martian outpost. The habitable volume will decline as the volume of waste increases. A complete regenerative environmentally controlled life support system (ECLSS) on an extra-terrestrial outpost will likely include physico-chemical and biological technologies, such as bioreactors and greenhouse modules. Physico-chemical LSS do not enable food production and bio-regenerative LSS are not stable enough to be used alone in space. Mission waste that cannot be recycled into the bio-regenerative ECLSS can include excess food, food packaging, clothing, tape, urine and fecal waste. This waste will be sent to a system for converting the trash into high value products. Two crew members on a 120 day Mars analog simulation, in collaboration with Kennedy Space Center's (KSC) Trash to Gas (TtG) project, investigated a semi-closed loop system that treated non-edible biomass and other logistical waste for volume reduction and conversion into useful commodities. The purpose of this study is to show how plant growth affects the amount of resources required by the habitat and how spent plant material can be recycled. Real-time data was sent to the reactor at KSC in Florida for replicating the analog mission waste for laboratory operation. This paper discusses the 120 day mission plant growth activity, logistical and plant waste management, power and water consumption effects of the plant and logistical waste, and potential energy conversion techniques using KSC's TtG technology.

  7. Investigation of Bio-Regenerative Life Support and Trash-To-Gas Experiment on a 4 Month Mars Simulation Mission

    Science.gov (United States)

    Caraccio, Anne; Poulet, Lucie; Hintze, Paul E.; Miles, John D.

    2014-01-01

    Future crewed missions to other planets or deep space locations will require regenerative Life Support Systems (LSS) as well as recycling processes for mission waste. Constant resupply of many commodity materials will not be a sustainable option for deep space missions, nor will storing trash on board a vehicle or at a lunar or Martian outpost. The habitable volume will decline as the volume of waste increases. A complete regenerative environmentally controlled life support system (ECLSS) on an extra-terrestrial outpost will likely include physico-chemical and biological technologies, such as bioreactors and greenhouse modules. Physico-chemical LSS do not enable food production and bio-regenerative LSS are not stable enough to be used alone in space. Mission waste that cannot be recycled into the bio-regenerative ECLSS can include excess food, food packaging, clothing, tape, urine and fecal waste. This waste will be sent to a system for converting the trash into high value products. Two crew members on a 120 day Mars analog simulation, in collaboration with Kennedy Space Center's (KSC) Trash to Gas (TtG) project, investigated a semi-closed loop system that treated non-edible biomass and other logistical waste for volume reduction and conversion into useful commodities. The purpose of this study is to show how plant growth affects the amount of resources required by the habitat and how spent plant material can be recycled. Real-time data was sent to the reactor at KSC in Florida for replicating the analog mission waste for laboratory operation. This paper discusses the 120 day mission plant growth activity, logistical and plant waste management, power and water consumption effects of the plant and logistical waste, and potential energy conversion techniques using KSC's TtG reactor technology.

  8. Lessons Learned from Biosphere 2: When Viewed as a Ground Simulation/Analogue for Long Duration Human Space Exploration and Settlement

    Science.gov (United States)

    MacCallum, T.; Poynter, J.; Bearden, D.

    A human mission to Mars, or a base on the Moon or Mars, is a longer and more complex mission than any space endeavor undertaken to date. Ground simulations provide a relevant, analogous environment for testing technologies and learning how to manage complex, long duration missions, while addressing inherent mission risks. Multiphase human missions and settlements that may preclude a rapid return to Earth, require high fidelity, end-to-end, at least full mission duration tests in order to evaluate a system's ability to sustain the crew for the entire mission and return the crew safely to Earth. Moreover, abort scenarios are essentially precluded in many mission scenarios, though certain risks may only become evident late in the mission. Aging and compounding effects cannot be simulated through accelerated tests for all aspects of the mission. Until such high fidelity long duration simulations are available, and in order to help prepare those simulations and mission designs, it is important to extract as many lessons as possible from analogous environments. Possibly the best analogue for a long duration space mission is the two year mission of Biosphere 2. Biosphere 2 is a three-acre materially closed ecological system that supported eight crewmembers with food, air and water in a sunlight driven bioregenerative system for two years. It was designed for research applicable to environmental management on Earth and the development of human life support for space. A brief overview of the two-year Biosphere 2 mission is presented, followed by select data and lessons learned that are applicable to the design and operation of a long duration human space mission, settlement or test bed. These lessons include technical, programmatic, and psychological issues

  9. Jake Garn Mission Simulator and Training Facility, Building 5, Historical Documentation

    Science.gov (United States)

    Slovinac, Trish; Deming, Joan

    2010-01-01

    In response to President George W. Bush's announcement in January 2004 that the Space Shuttle Program (SSP) would end in 2010, the National Aeronautics and Space Administration (NASA) completed a nation-wide historical survey and evaluation of NASA-owned facilities and properties (real property assets) at all its Centers and component facilities. The buildings and structures which supported the SSP were inventoried and assessed as per the criteria of eligibility for listing in the National Register of Historic Places (NRHP) in the context of this program. This study was performed in compliance with Section 110 of the National Historic Preservation Act (NHPA) of 1966 (Public Law 89-665), as amended; the National Environmental Policy Act (NEPA) of 1969 (Public Law 91-190); Executive Order (EO) 11593: Protection and Enhancement of the Cultural Environment; EO 13287, Preserve America, and other relevant legislation. As part of this nation-wide study, in September 2006, a historical survey and evaluation of NASA-owned and managed facilities was conducted by NASA's Lyndon B. Johnson Space Center (JSC) in Houston, Texas. The results of this study are presented in a report entitled, "Survey and Evaluation of NASA-owned Historic Facilities and Properties in the Context of the U.S. Space Shuttle Program, Lyndon B. Johnson Space Center, Houston, Texas," prepared in November 2007 by NASA JSC's contractor, Archaeological Consultants, Inc. As a result of this survey, the Jake Garn Mission Simulator and Training Facility (Building 5) was determined eligible for listing in the NRHP, with concurrence by the Texas State Historic Preservation Officer (SHPO). The survey concluded that Building 5 is eligible for the NRHP under Criteria A and C in the context of the U.S. Space Shuttle program (1969-2010). Because it has achieved significance within the past 50 years, Criteria Consideration G applies. At the time of this documentation, Building 5 was still used to support the SSP as an

  10. Field Simulation of a Drilling Mission to Mars to Search for Subsurface Life

    Science.gov (United States)

    Stoker, C. R.; Lemke, L. G.; Cannon, H.; Glass, B.; Dunagan, S.; Zavaleta, J.; Miller, D.; Gomez-Elvira, J.

    2005-01-01

    The discovery of near surface ground ice by the Mars Odyssey mission and the abundant evidence for recent gully features observed by the Mars Global Surveyor mission support longstanding theoretical arguments for subsurface liquid water on Mars. Thus, implementing the Mars program goal to search for life points to drilling on Mars to reach liquid water, collecting samples and analyzing them with instrumentation to detect in situ organisms and biomarker compounds. Searching for life in the subsurface of Mars will require drilling, sample extraction and handling, and new technologies to find and identify biomarker compounds and search for living organisms. In spite of its obvious advantages, robotic drilling for Mars exploration is in its technological infancy and has yet to be demonstrated in even a terrestrial field environment.

  11. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
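
    XIMPOL's own interfaces are documented with the package; purely to illustrate the general idea of folding a source model through a response function as described above (the numbers, spectral shape, and effective-area curve below are invented and these are not XIMPOL calls), a rough count-rate estimate looks like:

      # Illustrative folding of a toy power-law photon spectrum through a toy effective-area curve
      # to estimate an expected count rate.  Shapes and values are invented; not the XIMPOL API.
      import numpy as np

      energy = np.linspace(2.0, 8.0, 601)                             # keV grid
      de = energy[1] - energy[0]
      spectrum = 0.1 * energy ** -2.0                                 # photons / (cm^2 s keV)
      effective_area = 300.0 * np.exp(-((energy - 3.0) / 3.0) ** 2)   # cm^2, toy response shape

      count_rate = float(np.sum(spectrum * effective_area) * de)      # counts / s
      print(f"expected count rate: {count_rate:.2f} counts/s")

      # A polarized source would further modulate the azimuthal angle distribution roughly as
      # N(phi) proportional to 1 + mu * P * cos(2 * (phi - phi0)), with mu the modulation factor.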

  12. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    Science.gov (United States)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
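
    The record does not spell out SDB's induction algorithm, so the sketch below illustrates the general idea (expert-classified samples in, human-readable rules out) with a scikit-learn decision tree; the telemetry features, values, and labels are invented, and the choice of a decision tree is an assumption rather than SDB's actual method.

      # Generic rule-induction sketch: fit a small decision tree to expert-labelled telemetry samples
      # and print it as human-readable rules.  Data and labels are invented; scikit-learn is assumed.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      features = ["pump_pressure", "valve_temp"]
      X = np.array([[30.1, 290.0], [29.8, 291.5], [12.4, 305.2], [11.9, 307.8], [30.5, 322.0]])
      y = ["nominal", "nominal", "low_pressure", "low_pressure", "overtemp"]   # expert classification

      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=features))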

  13. Radiation beamline testbeds for the simulation of planetary and spacecraft environments for human and robotic mission risk assessment

    Science.gov (United States)

    Wilkins, Richard

    The Center for Radiation Engineering and Science for Space Exploration (CRESSE) at Prairie View A&M University, Prairie View, Texas, USA, is establishing an integrated, multi-disciplinary research program on the scientific and engineering challenges faced by NASA and the international space community caused by space radiation. CRESSE focuses on space radiation research directly applicable to astronaut health and safety during future long term, deep space missions, including Martian, lunar, and other planetary body missions beyond low Earth orbit. The research approach will consist of experimental and theoretical radiation modeling studies utilizing particle accelerator facilities including: 1. NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory; 2. Proton Synchrotron at Loma Linda University Medical Center; and 3. Los Alamos Neutron Science Center (LANSCE) at Los Alamos National Laboratory. Specifically, CRESSE investigators are designing, developing, and building experimental test beds that simulate the lunar and Martian radiation environments for experiments focused on risk assessment for astronauts and instrumentation. The testbeds have been designated the Bioastronautics Experimental Research Testbeds for Environmental Radiation Nostrum Investigations and Education (BERT and ERNIE). The designs of BERT and ERNIE will allow for a high degree of flexibility and adaptability to modify experimental configurations to simulate planetary surface environments, planetary habitats, and spacecraft interiors. In the nominal configuration, BERT and ERNIE will consist of a set of experimental zones that will simulate the planetary atmosphere (Solid CO2 in the case of the Martian surface.), the planetary surface, and sub-surface regions. These experimental zones can be used for dosimetry, shielding, biological, and electronic effects radiation studies in support of space exploration missions. BERT and ERNIE are designed to be compatible with the

  14. A simulation of the Four-way lunar Lander-Orbiter tracking mode for the Chang'E-5 mission

    Science.gov (United States)

    Li, Fei; Ye, Mao; Yan, Jianguo; Hao, Weifeng; Barriot, Jean-Pierre

    2016-06-01

    The Chang'E-5 mission is the third phase of the Chinese Lunar Exploration Program and will collect and return lunar samples. After sampling, the Orbiter and the ascent vehicle will rendezvous and dock, and both spacecraft will require high precision orbit navigation. In this paper, we present a novel tracking mode, Four-way lunar Lander-Orbiter tracking, that could be employed during the Chang'E-5 mission. The mathematical formulas for the Four-way lunar Lander-Orbiter tracking mode are given and implemented in our newly-designed lunar spacecraft orbit determination and gravity field recovery software, the LUnar Gravity REcovery and Analysis Software/System (LUGREAS). The simulated observables permit analysis of the potential contribution Four-way lunar Lander-Orbiter tracking could make to precision orbit determination for the Orbiter. Our results show that the Four-way lunar Lander-Orbiter Range Rate provides a better geometric constraint on the orbit and is more sensitive than the traditional two-way range rate, which only tracks between the Earth station and the lunar Orbiter. After combining the Four-way lunar Lander-Orbiter Range Rate data with the traditional two-way range rate data and considering the Lander position error and lunar gravity field error, the accuracy of precision orbit determination for the Orbiter in the simulation was improved significantly, with the biggest improvement being one order of magnitude, and the Lander position could be constrained to sub-meter level. This new tracking mode could provide a reference for the Chang'E-5 mission and has enormous potential for the positioning of future lunar farside Landers due to its relay characteristic.

  15. A MATLAB based Distributed Real-time Simulation of Lander-Orbiter-Earth Communication for Lunar Missions

    Science.gov (United States)

    Choudhury, Diptyajit; Angeloski, Aleksandar; Ziah, Haseeb; Buchholz, Hilmar; Landsman, Andre; Gupta, Amitava; Mitra, Tiyasa

    Lunar explorations often involve use of a lunar lander, a rover [1],[2] and an orbiter which rotates around the Moon with a fixed radius. The orbiters are usually lunar satellites orbiting along a polar orbit to ensure visibility with respect to the rover and the Earth Station, although with varying latency. Communication in such deep space missions is usually done using a specialized protocol like Proximity-1 [3]. MATLAB simulations of Proximity-1 have been attempted by some contemporary researchers [4] to simulate features like transmission control, delay, etc. In this paper it is attempted to simulate, in real time, the communication between a tracking station on Earth (earth station), a lunar orbiter and a lunar rover using concepts of Distributed Real-time Simulation (DRTS). The objective of the simulation is to simulate, in real-time, the time varying communication delays associated with the communicating elements, with a facility to integrate specific simulation modules to study different aspects, e.g. the response due to a specific control command from the earth station to be executed by the rover. The hardware platform comprises four single board computers operating as stand-alone real time systems (developed with MATLAB xPC Target and inter-networked using the UDP-IP protocol). A time triggered DRTS approach is adopted. The earth station, the orbiter and the rover are programmed as three standalone real-time processes representing the communicating elements in the system. Communication from one communicating element to another constitutes an event which passes a state message from one element to another, augmenting the state of the latter. These events are handled by an event scheduler which is the fourth real-time process. The event scheduler simulates the delay in space communication taking into consideration the distance between the communicating elements. A unique time synchronization algorithm is developed which takes into account the large latencies in space
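
    The delay model described above boils down to one-way light time between the communicating elements. The sketch below shows how such a scheduler might stamp delivery times; the distances and messages are illustrative snapshots, and this is not the authors' xPC/UDP implementation.

      # Minimal sketch of delay-stamped message scheduling for an earth station / orbiter / rover chain.
      # Distances are illustrative snapshots; a real simulator would update them as the orbiter moves.
      import heapq

      C_KM_PER_S = 299_792.458
      distance_km = {("earth", "orbiter"): 384_400.0,          # roughly the Earth-Moon distance
                     ("orbiter", "rover"): 1_737.0 + 100.0}    # ~100 km polar orbit above the lunar radius

      def one_way_delay(a, b):
          d = distance_km.get((a, b)) or distance_km[(b, a)]
          return d / C_KM_PER_S

      queue = []   # entries: (delivery_time, sender, receiver, message)
      t = 0.0
      heapq.heappush(queue, (t + one_way_delay("earth", "orbiter"), "earth", "orbiter", "CMD: drive 5 m"))
      heapq.heappush(queue, (t + one_way_delay("earth", "orbiter") + one_way_delay("orbiter", "rover"),
                             "orbiter", "rover", "CMD: drive 5 m (relayed)"))

      while queue:
          when, sender, receiver, msg = heapq.heappop(queue)
          print(f"t={when:7.3f} s  {sender} -> {receiver}: {msg}")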

  16. Prospects of the ICESat-2 Laser Altimetry Mission for Savanna Ecosystem Structural Studies Based on Airborne Simulation Data

    Science.gov (United States)

    Gwenzi, David; Lefsky, Michael A.; Suchdeo, Vijay P.; Harding, David J.

    2016-01-01

    The next planned spaceborne lidar mission is the Ice, Cloud and land Elevation Satellite 2 (ICESat-2), which will use the Advanced Topographic Laser Altimeter System (ATLAS) sensor, a photon counting technique. To pre-validate the capability of this mission for studying three dimensional vegetation structure in savannas, we assessed the potential of the measurement approach to estimate canopy height in an oak savanna landscape. We used data from the Multiple Altimeter Beam Experimental Lidar (MABEL), an airborne photon counting lidar sensor developed by NASA's Goddard Space Flight Center. ATLAS-like data was generated using the MATLAS simulator, which adjusts MABEL data's detected number of signal and noise photons to that expected from the ATLAS instrument. Transects flown over the Tejon ranch conservancy in Kern County, California, USA were used for this work. For each transect we chose to use data from the near infrared channel that had the highest number of photons. We segmented each transect into 50 m, 25 m and 14 m long blocks and aggregated the photons in each block into a histogram based on their elevation values. We then used an automated algorithm to identify cut off points where the cumulative density of photons from the highest elevation indicates the presence of the canopy top and likewise where such cumulative density from the lowest elevation indicates the mean terrain elevation. MABEL-derived height metrics were moderately correlated to discrete return lidar (DRL) derived height metrics (r² and RMSE values ranging from 0.60 to 0.73 and 2.9 m to 4.4 m, respectively) but the MATLAS simulation resulted in more modest correlations with DRL indices (r² ranging from 0.5 to 0.64 and RMSE from 3.6 m to 4.6 m). Simulations also indicated that the expected number of signal photons from ATLAS will be substantially lower, a situation that reduces canopy height estimation precision especially in areas of low density vegetation cover. On the basis of the
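
    The cut-off procedure described above (cumulative photon density from the top of the elevation histogram for the canopy top, and from the bottom for the terrain) can be sketched as follows; the percentile thresholds and the toy photon cloud are illustrative choices, not the authors' calibrated values.

      # Illustrative canopy-top / ground detection from a block of photon elevations (metres).
      # The 2% cumulative-density cut-offs are placeholders, not the calibrated values from the study.
      import numpy as np

      def block_heights(photon_elev_m, cutoff=0.02):
          elev = np.sort(np.asarray(photon_elev_m))
          n = len(elev)
          ground = elev[int(cutoff * (n - 1))]                # cumulative density from the lowest elevations
          canopy_top = elev[int((1.0 - cutoff) * (n - 1))]    # cumulative density from the highest elevations
          return canopy_top - ground, ground, canopy_top

      # Toy block: mostly ground returns near 1200 m, sparse canopy up to ~1212 m, plus noise photons.
      rng = np.random.default_rng(1)
      photons = np.concatenate([rng.normal(1200.0, 0.3, 400),        # terrain surface
                                rng.uniform(1200.0, 1212.0, 80),     # canopy returns
                                rng.uniform(1150.0, 1250.0, 10)])    # background noise photons
      height, ground, top = block_heights(photons)
      print(f"ground ~ {ground:.1f} m, canopy top ~ {top:.1f} m, canopy height ~ {height:.1f} m")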

  17. Thermal simulations of the STIX instrument for ESA Solar Orbiter mission

    Science.gov (United States)

    Białek, Agata; Severyn, Karol; Grassmann, Kamil; Orleańskii, Piotr; Skup, Konrad R.; Arnold, Nicolas; Gröbelbauer, Hans-Peter; Hurford, Gordon J.; Krucker, Samuel; Bauer, Svend-Marian; Mann, Gottfied; Önel, Hakan; Bernet, Adeline; Blecha, Luc; Grimm, Oliver; Limousin, Olivier; Martignac, Jerome; Meuris, Aline

    2013-07-01

    The ESA Solar Orbiter mission, planned to be launched in 2017, is going to study the Sun with ten different instruments including the Spectrometer/Telescope for Imaging X-rays (STIX). The thermal environment on the elliptical orbit around the Sun (0.28 AU at perihelion and 0.952 AU at aphelion) is extreme: one part of the orbit is very hot, while another is very cold. That makes the requirements for the heat fluxes exchanged between each instrument and the spacecraft, as well as between the instrument subsystems, very restrictive. Here the authors discuss the thermal design with respect to the defined requirements and present the results of the thermal analyses performed with the ESATAN TMS software.

  18. Intra-EVA Space-to-Ground Interactions when Conducting Scientific Fieldwork Under Simulated Mars Mission Constraints

    Science.gov (United States)

    Beaton, Kara H.; Chappell, Steven P.; Abercromby, Andrew F. J.; Lim, Darlene S. S.

    2018-01-01

    The Biologic Analog Science Associated with Lava Terrains (BASALT) project is a four-year program dedicated to iteratively designing, implementing, and evaluating concepts of operations (ConOps) and supporting capabilities to enable and enhance scientific exploration for future human Mars missions. The BASALT project has incorporated three field deployments during which real (non-simulated) biological and geochemical field science have been conducted at two high-fidelity Mars analog locations under simulated Mars mission conditions, including communication delays and data transmission limitations. BASALT's primary Science objective has been to extract basaltic samples for the purpose of investigating how microbial communities and habitability correlate with the physical and geochemical characteristics of chemically altered basalt environments. Field sites include the active East Rift Zone on the Big Island of Hawai'i, reminiscent of early Mars when basaltic volcanism and interaction with water were widespread, and the dormant eastern Snake River Plain in Idaho, similar to present-day Mars where basaltic volcanism is rare and most evidence for volcano-driven hydrothermal activity is relict. BASALT's primary Science Operations objective has been to investigate exploration ConOps and capabilities that facilitate scientific return during human-robotic exploration under Mars mission constraints. Each field deployment has consisted of ten extravehicular activities (EVAs) on the volcanic flows in which crews of two extravehicular and two intravehicular crewmembers conducted the field science while communicating across time delay and under bandwidth constraints with an Earth-based Mission Support Center (MSC) comprised of expert scientists and operators. Communication latencies of 5 and 15 min one-way light time and low (0.512 Mb/s uplink, 1.54 Mb/s downlink) and high (5.0 Mb/s uplink, 10.0 Mb/s downlink) bandwidth conditions were evaluated. EVA crewmembers communicated

  19. Results of the Simulation and Assimilation of Doppler Wind Lidar Observations in Preparation for European Space Agency's Aeolus Mission

    Science.gov (United States)

    McCarty, Will

    2011-01-01

    With the launch of the European Space Agency's Aeolus Mission in 2013, direct spaceborne measurements of vertical wind profiles are imminent via Doppler wind lidar technology. Part of the preparedness for such missions is the development of the proper data assimilation methodology for handling such observations. Since no heritage measurements exist in space, the Joint Observing System Simulation Experiment (Joint OSSE) framework has been utilized to generate a realistic proxy dataset as a precursor to flight. These data are being used for the development of the Gridpoint Statistical Interpolation (GSI) data assimilation system utilized at a number of centers throughout the United States, including the Global Modeling and Assimilation Office (GMAO) at NASA/Goddard Space Flight Center and the National Centers for Environmental Prediction (NOAA/NWS/NCEP), as an activity of the Joint Center for Satellite Data Assimilation. An update of this ongoing effort will be presented, including the methodology of proxy data generation, the limitations of the proxy data, the handling of line-of-sight wind measurements within the GSI, and the impact on both analyses and forecasts with the addition of the new data type.
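
    As an illustration of the line-of-sight handling mentioned above, a Doppler wind lidar observes only the projection of the horizontal wind onto the instrument's viewing azimuth, so the forward operator applied to model winds is essentially a projection. The sketch below shows that projection only; the sign convention is an assumption, and the operator actually implemented in the GSI is considerably more detailed.

      import numpy as np

      def hlos_wind(u, v, azimuth_deg):
          """Horizontal line-of-sight wind from model components.
          u: eastward wind (m/s), v: northward wind (m/s),
          azimuth_deg: viewing azimuth, degrees clockwise from north.
          Positive values here mean flow toward the instrument (assumed convention)."""
          az = np.deg2rad(azimuth_deg)
          return -(u * np.sin(az) + v * np.cos(az))

      # Example: a 10 m/s westerly viewed along an azimuth of 90 degrees (due east).
      print(hlos_wind(10.0, 0.0, 90.0))   # -10.0: the wind blows away from the instrument along this azimuth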

  20. Application of Observing System Simulation Experiments (OSSEs) to determining science and user requirements for space-based missions

    Science.gov (United States)

    Atlas, R. M.

    2016-12-01

    Observing System Simulation Experiments (OSSEs) provide an effective method for evaluating the potential impact of proposed new observing systems, as well as for evaluating trade-offs in observing system design and for developing and assessing improved methodology for assimilating new observations. As such, OSSEs can be an important tool for determining science and user requirements, and for incorporating these requirements into the planning for future missions. Detailed OSSEs have been conducted at NASA/GSFC and NOAA/AOML in collaboration with Simpson Weather Associates and operational data assimilation centers over the last three decades. These OSSEs correctly determined the quantitative potential for several proposed satellite observing systems to improve weather analysis and prediction prior to their launch, evaluated trade-offs in orbits, coverage and accuracy for space-based wind lidars, and were used in the development of the methodology that led to the first beneficial impacts of satellite surface winds on numerical weather prediction. In this talk, the speaker will summarize the development of OSSE methodology, early and current applications of OSSEs, and how OSSEs will evolve in order to enhance mission planning.

  1. Personality factors in flight operations. Volume 1: Leader characteristics and crew performance in a full-mission air transport simulation

    Science.gov (United States)

    Chidester, Thomas R.; Kanki, Barbara G.; Foushee, H. Clayton; Dickinson, Cortlandt L.; Bowles, Stephen V.

    1990-01-01

    Crew effectiveness is a joint product of the piloting skills, attitudes, and personality characteristics of team members. As obvious as this point might seem, both traditional approaches to optimizing crew performance and more recent training developments highlighting crew coordination have emphasized only the skill and attitudinal dimensions. This volume is the first in a series of papers on this simulation. A subsequent volume will focus on patterns of communication within crews. The results of a full-mission simulation research study assessing the impact of individual personality on crew performance are reported. Using a selection algorithm described in previous research, captains were classified as fitting one of three profiles along a battery of personality assessment scales. The performances of 23 crews led by captains fitting each profile were contrasted over a one-and-one-half-day simulated trip. Crews led by captains fitting a positive Instrumental-Expressive profile (high achievement motivation and interpersonal skill) were consistently effective and made fewer errors. Crews led by captains fitting a Negative Expressive profile (below-average achievement motivation and a negative expressive style, such as complaining) were consistently less effective and made more errors. Crews led by captains fitting a Negative Instrumental profile (high levels of competitiveness, verbal aggressiveness, and impatience and irritability) were less effective on the first day but equal to the best on the second day. These results underscore the importance of stable personality variables as predictors of team coordination and performance.

  2. Body composition and metabolic changes during a 520-day mission simulation to Mars.

    Science.gov (United States)

    Strollo, F; Macchi, C; Eberini, I; Masini, M A; Botta, M; Vassilieva, G; Nichiporuk, I; Monici, M; Santucci, D; Celotti, F; Magni, P; Ruscica, M

    2018-03-12

    The "Mars-500 project" allowed to evaluate the changes in psychological/physiological adaptation over a prolonged confinement, in order to gather information for future missions. Here, we evaluated the impact of confinement and isolation on body composition, glucose metabolism/insulin resistance and adipokine levels. The "Mars-500 project" consisted of 520 consecutive days of confinement from June 3, 2010 to Nov 4, 2011. The crew was composed of six male subjects (three Russians, two Europeans, and one Chinese) with a median age of 31 years (range 27-38 years). During the 520-day confinement, total body mass and BMI progressively decreased, reaching a significant difference at the end (417 days) of the observation period (- 9.2 and - 5.5%, respectively). Fat mass remained unchanged. A progressive and significant increase of fasting plasma glucose was observed between 249 and 417 days (+ 10/+ 17% vs baseline), with a further increase at the end of confinement (up to + 30%). Median plasma insulin showed a non-significant early increment (60 days; + 86%). Total adiponectin halved (- 47%) 60 days after hatch closure, remaining at this nadir (- 51%) level for a further 60 days. High molecular weight adiponectin remained significantly lower from 60 to 168 days. Based on these data, countermeasures may be envisioned to balance the potentially harmful effects of prolonged confinement, including a better exercise program, with accurate monitoring of (1) the individual activity and (2) the relationship between body composition and metabolic derangement.

  3. Mission control team structure and operational lessons learned from the 2009 and 2010 NASA desert RATS simulated lunar exploration field tests

    Science.gov (United States)

    Bell, Ernest R.; Badillo, Victor; Coan, David; Johnson, Kieth; Ney, Zane; Rosenbaum, Megan; Smart, Tifanie; Stone, Jeffry; Stueber, Ronald; Welsh, Daren; Guirgis, Peggy; Looper, Chris; McDaniel, Randall

    2013-10-01

    The NASA Desert Research and Technology Studies (Desert RATS) is an annual field test of advanced concepts, prototype hardware, and potential modes of operation to be used on human planetary surface space exploration missions. For the 2009 and 2010 NASA Desert RATS field tests, various engineering concepts and operational exercises were incorporated into mission timelines with the focus of the majority of daily operations being on simulated lunar geological field operations and executed in a manner similar to current Space Shuttle and International Space Station missions. The field test for 2009 involved a two week lunar exploration simulation utilizing a two-man rover. The 2010 Desert RATS field test took this two week simulation further by incorporating a second two-man rover working in tandem with the 2009 rover, as well as including docked operations with a Pressurized Excursion Module (PEM). Personnel for the field test included the crew, a mission management team, engineering teams, a science team, and the mission operations team. The mission operations team served as the core of the Desert RATS mission control team and included certified NASA Mission Operations Directorate (MOD) flight controllers, former flight controllers, and astronaut personnel. The backgrounds of the flight controllers were in the areas of Extravehicular Activity (EVA), onboard mechanical systems and maintenance, robotics, timeline planning (OpsPlan), and spacecraft communicator (Capcom). With the simulated EVA operations, mechanized operations (the rover), and expectations of replanning, these flight control disciplines were especially well suited for the execution of the 2009 and 2010 Desert RATS field tests. The inclusion of an operations team has provided the added benefit of giving NASA mission operations flight control personnel the opportunity to begin examining operational mission control techniques, team compositions, and mission scenarios. This also gave the mission operations

  4. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    Science.gov (United States)

    Brandt, N.; Fichter, W.; Kersten, M.; Lucarelli, S.; Montemurro, F.

    2005-05-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised.

  5. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    International Nuclear Information System (INIS)

    Brandt, N; Fichter, W; Kersten, M; Lucarelli, S; Montemurro, F

    2005-01-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised

  6. Comparação entre dois fios de sutura não absorvíveis na anastomose traqueal término-terminal em cães Comparison of two nonabsorbable suture materials in the end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

    Twelve mongrel dogs, aged between 1 and 6 years and weighing 6 to 20 kg, were submitted to tracheal resection and end-to-end anastomosis, in which braided non-capillary polyester and monofilament nylon suture materials were tested. Six animals, three for each type of suture material, underwent an excision equivalent to three tracheal rings. After 15 days a new intervention was performed, in which the equivalent of six more rings was resected, for a total of nine. At the end of another 15 days they were sacrificed. The other six animals, three for each type of suture material, underwent an excision equivalent to three tracheal rings and were maintained for 43 days. The tracheas were evaluated by clinical, radiographic, macroscopic and histopathological examinations. The monofilament nylon suture produced less tissue reaction than the braided non-capillary polyester, provided a secure anastomosis and carried a lower risk of granuloma formation.

  7. Evaluation of IEEE 802.11g and 802.16 for Lunar Surface Exploration Missions Using MACHETE Simulations

    Science.gov (United States)

    Segui, John; Jennings, Esther; Vyas, Hemali

    2009-01-01

    In this paper, we investigated the suitability of terrestrial wireless networking technologies for lunar surface exploration missions. Specifically, the scenario we considered consisted of two teams of collaborating astronauts, one base station and one rover, where the base station and the rover have the capability of acting as relays. We focused on the evaluation of the IEEE 802.11g and IEEE 802.16 protocols, simulating a homogeneous 802.11g network, a homogeneous 802.16 network, and a heterogeneous network using both 802.11g and 802.16. A mix of traffic flows was simulated, including telemetry, caution and warning, voice, command and file transfer. Each traffic type had its own distribution profile, data volume, and priority. We analyzed the loss and delay trade-offs of these wireless protocols with various link-layer options. We observed that the 802.16 network managed the channel better than the 802.11g network due to its controlled infrastructure and centralized scheduling. However, due to the centralized scheduling, 802.16 also had a longer delay. The heterogeneous (hybrid) 802.11/802.16 network achieved a better balance of performance in terms of data loss and delay compared to using 802.11 or 802.16 alone.

  8. Simulations of the response function of a plasma ion beam spectrometer for the Cassini mission to Saturn

    International Nuclear Information System (INIS)

    Vilppola, J.H.; Tanskanen, P.J.; Huomo, H.; Barraclough, B.L.

    1996-01-01

    Obtaining very high (∼1%) energy resolution with spherical-section electrostatic analyzers requires high precision both in fabrication and in the alignment process. In order to aid in the calibration of the instrument and to help minimize fabrication costs, we have applied simulation models to the ion beam spectrometer for the NASA/ESA Cassini mission to Saturn. Previously we studied the effects of misalignment and simple irregularities of the hemispherical surfaces on the performance of an electrostatic analyzer. We have considered a hemispherical electrostatic analyzer equipped with an aperture plate to collimate the stray electric field at the entrance apertures. The influence of a curved entrance aperture has also been added to the simulation model, and its effects have been studied in detail. A cylindrical three-dimensional simultaneous overrelaxation algorithm has been introduced to solve for the stray electric field. The maximum loss of transmitted particles with respect to the transmission of an ideal instrument has been set at 10%. We demonstrate that the deviation in the distributions of the energies is less than 0.2% and that the deviation in the distributions of entrance angles of transmitted particles is less than 0.1°. It has been found that the energy resolution of an electrostatic analyzer can be improved from ΔE/E=(1.6±0.2)% to ΔE/E=(1.3±0.2)% by the introduction of front aperture plates. Through the introduction of curved entrance slits, the azimuthal angle resolution has changed from β=(1.4±0.1)° for the simplified-geometry simulation results of our previous article to β=(2.3±0.1)°. We have confirmed that an accuracy of 25 μm in the alignment of the two hemispherical surfaces is sufficient to give the instrument the desired resolutions.
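
    The abstract mentions a cylindrical three-dimensional simultaneous overrelaxation (SOR) solver for the stray electric field. The sketch below shows the SOR idea on a simple two-dimensional Cartesian Laplace problem only, as an illustration of the relaxation step; the grid, boundary handling and relaxation factor are assumptions, not the instrument simulation itself.

      import numpy as np

      def sor_laplace(phi, fixed_mask, omega=1.8, tol=1e-6, max_iter=20000):
          """Relax interior nodes of `phi` toward a solution of Laplace's equation.
          `fixed_mask` marks electrode/boundary nodes whose potential is held fixed."""
          for _ in range(max_iter):
              max_delta = 0.0
              for i in range(1, phi.shape[0] - 1):
                  for j in range(1, phi.shape[1] - 1):
                      if fixed_mask[i, j]:
                          continue
                      neighbour_avg = 0.25 * (phi[i + 1, j] + phi[i - 1, j] +
                                              phi[i, j + 1] + phi[i, j - 1])
                      delta = omega * (neighbour_avg - phi[i, j])
                      phi[i, j] += delta
                      max_delta = max(max_delta, abs(delta))
              if max_delta < tol:
                  break
          return phi

      # Example: a 40x40 grid with the top plate held at 1 V and the bottom at 0 V.
      grid = np.zeros((40, 40))
      fixed = np.zeros_like(grid, dtype=bool)
      grid[0, :], fixed[0, :] = 1.0, True
      grid[-1, :], fixed[-1, :] = 0.0, True
      solution = sor_laplace(grid, fixed)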

  9. Investigation of bio-regenerative life support and Trash-to-gas experiment on a 4 month mars simulation mission

    OpenAIRE

    Caraccio, A.; Poulet, Lucie; Hintze, P.; Miles, J.D.

    2014-01-01

    Future crewed missions to other planets or deep space locations will require regenerative Life Support Systems (LSS) as well as recycling processes for mission waste. Constant resupply of many commodity materials will not be a sustainable option for deep space missions, nor will stowing trash on board a vehicle or at a lunar or Martian outpost. The habitable volume will decline as the volume of waste increases. A complete regenerative environmentally controlled life support system (ECLSS) on ...

  10. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    Science.gov (United States)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned lunar missions will involve multiple NASA centers where each participating center has a specific role and specialization. In this vision, the Constellation Program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems with various parameter sets can be simulated. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In

  11. A Lean, Fast Mars Round-trip Mission Architecture: Using Current Technologies for a Human Mission in the 2030s

    Science.gov (United States)

    Bailey, Lora; Folta, David; Barbee, Brent W.; Vaughn, Frank; Kirchman, Frank; Englander, Jacob; Campbell, Bruce; Thronson, Harley; Lin, Tzu Yu

    2013-01-01

    We present a lean fast-transfer architecture concept for a first human mission to Mars that utilizes current technologies and two pivotal parameters: an end-to-end Mars mission duration of approximately one year, and a deep space habitat of approximately 50 metric tons. These parameters were formulated by a 2012 deep space habitat study conducted at the NASA Johnson Space Center (JSC) that focused on a subset of recognized high-engineering-risk factors that may otherwise limit space travel to destinations such as Mars or near-Earth asteroids (NEAs). With these constraints, we model and promote Mars mission opportunities in the 2030s enabled by a combination of on-orbit staging, mission element pre-positioning, and unique round-trip trajectories identified by state-of-the-art astrodynamics algorithms.

  12. The subsurface geology of Río Tinto: material examined during a simulated Mars drilling mission for the Mars Astrobiology Research and Technology Experiment (MARTE).

    Science.gov (United States)

    Prieto-Ballesteros, Olga; Martínez-Frías, Jesús; Schutt, John; Sutter, Brad; Heldmann, Jennifer L; Bell, Mary Sue; Battler, Melissa; Cannon, Howard; Gómez-Elvira, Javier; Stoker, Carol R

    2008-10-01

    The 2005 Mars Astrobiology Research and Technology Experiment (MARTE) project conducted a simulated 1-month Mars drilling mission in the Río Tinto district, Spain. Dry robotic drilling, core sampling, and biological and geological analytical technologies were collectively tested for the first time for potential use on Mars. Drilling and subsurface sampling and analytical technologies are being explored for Mars because the subsurface is the most likely place to find life on Mars. The objectives of this work are to describe drilling, sampling, and analytical procedures; present the geological analysis of core and borehole material; and examine lessons learned from the drilling simulation. Drilling occurred at an undisclosed location, causing the science team to rely only on mission data for geological and biological interpretations. Core and borehole imaging was used for micromorphological analysis of rock, targeting rock for biological analysis, and making decisions regarding the next day's drilling operations. Drilling reached 606 cm depth into poorly consolidated gossan that allowed only 35% of core recovery and contributed to borehole wall failure during drilling. Core material containing any indication of biology was sampled and analyzed in more detail for its confirmation. Despite the poorly consolidated nature of the subsurface gossan, dry drilling was able to retrieve useful core material for geological and biological analysis. Lessons learned from this drilling simulation can guide the development of dry drilling and subsurface geological and biological analytical technologies for future Mars drilling missions.

  13. MERLIN: a Franco-German LIDAR space mission for atmospheric methane

    Science.gov (United States)

    Bousquet, P.; Ehret, G.; Pierangelo, C.; Marshall, J.; Bacour, C.; Chevallier, F.; Gibert, F.; Armante, R.; Crevoisier, C. D.; Edouart, D.; Esteve, F.; Julien, E.; Kiemle, C.; Alpers, M.; Millet, B.

    2017-12-01

    The Methane Remote Sensing Lidar Mission (MERLIN), currently in phase C, is a joint cooperation between France and Germany on the development, launch and operation of a space LIDAR dedicated to the retrieval of total weighted atmospheric methane (CH4) columns. Atmospheric methane is the second most potent anthropogenic greenhouse gas, contributing 20% to climate radiative forcing, but it also plays an important role in atmospheric chemistry as a precursor of tropospheric ozone and lower-stratospheric water vapour. Its short lifetime (about 9 years) and the nature and variety of its anthropogenic sources also offer interesting mitigation options with regard to the 2 °C objective of the Paris agreement. For the first time, measurements of atmospheric composition will be performed from space with an IPDA (Integrated Path Differential Absorption) LIDAR (Light Detection And Ranging), with a precision target of ±27 ppb for a 50 km aggregation along the track and a corresponding accuracy target. We recall the MERLIN objectives and mission characteristics. We also propose an end-to-end error analysis, from the causes of random and systematic errors of the instrument, of the platform and of the data treatment, to the error on methane emissions. To do so, we propose an OSSE (observing system simulation experiment) analysis to estimate the uncertainty reduction on methane emissions brought by MERLIN XCH4. The originality of our inversion system is to transfer both random and systematic errors from the observation space to the flux space, thus providing more realistic error reductions than usually reported in OSSEs that only use the random part of the errors. Uncertainty reductions are presented using two different atmospheric transport models, TM3 and LMDZ, and compared with the error reduction achieved with the GOSAT passive mission.

  14. Simulating Large Area, High Intensity AM0 Illumination on Earth- Representative Testing at Elevated Temperatures for the BepiColombo and SolO Missions

    Science.gov (United States)

    Oberhuttinger, C.; Quabis, D.; Zimmermann, C. G.

    2014-08-01

    During both the BepiColombo and the Solar Orbiter (SolO) missions, severe environmental conditions with sun intensities up to 10.6 solar constants (SCs) and 12.8 SCs, respectively, will be encountered. Therefore, a special cell design was developed which can withstand these environmental loads. To verify the solar cells under representative conditions, a set of specific tests is conducted. The key qualification test for these high intensity, high temperature (HIHT) missions is a combined test, which exposes a large number of cells simultaneously to the complete AM0 spectrum at the required irradiance and temperature. Such a test was set up in the VTC1.5 chamber located at ESTEC. This paper provides an overview of the challenges in designing a setup capable of achieving this HIHT simulation. The solutions that were developed will be presented, and the performance of the setup will be illustrated by actual test results.

  15. Broadband permittivity measurements on porous planetary regoliths simulants, in relation with the Rosetta mission to 67P/C-G

    Science.gov (United States)

    Brouet, Yann; Levasseur-Regourd, Anny-Chantal; Encrenaz, Pierre; Sabouroux, Pierre; Heggy, Essam; Kofman, Wlodek; Thomas, Nick

    2015-04-01

    The Rosetta mission successfully rendezvoused with comet 67P/Churyumov-Gerasimenko (hereafter 67P) last year and landed the Philae module on its nucleus on 12 November 2014. Among the instruments onboard Rosetta, MIRO [1], composed of two radiometers with receivers at 190 GHz and 563 GHz (centre-band), is dedicated to measurements of the subsurface and surface brightness temperatures. These values depend on the complex relative permittivity (hereafter permittivity), with ɛ' and ɛ'' its real and imaginary parts. The permittivity of the material depends on frequency, bulk density/porosity, composition and temperature [2]. Considering the very low bulk density of the 67P nucleus (about 450 kg.m-3 [3]) and the suspected presence of a dust mantle in many areas of the nucleus [4], investigations of the permittivity of porous granular samples are needed to support the interpretation of MIRO data, as well as of other microwave experiments onboard Rosetta, e.g. CONSERT [5], a bistatic penetrating radar working at 90 MHz. We have developed a programme of permittivity measurements on porous granular samples over a frequency range from 50 MHz to 190 GHz under laboratory conditions (e.g. [6] and [7]). We present new results obtained on the JSC-1A lunar soil simulant and on ashes from Etna. The samples were split into several sub-samples with different size ranges covering a few to 500 μm. Bulk densities of the sub-samples were carefully measured and found to be in the 800-1400 kg.m-3 range. Sub-samples were also dried, and the volumetric moisture content was found to be below 0.6%. From 50 MHz to 6 GHz and at 190 GHz, the permittivity has been determined, respectively, with a coaxial cell and with a quasi-optical bench mounted in transmission, both connected to a vector network analyzer. The results demonstrate the dispersive behaviour of ɛ' between 50 MHz and 190 GHz. Values of ɛ' remain within the 3.9-2.6 range for all sub-samples. At the CONSERT frequency, ɛ'' is within the 0.01-0.09 range

  16. Study of individual and group affective processes in the crew of a simulated mission to Mars: Positive affectivity as a valuable indicator of changes in the crew affectivity

    Science.gov (United States)

    Poláčková Šolcová, Iva; Lačev, Alek; Šolcová, Iva

    2014-07-01

    The success of a long-duration space mission depends on various technical demands as well as on the psychological (cognitive, affective, and motivational) adaptation of crewmembers and the quality of interactions within the crew. We examined the ways crewmembers of a 520-day simulated spaceflight to Mars (held in the Institute for Biomedical Problems, in Moscow) experienced and regulated their moods and emotions. Results show that crewmembers experienced predominantly positive emotions throughout their 520-day isolation and the changes in mood of the crewmembers were asynchronous and balanced. The study suggests that during the simulation, crewmembers experienced and regulated their emotions differently than they usually do in their everyday life. In isolation, crewmembers preferred to suppress and neutralize their negative emotions and express overtly only emotions with positive valence. Although the affective processes were almost invariable throughout the simulation, two periods of time when the level of positive emotions declined were identified. Regarding the findings, the paper suggests that changes in positive affectivity could be a more valuable indicator of human experience in demanding but professional environments than changes in negative affectivity. Finally, the paper discusses the phenomenology of emotions during a real space mission.

  17. Interplanetary Trajectory Design for the Asteroid Robotic Redirect Mission Alternate Approach Trade Study

    Science.gov (United States)

    Merrill, Raymond Gabriel; Qu, Min; Vavrina, Matthew A.; Englander, Jacob A.; Jones, Christopher A.

    2014-01-01

    This paper presents mission performance analysis methods and results for the Asteroid Robotic Redirect Mission (ARRM) option to capture a free standing boulder on the surface of a 100 m or larger NEA. It details the optimization and design of heliocentric low-thrust trajectories to asteroid targets for the ARRM solar electric propulsion spacecraft. Extensive searches were conducted to determine asteroid targets with large pick-up mass potential and potential observation opportunities. Interplanetary trajectory approximations were developed in method based tools for Itokawa, Bennu, 1999 JU3, and 2008 EV5 and were validated by end-to-end integrated trajectories.

  18. A Review of New and Developing Technology to Significantly Improve Mars Sample-Return Missions

    Science.gov (United States)

    Carsey, F.; Brophy, J.; Gilmore, M.; Rodgers, D.; Wilcox, B.

    2000-07-01

    A JPL development activity was initiated in FY 1999 for the purpose of examining and evaluating technologies that could materially improve future (i.e., beyond the 2005 launch) Mars sample return missions. The scope of the technology review was comprehensive and end-to-end; the goal was to improve mass, cost, risk, and scientific return. A specific objective was to assess approaches to sample return with only one Earth launch. While the objective of the study was specifically for sample-return, in-situ missions can also benefit from using many of the technologies examined.

  19. Space Network IP Services (SNIS): An Architecture for Supporting Low Earth Orbiting IP Satellite Missions

    Science.gov (United States)

    Israel, David J.

    2005-01-01

    The NASA Space Network (SN) supports a variety of missions using the Tracking and Data Relay Satellite System (TDRSS), which includes ground stations in White Sands, New Mexico and Guam. A Space Network IP Services (SNIS) architecture is being developed to support future users with requirements for end-to-end Internet Protocol (IP) communications. This architecture will support all IP protocols, including Mobile IP, over TDRSS Single Access, Multiple Access, and Demand Access Radio Frequency (RF) links. This paper will describe this architecture and how it can enable Low Earth Orbiting IP satellite missions.

  20. Improving Integrated Operation in the Joint Integrated Mission Model (JIMM) and the Simulated Warfare Environment Data Transfer (SWEDAT) Protocol

    National Research Council Canada - National Science Library

    Mutschler, David W

    2005-01-01

    ...). It allows integrated operation of resources whereby the JIMM threat environment, stimulators, virtual cockpits, systems under test, and other agents are combined within the same simulation exercise...

  1. Gas mission; Mission gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This preliminary report analyses the desirable evolutions of gas transport tariffs and examines some questions relative to the opening of the French gas market to competition. The report is made of two documents: a synthesis of the previous report with some recommendations about gas transport tariffs, the modalities of third-party access to the network, and the accounting separation between transport and trading activities; and the progress report on the opening of the French gas market. The first part presents the European issue of competition in gas supply and its consequences on the opening and operation of the French gas market. The second part presents partial syntheses on each topic of the mission letter of the Ministry of Economics, Finances and Industry: the future evolution of network access tariffs, a critical analysis of contractual documents for gas transport and delivery, an examination of auxiliary services linked with network access (modulation, balancing, conversion), considerations about the handling of network congestion and denied access, and an analysis of the metering separation between the integrated activities of gas operators. Some documents are attached as appendixes: the mission letter of July 9, 2001, the detailed analysis of the new temporary tariffs of GdF and CFM, the offer of third-party access to methane terminals, the compatibility of a nodal tariffing scheme with the presence of three transport operators (GdF, CFM and GSO), the contract-type for GdF supply, and the contract-type for GdF connection. (J.S.)

  2. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end-to-end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and, after refining the process models via the human-in-the-loop simulation, the

  3. Advanced Distributed Simulation Technology II (ADST-II) Distributed Interactive Fire Mission II Concept Evaluation Program Final Report

    National Research Council Canada - National Science Library

    1999-01-01

    ...), Fort Knox, KY sponsored the experiment. The experiment utilized a synthetic environment that employed virtual simulations to depict an Armor Platoon executing six basic platoon-level scenarios in realistic combat situations in various...

  4. [The mission].

    Science.gov (United States)

    Ruiz Moreno, J; Blanch Mon, A

    2000-01-01

    After making a historical review of the concept of the mission statement, evaluating its importance (see Part I), describing the bases for creating a mission statement from a strategic perspective and analyzing the advantages of this concept, probably more important as a business policy (see Parts I and II), the authors proceed to analyze the mission statement in health organizations. Because a mission statement is lacking in the majority of health organizations, the strategies of health organizations are not exactly favored; as a consequence, neither are their competitive advantages nor the development of their essential competencies. After presenting a series of mission statements corresponding to Anglo-Saxon health organizations, the authors highlight two mission statements corresponding to our social context. The article finishes by suggesting an adequate sequence for developing a mission statement in those health organizations having a strategic sense.

  5. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's "Project Spectra!"

    Science.gov (United States)

    Christofferson, R.; Wood, E. L.; Euler, G.

    2012-12-01

    "Project Spectra!" is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new "Project Spectra!" interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives are currently being pilot tested at Arvada High School in Colorado.

  6. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's 'Project Spectra!'

    Science.gov (United States)

    Wood, E. L.

    2013-12-01

    'Project Spectra!' is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new 'Project Spectra!' interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  7. Toward a first-principles integrated simulation of tokamak edge plasmas

    International Nuclear Information System (INIS)

    Chang, C S; Klasky, Scott A; Cummings, Julian; Samtaney, Ravi; Shoshani, A.; Sugiyama, L.; Keyes, David E; Ku, Seung-Hoe; Park, G.; Parker, Scott; Podhorszki, Norbert; Strauss, H.; Abbasi, H.; Adams, Mark; Barreto, Roselyne D; Bateman, Glenn; Bennett, K.; Chen, Yang; D'Azevedo, Eduardo; Docan, Ciprian; Ethier, Stephane; Feibush, E.; Greengard, Leslie; Hahm, Taik Soo; Hinton, Fred; Jin, Chen; Khan, A.; Kritz, Arnold; Krstic, Predrag S; Lao, T.; Lee, Wei-Li; Lin, Zhihong; Lofstead, J.; Mouallem, P. A.; Nagappan, M.; Pankin, A.; Parashar, Manish; Pindzola, Michael S.; Reinhold, Carlos O; Schultz, David Robert; Schwan, Karsten; Silver, D.; Sim, A.; Stotler, D.

    2008-01-01

    The performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes, XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD for the study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  8. [Myanmar mission].

    Science.gov (United States)

    Alfandari, B; Persichetti, P; Pelissier, P; Martin, D; Baudet, J

    2004-06-01

    The authors report humanitarian missions in plastic surgery performed in Yangon by a small private-practice team, describing their 3-year experience in Myanmar with 300 consultations and 120 surgical cases. They underline the value of this type of mission and offer their reflections on team training, the relationship with the country where the mission is conducted, and the composition of the right team.

  9. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are more strict when compared to the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.
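
    As a rough illustration of the harmonic-mean comparison mentioned in the last sentence, the sketch below contrasts the direct source-destination SNR with the harmonic mean of the source-relay and relay-destination SNRs. The decision rule is a deliberately simplified heuristic, not the exact latency conditions derived in the paper.

      def harmonic_mean(a, b):
          return 2.0 * a * b / (a + b)

      def relay_looks_helpful(snr_sd, snr_sr, snr_rd):
          """Crude check: does the two-hop route look stronger than the direct link?
          All SNR values are linear (not dB)."""
          return harmonic_mean(snr_sr, snr_rd) > snr_sd

      print(relay_looks_helpful(snr_sd=2.0, snr_sr=40.0, snr_rd=30.0))   # True: weak direct link
      print(relay_looks_helpful(snr_sd=40.0, snr_sr=40.0, snr_rd=30.0))  # False: strong direct link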

  10. Adaptive end-to-end optimization of mobile video streaming using QoS negotiation

    NARCIS (Netherlands)

    Taal, Jacco R.; Langendoen, Koen; van der Schaaf, Arjen; van Dijk, H.W.; Lagendijk, R. (Inald) L.

    Video streaming over wireless links is a non-trivial problem due to the large and frequent changes in the quality of the underlying radio channel combined with latency constraints. We believe that every layer in a mobile system must be prepared to adapt its behavior to its environment. Thus layers

  11. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    The seL4 security verification [18] avoids this issue in the same way. In that work, the authors frame their solution as a restriction that disallows...identical: (σ, σ′_1) ∈ T_M ∧ (σ, σ′_2) ∈ T_M ⟹ O_l(σ′_1) = O_l(σ′_2). The successful security verifications of both seL4 and mCertiKOS provide reasonable...evidence that this restriction on specifications is not a major hindrance for usability. Unlike the seL4 verification, however, our framework runs into a

  12. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    Science.gov (United States)

    2012-08-01

    and verification, from PSOS [NF03] to the recent seL4 [KEH+09]. While they make considerable progress toward high-assurance OS, these works are not...of the specification itself. Examples include the seL4 microkernel work by Klein et al. [KEH+09], which presents the experience of formally proving...

  13. Future Wireless Network: MyNET Platform and End-to-End Network Slicing

    OpenAIRE

    Zhang, Hang

    2016-01-01

    Future wireless networks are facing new challenges. These new challenges require new solutions and strategies for network deployment, management, and operation. Many driving factors are decisive in the re-definition and re-design of the future wireless network architecture. In the previously published paper "5G Wireless Network - MyNET and SONAC", MyNET and SONAC, a future network architecture, are described. This paper elaborates on the MyNET platform in more detail. The design principles of ...

  14. The Knowledge Graph for End-to-End Learning on Heterogeneous Knowledge

    NARCIS (Netherlands)

    Wilcke, W.X.; Bloem, P.; de Boer, Viktor

    2018-01-01

    In modern machine learning, raw data is the preferred input for our models. Where a decade ago data scientists were still engineering features, manually picking out the details we thought salient, they now prefer the data in their raw form. As long as we can assume that all relevant and irrelevant

  15. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2018-01-01

    Industry 4.0 refers to the fourth industrial revolution, and introduces modern communication and computation technologies such as 5G, cloud computing and Internet of Things to industrial manufacturing systems. As a result, many devices, machines and applications will rely on connectivity, while...... having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing...

  16. End-to-End Deep Learning Model For Automatic Sleep Staging Using Raw PSG Waveforms

    DEFF Research Database (Denmark)

    Olesen, Alexander Neergaard; Peppard, P. E.; Sorensen, H. B.

    2018-01-01

    Deep learning has seen significant progress over the last few years, especially in computer vision, where competitions such as the ImageNet challenge have been the driving factor behind many new model architectures far superior to humans in image recognition. We propose a novel method for automatic...... accuracy, precision and recall were 84.93%, 97.42% and 97.02%, respectively. Evaluating on the validation set yielded an overall accuracy of 85.07% and overall precision/recall of 98.54% and 95.72%, respectively. Conclusion: Preliminary results indicate that state of the art deep learning models can...... sleep staging, which relies on current advances in computer vision models eliminating the need for feature engineering or other transformations of input data. By exploiting the high capacity for complex learning in a state of the art object recognition model, we can effectively use raw PSG signals...

  17. End-to-End Mechanisms for Rate-Adaptive Multicast Streaming over the Internet

    OpenAIRE

    Rimac, Ivica

    2005-01-01

    Continuous media applications over packet-switched networks are becoming more and more popular. Radio stations, for example, already use streaming technology to disseminate their content to users on the Internet, and video streaming services are expected to experience similar popularity. In contrast to traditional television and radio broadcast systems, however, prevalent Internet streaming solutions are based on unicast communication and raise scalability and efficiency issues. Multicast com...

  18. An end-to-end security auditing approach for service oriented architectures

    NARCIS (Netherlands)

    Azarmi, M.; Bhargava, B.; Angin, P.; Ranchal, R.; Ahmed, N.; Sinclair, A.; Linderman, M.; Ben Othmane, L.

    2012-01-01

    Service-Oriented Architecture (SOA) is becoming a major paradigm for distributed application development in the recent explosion of Internet services and cloud computing. However, SOA introduces new security challenges not present in the single-hop client-server architectures due to the involvement

  19. Enhancing end-to-end QoS for multimedia streaming in IMS-based networks

    NARCIS (Netherlands)

    Ozcelebi, T.; Radovanovic, I.; Chaudron, M.R.V.

    2007-01-01

    Convergence of the emerging IP Multimedia Subsystem (IMS) includes unlicensed, nondedicated and nondeterministic, hence uncontrollable, computer access networks for IP multimedia services. It enables the provision of resource-demanding real-time services and multimedia communication, raising new

  20. An end-to-end computing model for the Square Kilometre Array

    NARCIS (Netherlands)

    Jongerius, R.; Wijnholds, S.; Nijboer, R.; Corporaal, H.

    2014-01-01

    For next-generation radio telescopes such as the Square Kilometre Array, seemingly minor changes in scientific constraints can easily push computing requirements into the exascale domain. The authors propose a model for engineers and astronomers to understand these relations and make tradeoffs in

  1. AAL Security and Privacy: transferring XACML policies for end-to-end access and usage control

    NARCIS (Netherlands)

    Vlamings, H.G.M.; Koster, R.P.

    2010-01-01

    Ambient Assisted Living (AAL) systems and services aim to provide a solution for growing healthcare expenses and the degradation of quality of life of the elderly using information and communication technology. In particular, AAL solutions are being created that are heavily based on web services and sensor

  2. Topological Constraints on Identifying Additive Link Metrics via End-to-end Paths Measurements

    Science.gov (United States)

    2012-09-20

    identifiable if and only if R in (1) has full column rank, i.e., rank(R) = n. In other words, to uniquely determine w, there must be n linearly...be identified from paths traversing l_1; a similar argument applies to l_2. Moreover, a similar analysis as in the proof of this lemma shows that none of
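
    The full-column-rank condition quoted above can be checked directly once the routing matrix R (rows = measured end-to-end paths, columns = links) is known; the small sketch below does exactly that, with purely illustrative matrices.

      import numpy as np

      def identifiable(R):
          """True if the additive link metrics w can be uniquely determined from
          the path measurements y = R w, i.e. if R has full column rank."""
          R = np.asarray(R, dtype=float)
          return np.linalg.matrix_rank(R) == R.shape[1]

      # Three links measured by three end-to-end paths: full column rank.
      R_good = [[1, 1, 0],
                [0, 1, 1],
                [1, 0, 1]]
      # Links 1 and 2 are always traversed together: their columns are identical.
      R_bad = [[1, 1, 0],
               [1, 1, 1]]

      print(identifiable(R_good))  # True
      print(identifiable(R_bad))   # False (rank 2 < 3 unknowns)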

  3. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  4. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

    Performance Concrete and insulation materials – is used. By means of action research and interviews of case project staff it has become evident that many elements of formalized requirements management are missing in the case project. To fill those gaps and be able to manage requirements end...... with regards to requirements management. As the literature study gives little new information, a series of interviews are initiated with experts from industry and universities. Those interviews reveal major shortcomings in the way requirements are handled in Danish construction companies today. In order...... to give managers of construction projects a useful and guiding tool for formally managing requirements that is rooted in practice, the “Conceptual requirements management framework”, is created. The framework builds upon the gathered empirical data, obtained by action research, interviews, and available...

  5. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Full Text Available Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny and resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connection, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge and new architectures are required due to the heterogeneity of these devices. Taking into account that people are using smartphones with Internet connection, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. Thus, this paper proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter exceeds a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
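
    As a rough illustration of the RESTful ingest and threshold-based alerting described above (the route name, database schema, threshold and push stub below are assumptions, not the paper's implementation), a minimal Flask sketch might look like this:

      # Minimal sketch of a REST endpoint that stores WSN readings and flags
      # threshold crossings. Flask and SQLite are used for brevity; the route
      # name, schema and threshold are hypothetical.
      import sqlite3
      from flask import Flask, request, jsonify

      app = Flask(__name__)
      db = sqlite3.connect("wsn.db", check_same_thread=False)
      db.execute("CREATE TABLE IF NOT EXISTS readings (node TEXT, metric TEXT, value REAL)")

      TEMP_ALERT_C = 35.0  # assumed alert threshold

      def send_push_notification(node, value):
          # Placeholder: a real deployment would call a push service (e.g. FCM).
          print(f"ALERT: node {node} reported temperature {value} C")

      @app.route("/readings", methods=["POST"])
      def add_reading():
          r = request.get_json()
          db.execute("INSERT INTO readings VALUES (?, ?, ?)",
                     (r["node"], r["metric"], float(r["value"])))
          db.commit()
          if r["metric"] == "temperature" and float(r["value"]) > TEMP_ALERT_C:
              send_push_notification(r["node"], r["value"])
          return jsonify(status="stored"), 201

      if __name__ == "__main__":
          app.run(port=8080)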

  6. Designing a holistic end-to-end intelligent network analysis and security platform

    Science.gov (United States)

    Alzahrani, M.

    2018-03-01

    A firewall protects a network from outside attacks; however, once an attack has entered the network, it is difficult to detect. Significant incidents have happened recently, e.g. millions of Yahoo email accounts were stolen and crucial data from institutions were held for ransom. For two years, Yahoo’s system administrators were not aware that there were intruders inside the network. This happened due to the lack of intelligent tools to monitor user behaviour in the internal network. This paper discusses the design of an intelligent anomaly/malware detection system with proper proactive actions. The aim is to equip the system administrator with a proper tool to battle insider attackers. The proposed system adopts machine learning to analyse users’ behaviour through the runtime behaviour of each node in the network. The machine learning techniques include deep learning, an evolving machine learning perceptron, a hybrid of neural networks and fuzzy logic, as well as predictive memory techniques. The proposed system is expanded to deal with larger networks using agent techniques.

  7. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01

    in 2014, up from 455 calls in 2013 (Chamber of Shipping, 2014). Even the more traditional forms of marine tourism such as sports fishing have been...some of the most noteworthy areas of new economic activity to emerge have been aquaculture, recreation and tourism, research and oil, gas and other...annual value of output over $590 million (Fisheries and Oceans Canada, 2013). Tourism

  8. Research on the Establishment and Evaluation of End-to-End Service Quality Index System

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    From the perspective of power data networks, this paper puts forward an index system model to measure the quality of service, covering user experience, business performance, network capacity support, etc., and describes the establishment and use of the indices at each layer of the model.

  9. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

    [Slide residue listing current metrics: Sched Deliver, Delinquent Contracts, PQDR/SDRs, Forecasting Accuracy, Reliability, Demand Management, Asset Mgmt Strategies, Pipeline...] ...are identified and characterized by statistical analysis. The study proposed a framework and tool for inventory management based on factors such as

  10. End-to-end unsupervised deformable image registration with a convolutional neural network

    NARCIS (Netherlands)

    de Vos, Bob D.; Berendsen, Floris; Viergever, Max A.; Staring, Marius; Išgum, Ivana

    2017-01-01

    In this work we propose a deep learning network for deformable image registration (DIRNet). The DIRNet consists of a convolutional neural network (ConvNet) regressor, a spatial transformer, and a resampler. The ConvNet analyzes a pair of fixed and moving images and outputs parameters for the spatial

  11. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an `end-to-end’ methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.
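
    As a toy illustration of the weather-index idea (the index, data and linear fit below are synthetic stand-ins, not the authors' machine-learning model), one can relate a seasonal index to yield anomalies and read off a simple exceedance risk:

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic 30-year record: a seasonal heat-stress index (e.g. degree-days
      # above a threshold) and the resulting relative crop-yield anomaly.
      heat_index = rng.uniform(0, 100, size=30)
      yield_anom = -0.004 * heat_index + rng.normal(0, 0.05, size=30)

      # Fit a simple linear index-to-loss relation (a stand-in for the paper's
      # machine-learning step).
      slope, intercept = np.polyfit(heat_index, yield_anom, deg=1)

      # Risk metric: probability that the modelled loss exceeds 20% in a warmer
      # scenario where the index is shifted upward by 30 units.
      index_future = heat_index + 30
      loss_future = -(slope * index_future + intercept)
      print("P(loss > 20%):", np.mean(loss_future > 0.20))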

  12. End-to-End Key Exchange through Disjoint Paths in P2P Networks

    Directory of Open Access Journals (Sweden)

    Daouda Ahmat

    2015-01-01

    Full Text Available Due to their inherent features, P2P networks have proven to be effective in the exchange of data between autonomous peers. Unfortunately, these networks are subject to various security threats that cannot be addressed readily since traditional security infrastructures, which are centralized, cannot be applied to them. Furthermore, communication reliability across the Internet is threatened by various attacks, including usurpation of identity, eavesdropping or traffic modification. Thus, in order to overcome these security issues and allow peers to securely exchange data, we propose a new key management scheme over P2P networks. Our approach introduces a new method that enables a secret key exchange through disjoint paths in the absence of a trusted central coordination point which would be required in traditional centralized security systems.
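
    To make the disjoint-path idea concrete, the generic XOR secret-splitting sketch below (an illustration only, not the scheme proposed in the paper) splits a key into shares so that an eavesdropper on any single path learns nothing:

      import secrets
      from functools import reduce

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      def split_key(key: bytes, n_paths: int):
          # Split `key` into n_paths XOR shares; all shares are needed to rebuild
          # it, so a compromised subset of paths reveals nothing about the key.
          shares = [secrets.token_bytes(len(key)) for _ in range(n_paths - 1)]
          shares.append(reduce(xor_bytes, shares, key))
          return shares

      def recombine(shares):
          return reduce(xor_bytes, shares)

      session_key = secrets.token_bytes(16)        # 128-bit key to exchange
      shares = split_key(session_key, n_paths=3)   # one share per disjoint path
      assert recombine(shares) == session_key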

  13. Effect of 3 Key Factors on Average End to End Delay and Jitter in MANET

    Directory of Open Access Journals (Sweden)

    Saqib Hakak

    2015-01-01

    Full Text Available A mobile ad-hoc network (MANET is a self-configuring infrastructure-less network of mobile devices connected by wireless links where each node or mobile device is independent to move in any desired direction and thus the links keep moving from one node to another. In such a network, the mobile nodes are equipped with CSMA/CA (carrier sense multiple access with collision avoidance transceivers and communicate with each other via radio. In MANETs, routing is considered one of the most difficult and challenging tasks. Because of this, most studies on MANETs have focused on comparing protocols under varying network conditions. But to the best of our knowledge no one has studied the effect of other factors on network performance indicators like throughput, jitter and so on, revealing how much influence a particular factor or group of factors has on each network performance indicator. Thus, in this study the effects of three key factors, i.e. routing protocol, packet size and DSSS rate, were evaluated on key network performance metrics, i.e. average delay and average jitter, as these parameters are crucial for network performance and directly affect the buffering requirements for all video devices and downstream networks.
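
    For reference, average end-to-end delay and jitter can be computed from per-packet send/receive timestamps as sketched below; the timestamps are synthetic and the RFC 3550-style running jitter estimator may differ from the metric used in the study:

      # Synthetic (send_time, recv_time) pairs in seconds for a few packets.
      packets = [(0.00, 0.045), (0.02, 0.071), (0.04, 0.083), (0.06, 0.129)]

      delays = [recv - send for send, recv in packets]
      avg_delay = sum(delays) / len(delays)

      # RFC 3550-style interarrival jitter: smoothed absolute delay variation.
      jitter = 0.0
      for prev, cur in zip(delays, delays[1:]):
          jitter += (abs(cur - prev) - jitter) / 16.0

      print(f"average end-to-end delay: {avg_delay * 1000:.1f} ms")
      print(f"estimated jitter:         {jitter * 1000:.2f} ms")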

  14. Intelligent End-To-End Resource Virtualization Using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, Georgios; Kontos, T.; Niemegeers, I.G.M.M.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.M.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered as a philosophy or paradigm in organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower level functionalities, enabling portability of

  15. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.
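
    The Dice coefficient used above to compare segmentations with and without post-processing can be computed directly from binary masks; the sketch below uses perturbed random masks as stand-ins for the MRI-derived tissue labels:

      import numpy as np

      def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
          # Dice = 2|A intersect B| / (|A| + |B|) for boolean masks.
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Stand-in masks (e.g. one tissue class from auto-segmentation with and
      # without post-processing); real inputs would come from the MRI pipeline.
      rng = np.random.default_rng(1)
      auto = rng.random((64, 64, 64)) > 0.5
      post = auto.copy()
      post[:4] = ~post[:4]   # perturb a slab to mimic manual-style corrections

      print("Dice:", round(dice_coefficient(auto, post), 3))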

  16. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01

    this problem is described in section 3.1.5. 3.1.3. SOFTWARE ARCHITECTURE Starting from the Chromebook hardware platform, this project removed the...personalities (KVM Virtual Machines) of Android, while including our overall integrity architecture with integrity measurement, appraisal, and...attestation, both for the native Linux and for the Android guests. The overall architecture developed in this project is shown in Figure 1.

  17. CLOUD SECURITY AND COMPLIANCE - A SEMANTIC APPROACH IN END TO END SECURITY

    OpenAIRE

    Kalaiprasath, R.; Elankavi, R.; Udayakumar, R.

    2017-01-01

    The Cloud services are becoming an essential part of many organizations. Cloud providers have to adhere to security and privacy policies to ensure their users' data remains confidential and secure. Though there are some ongoing efforts on developing cloud security standards, most cloud providers are implementing a mish-mash of security and privacy controls. This has led to confusion among cloud consumers as to what security measures they should expect from the cloud services, and whether thes...

  18. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  19. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas; Sezgin, Aydin

    2015-01-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode

  20. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas.

    Science.gov (United States)

    Timmons, Joshua J; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T

    2017-10-12

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.

  1. Towards End-to-End Lane Detection: an Instance Segmentation Approach

    OpenAIRE

    Neven, Davy; De Brabandere, Bert; Georgoulis, Stamatios; Proesmans, Marc; Van Gool, Luc

    2018-01-01

    Modern cars are incorporating an increasing number of driver assist features, among which automatic lane keeping. The latter allows the car to properly position itself within the road lanes, which is also crucial for any subsequent lane departure or trajectory planning decision in fully autonomous cars. Traditional lane detection methods rely on a combination of highly-specialized, hand-crafted features and heuristics, usually followed by post-processing techniques, that are computationally e...

  2. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible....

  3. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    3.2.2 Outsourcing middleboxes Jingling [86] is a prototype outsourcing architecture where the network forwards data out to external “Feature...The relation to our problem is that Jingling could help proactively address broken and inadvertent middlebox behaviors, depending on the administrative

  4. Mining Fashion Outfit Composition Using An End-to-End Deep Learning Approach on Set Data

    OpenAIRE

    Li, Yuncheng; Cao, LiangLiang; Zhu, Jiang; Luo, Jiebo

    2016-01-01

    Composing fashion outfits involves deep understanding of fashion standards while incorporating creativity for choosing multiple fashion items (e.g., Jewelry, Bag, Pants, Dress). In fashion websites, popular or high-quality fashion outfits are usually designed by fashion experts and followed by large audiences. In this paper, we propose a machine learning system to compose fashion outfits automatically. The core of the proposed automatic composition system is to score fashion outfit candidates...

  5. Building an End-to-end System for Long Term Soil Monitoring

    Science.gov (United States)

    Szlavecz, K.; Terzis, A.; Musaloiu-E., R.; Cogan, J.; Szalay, A.; Gray, J.

    2006-05-01

    We have developed and deployed an experimental soil monitoring system in an urban forest. Wireless sensor nodes collect data on soil temperature, soil moisture, air temperature, and light. Data are uploaded into a SQL Server database, where they are calibrated and reorganized into an OLAP data cube. The data are accessible on-line using a web services interface with various visual tools. Our prototype system of ten nodes has been live since Sep 2005, and in 5 months of operation over 6 million measurements have been collected. At a high level, our experiment was a success: we detected variations in soil condition corresponding to topography and external environmental parameters as expected. However, we encountered a number of challenging technical problems: need for low-level programming at multiple levels, calibration across space and time, and cross- reference of measurements with external sources. Based upon the experience with this system we are now deploying 200 mode nodes with close to a thousand sensors spread over multiple sites in the context of the Baltimore Ecosystem Study LTER. www
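
    A much-reduced sketch of the ingest-and-calibrate step described above is given below; the table layout and calibration constants are invented for illustration, and SQLite stands in for the SQL Server/OLAP back end of the deployed system:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE raw (node_id INTEGER, ts TEXT,
                                        sensor TEXT, raw_value REAL)""")

      # A couple of synthetic motes reporting raw ADC counts.
      conn.executemany("INSERT INTO raw VALUES (?, ?, ?, ?)", [
          (1, "2005-09-14T12:00", "soil_temp", 512),
          (2, "2005-09-14T12:00", "soil_temp", 498),
      ])

      # Hypothetical per-sensor linear calibration: temp_C = gain * counts + offset.
      GAIN, OFFSET = 0.05, -5.0
      rows = conn.execute("SELECT node_id, ts, raw_value FROM raw "
                          "WHERE sensor = 'soil_temp'").fetchall()
      for node_id, ts, counts in rows:
          print(node_id, ts, f"{GAIN * counts + OFFSET:.1f} C")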

  6. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    We present a software tool platform which facilitates security and performance analysis of systems which starts and ends with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed with the......

  7. How can end-to-end processes be safeguarded in the organisation?

    NARCIS (Netherlands)

    Strikwerda, H.

    2017-01-01

    Processes in which knowledge, information and materials are transformed into goods and services form the core of organising. That is one of the oldest principles in business administration. In scientific management, and hence in lean six sigma, processes are the object of analysis and improvement

  8. SecMon: End-to-End Quality and Security Monitoring System

    OpenAIRE

    Ciszkowski, Tomasz; Eliasson, Charlott; Fiedler, Markus; Kotulski, Zbigniew; Lupu, Radu; Mazurczyk, Wojciech

    2008-01-01

    The Voice over Internet Protocol (VoIP) is becoming a more available and popular way of communicating for Internet users. This also applies to Peer-to-Peer (P2P) systems and merging these two have already proven to be successful (e.g. Skype). Even the existing standards of VoIP provide an assurance of security and Quality of Service (QoS), however, these features are usually optional and supported by limited number of implementations. As a result, the lack of mandatory and widely applicable Q...

  9. The Contribution of the Future SWOT Mission to Improve Simulations of River Stages and Stream-Aquifer Interactions at Regional Scale

    Science.gov (United States)

    Saleh, Firas; Flipo, Nicolas; Biancamaria, Sylvain; Habets, Florence; Rodriguez, Ernesto; Mognard, Nelly

    2013-09-01

    The main objective of this study is to provide a realistic simulation of river stage in regional river networks in order to improve the quantification of stream-aquifer exchanges and better assess the associated aquifer responses that are often impacted by the magnitude and the frequency of the river stage fluctuations. This study extends the earlier work to improve the modeling of the Seine basin with a focus on simulating the hydrodynamic behavior of the Bassée alluvial wetland, a 120 km reach of the Seine River valley located south-east of Paris. The Bassée is of major importance for the drinking-water supply of Paris and surroundings, in addition to its particular hydrodynamic behavior due to the presence of a number of gravels. In this context, the understanding of stream-aquifer interactions is required for water quantity and quality preservation. A regional distributed process-based hydro(geo)logical model, Eau-Dyssée, is used. It aims at the integrated modeling of the hydrosystem to manage the various elements involved in the quantitative and qualitative aspects of water resources. Eau-Dyssée simulates pseudo 3D flow in aquifer systems solving the diffusivity equation with a finite difference numerical scheme. River flow is simulated with a Muskingum model. In addition to the in-stream discharge, a river stage estimate is needed to calculate the water exchange at the stream-aquifer interface using a conductance model. In this context, the future SWOT mission and its high-spatial resolution imagery can provide surface water level measurements at the regional scale that will permit better characterization of the Bassée complex hydro(geo)logical system and better assessment of soil water content. Moreover, the Bassée is considered as a potential target for the framework of the AirSWOT airborne campaign in France, 2013.
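
    The conductance formulation mentioned above for stream-aquifer exchange is typically a linear relation between flux and head difference; the sketch below illustrates it with invented parameter values rather than those of Eau-Dyssée or the Bassée application:

      def stream_aquifer_flux(h_river: float, h_aquifer: float, conductance: float) -> float:
          # Exchange flux Q = C * (h_river - h_aquifer), in m3/s.
          # Positive Q means the river recharges the aquifer.
          return conductance * (h_river - h_aquifer)

      # Invented example: a reach with conductance 0.02 m2/s.
      C = 0.02
      for h_riv, h_aq in [(45.3, 44.8), (44.9, 45.1)]:
          q = stream_aquifer_flux(h_riv, h_aq, C)
          direction = "river -> aquifer" if q > 0 else "aquifer -> river"
          print(f"h_river={h_riv} m, h_aquifer={h_aq} m, Q={q:+.3f} m3/s ({direction})")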

  10. New vision solar system exploration missions study: Analysis of the use of bimodal space nuclear power systems to support outer solar system exploration missions. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-08

    This report presents the results of an analysis of the capability of nuclear bimodal systems to perform outer solar system exploration missions. Missions of interest include orbiter missions to Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto. An initial technology baseline consisting of a NEBA 10 kWe, 1000 N thrust, 850 s, 1500 kg bimodal system was selected, and its performance examined against a database of trajectories to outer solar system planetary destinations to select optimal direct and gravity-assisted trajectories for study. A conceptual design for a common bimodal spacecraft capable of performing missions to all the planetary destinations was developed and made the basis of end-to-end mission designs for orbiter missions to Jupiter, Saturn, and Neptune. Concepts for microspacecraft capable of probing Jupiter's atmosphere and exploring Titan were also developed. All mission designs considered use the Atlas 2AS for launch. It is shown that the bimodal nuclear power and propulsion system offers many attractive options for planetary missions, including both conventional planetary missions in which all instruments are carried by a single primary orbiting spacecraft, and unconventional missions in which the primary spacecraft acts as a carrier, relay, and mother ship for a fleet of microspacecraft deployed at the planetary destination.
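
    Using the stage figures quoted above (850 s specific impulse, 1500 kg bimodal system) purely as inputs to the standard Tsiolkovsky rocket equation, the kind of delta-v budgeting such mission designs rely on can be sketched as follows; the payload and propellant masses are illustrative assumptions, not values from the report:

      import math

      G0 = 9.80665  # standard gravity, m/s^2

      def delta_v(isp_s: float, m_initial: float, m_final: float) -> float:
          # Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf).
          return isp_s * G0 * math.log(m_initial / m_final)

      ISP = 850.0            # s, bimodal stage specific impulse (from the abstract)
      m_stage = 1500.0       # kg, stage mass quoted in the abstract
      m_payload = 800.0      # kg, assumed spacecraft payload
      m_propellant = 2500.0  # kg, assumed propellant load

      m0 = m_stage + m_payload + m_propellant
      mf = m_stage + m_payload
      print(f"available delta-v: {delta_v(ISP, m0, mf) / 1000:.2f} km/s")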

  11. Design, Simulation, Software Development, and Testing of a Compact Aircraft Tracking Payload for the CanX-7 Nanosatellite Mission

    Science.gov (United States)

    Bennett, Ian Graham

    Automatic Dependent Surveillance-Broadcast (ADS-B) is quickly becoming the new standard for more efficient air traffic control, but as a satellite/ground-based hybrid system it faces limitations on its usefulness over oceans and remote areas. Tracking of aircraft from space presents many challenges that if overcome will greatly increase the safety and efficiency of commercial air travel in these areas. This thesis presents work performed to develop a flight-ready ADS-B receiver payload for the CanX-7 technology demonstration satellite. Work presented includes a simulation of payload performance and coverage area, the design and testing of a single-feed circularly polarized L-band antenna, the design of software to control the payload and manage its data, and verification of the performance of the hardware prior to integration with the satellite and launch. Also included is a short overview of results from the seven-month aircraft tracking campaign conducted with the spacecraft.

  12. Systems Engineering and Application of System Performance Modeling in SIM Lite Mission

    Science.gov (United States)

    Moshir, Mehrdad; Murphy, David W.; Milman, Mark H.; Meier, David L.

    2010-01-01

    The SIM Lite Astrometric Observatory will be the first space-based Michelson interferometer operating in the visible wavelength, with the ability to perform ultra-high precision astrometric measurements on distant celestial objects. SIM Lite data will address in a fundamental way questions such as characterization of Earth-mass planets around nearby stars. To accomplish these goals it is necessary to rely on a model-based systems engineering approach - much more so than most other space missions. This paper will describe in further detail the components of this end-to-end performance model, called "SIM-sim", and show how it has helped the systems engineering process.

  13. Primitive chain network simulations of probe rheology.

    Science.gov (United States)

    Masubuchi, Yuichi; Amamoto, Yoshifumi; Pandey, Ankita; Liu, Cheng-Yang

    2017-09-27

    Probe rheology experiments, in which the dynamics of a small amount of probe chains dissolved in immobile matrix chains is discussed, have been performed for the development of molecular theories for entangled polymer dynamics. Although probe chain dynamics in probe rheology is considered hypothetically as single chain dynamics in fixed tube-shaped confinement, it has not been fully elucidated. For instance, the end-to-end relaxation of probe chains is slower than that for monodisperse melts, unlike the conventional molecular theories. In this study, the viscoelastic and dielectric relaxations of probe chains were calculated by primitive chain network simulations. The simulations semi-quantitatively reproduced the dielectric relaxation, which reflects the effect of constraint release on the end-to-end relaxation. Fair agreement was also obtained for the viscoelastic relaxation time. However, the viscoelastic relaxation intensity was underestimated, possibly due to some flaws in the model for the inter-chain cross-correlations between probe and matrix chains.

  14. Evaluation of regional-scale water level simulations using various river routing schemes within a hydrometeorological modelling framework for the preparation of the SWOT mission

    Science.gov (United States)

    Häfliger, V.; Martin, E.; Boone, A. A.; Habets, F.; David, C. H.; Garambois, P. A.; Roux, H.; Ricci, S. M.; Thévenin, A.; Berthon, L.; Biancamaria, S.

    2014-12-01

    The ability of a regional hydrometeorological model to simulate water depth is assessed in order to prepare for the SWOT (Surface Water and Ocean Topography) mission that will observe free surface water elevations for rivers having a width larger than 50/100 m. The Garonne river (56 000 km2, in south-western France) has been selected owing to the availability of operational gauges, and the fact that different modeling platforms, the hydrometeorological model SAFRAN-ISBA-MODCOU and several fine scale hydraulic models, have been extensively evaluated over two reaches of the river. Several routing schemes, ranging from the simple Muskingum method to kinematic and diffusive wave schemes with time-varying parameters, are tested using predetermined hydraulic parameters. The results show that the variable flow velocity scheme is advantageous for discharge computations when compared to the original Muskingum routing method. Additionally, comparisons between water level computations and in situ observations led to root mean square errors of 50-60 cm for the improved Muskingum method and 40-50 cm for the kinematic-diffusive wave method, in the downstream Garonne river. The error is larger than the anticipated SWOT resolution, showing the potential of the mission to improve knowledge of the continental water cycle. Discharge computations are also shown to be comparable to those obtained with high-resolution hydraulic models over two reaches. However, due to the high variability of river parameters (e.g. slope and river width), a robust averaging method is needed to compare the hydraulic model outputs and the regional model. Sensitivity tests are finally performed in order to have a better understanding of the mechanisms which control the key hydrological processes. The results give valuable information about the linearity, Gaussianity and symmetry of the model, in order to prepare the assimilation of river heights in the model.
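
    As a reference for the simplest of the routing schemes compared above, a textbook Muskingum routing step is shown below; K, X and the inflow hydrograph are arbitrary illustrative values, not the Garonne configuration:

      def muskingum_route(inflow, K_h, X, dt_h):
          # Classic Muskingum routing: O2 = C0*I2 + C1*I1 + C2*O1.
          denom = 2 * K_h * (1 - X) + dt_h
          c0 = (dt_h - 2 * K_h * X) / denom
          c1 = (dt_h + 2 * K_h * X) / denom
          c2 = (2 * K_h * (1 - X) - dt_h) / denom
          outflow = [inflow[0]]                      # assume initial steady state
          for i_prev, i_cur in zip(inflow, inflow[1:]):
              outflow.append(c0 * i_cur + c1 * i_prev + c2 * outflow[-1])
          return outflow

      # Synthetic inflow hydrograph (m3/s), hourly time step.
      inflow = [100, 140, 220, 300, 260, 200, 160, 130, 110, 100]
      print([round(q, 1) for q in muskingum_route(inflow, K_h=2.0, X=0.2, dt_h=1.0)])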

  15. Simulation of a long focal length Wolter-I telescope for hard X-ray astronomy. Application to the Simbol-X and PheniX space missions

    International Nuclear Information System (INIS)

    Chauvin, M.

    2011-01-01

    The future of hard X-ray astronomy relies on the development of new instruments able to focus photons of a hundred keV. Indeed, focalization allows an important improvement in sensitivity and angular resolution. Achieved by grazing incidence reflections on Wolter-I mirrors, its use currently limited to tens of keV can be extended to higher energies thanks to a specific coating and a large focal length. As X-ray observations are only possible above the atmosphere, the size of the observatories, and hence their focal length, was limited by the launcher capacity. Over the past few years, different technologies like extendible masts or formation flight have been studied to go beyond this limit. To gain a better understanding of these telescopes, I detail the Wolter-I mirror geometry, their coating reflectivity, the detection in semi-conductor as well as the dynamic related to extendible masts and formation flight. These telescopes are complex optical systems, subject to deformations during observation and need a fine metrology system to measure these deformations for image correction. To study their performance, I developed a code reproducing the real functioning of such a telescope. Each photon is considered individually, its path and interactions depend on the behavior of the telescope structure along with time. Each component of the telescope is modeled, as well as the metrology needed for the restitution of its dynamic. The path of the photon is computed in a three dimensional vector space, using Monte-Carlo methods to reproduce the mirror defaults, their reflectivity and the interactions in the detector. The simulation produces images and energy spectra, from which we can infer the angular resolution, the field of view, the effective area and the detection efficiency. In 2006, the Simbol-X mission was selected in the framework of the formation flight studies. This concept allows a large focal length, the telescope being distributed on two independent spacecrafts

  16. Mission operations technology

    Science.gov (United States)

    Varsi, Giulio

    In the last decade, the operation of a spacecraft after launch has emerged as a major component of the total cost of the mission. This trend is sustained by the increasing complexity, flexibility, and data gathering capability of the space assets and by their greater reliability and consequent longevity. The trend can, however, be moderated by the progressive transfer of selected functions from the ground to the spacecraft and by application, on the ground, of new technology. Advances in ground operations derive from the introduction in the mission operations environment of advanced microprocessor-based workstations in the class of a few million instructions per second and from the selective application of artificial intelligence technology. In the last few years a number of these applications have been developed, tested in operational settings and successfully demonstrated to users. Some are now being integrated in mission operations facilities. An analysis of mission operations indicates that the key areas are: concurrent control of multiple missions; automated/interactive production of command sequences of high integrity at low cost; automated monitoring of spacecraft health and automated aides for fault diagnosis; automated allocation of resources; automated processing of science data; and high-fidelity, high-speed spacecraft simulation. Examples of major advances in selected areas are described.

  17. An Investigation of the Mechanical Properties of Some Martian Regolith Simulants with Respect to the Surface Properties at the InSight Mission Landing Site

    Science.gov (United States)

    Delage, Pierre; Karakostas, Foivos; Dhemaied, Amine; Belmokhtar, Malik; Lognonné, Philippe; Golombek, Matt; De Laure, Emmanuel; Hurst, Ken; Dupla, Jean-Claude; Kedar, Sharon; Cui, Yu Jun; Banerdt, Bruce

    2017-10-01

    In support of the InSight mission in which two instruments (the SEIS seismometer and the HP3 heat flow probe) will interact directly with the regolith on the surface of Mars, a series of mechanical tests were conducted on three different regolith simulants to better understand the observations of the physical and mechanical parameters that will be derived from InSight. The mechanical data obtained were also compared to data on terrestrial sands. The density of the regolith strongly influences its mechanical properties, as determined from the data on terrestrial sands. The elastoplastic compression volume changes were investigated through oedometer tests that also provided estimates of possible changes in density with depth. The results of direct shear tests provided values of friction angles that were compared with that of a terrestrial sand, and an extrapolation to lower density provided a friction angle compatible with that estimated from previous observations on the surface of Mars. The importance of the contracting/dilating shear volume changes of sands on the dynamic penetration of the mole was determined, with penetration facilitated by the ˜1.3 Mg/m3 density estimated at the landing site. Seismic velocities, measured by means of piezoelectric bender elements in triaxial specimens submitted to various isotropic confining stresses, show the importance of the confining stress, with lesser influence of density changes under compression. A power law relation of velocity as a function of confining stress with an exponent of 0.3 was identified from the tests, allowing an estimate of the surface seismic velocity of 150 m/s. The effect on the seismic velocity of a 10% proportion of rock in the regolith was also studied. These data will be compared with in situ data measured by InSight after landing.
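
    The power-law relation between seismic velocity and confining stress quoted above (exponent of about 0.3) is usually identified from a straight-line fit in log-log space; the sketch below applies that procedure to synthetic data, not to the InSight simulant measurements:

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic bender-element data: v = a * sigma**n with n near 0.3, plus noise.
      sigma_kpa = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
      v_mps = 60.0 * sigma_kpa**0.3 * rng.normal(1.0, 0.02, size=sigma_kpa.size)

      # Fit log(v) = log(a) + n * log(sigma).
      n_exp, log_a = np.polyfit(np.log(sigma_kpa), np.log(v_mps), deg=1)
      a = np.exp(log_a)
      print(f"fitted exponent n = {n_exp:.2f}, prefactor a = {a:.1f}")
      print(f"extrapolated velocity at 5 kPa confining stress: {a * 5**n_exp:.0f} m/s")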

  18. Landsat Data Continuity Mission (LDCM) space to ground mission data architecture

    Science.gov (United States)

    Nelson, Jack L.; Ames, J.A.; Williams, J.; Patschke, R.; Mott, C.; Joseph, J.; Garon, H.; Mah, G.

    2012-01-01

    The Landsat Data Continuity Mission (LDCM) is a scientific endeavor to extend the longest continuous multi-spectral imaging record of Earth's land surface. The observatory consists of a spacecraft bus integrated with two imaging instruments; the Operational Land Imager (OLI), built by Ball Aerospace & Technologies Corporation in Boulder, Colorado, and the Thermal Infrared Sensor (TIRS), an in-house instrument built at the Goddard Space Flight Center (GSFC). Both instruments are integrated aboard a fine-pointing, fully redundant, spacecraft bus built by Orbital Sciences Corporation, Gilbert, Arizona. The mission is scheduled for launch in January 2013. This paper will describe the innovative end-to-end approach for efficiently managing high volumes of simultaneous realtime and playback of image and ancillary data from the instruments to the reception at the United States Geological Survey's (USGS) Landsat Ground Network (LGN) and International Cooperator (IC) ground stations. The core enabling capability lies within the spacecraft Command and Data Handling (C&DH) system and Radio Frequency (RF) communications system implementation. Each of these systems uniquely contribute to the efficient processing of high speed image data (up to 265Mbps) from each instrument, and provide virtually error free data delivery to the ground. Onboard methods include a combination of lossless data compression, Consultative Committee for Space Data Systems (CCSDS) data formatting, a file-based/managed Solid State Recorder (SSR), and Low Density Parity Check (LDPC) forward error correction. The 440 Mbps wideband X-Band downlink uses Class 1 CCSDS File Delivery Protocol (CFDP), and an earth coverage antenna to deliver an average of 400 scenes per day to a combination of LGN and IC ground stations. This paper will also describe the integrated capabilities and processes at the LGN ground stations for data reception using adaptive filtering, and the mission operations approach fro- the LDCM

  19. GNC Architecture Design for ARES Simulation. Revision 3.0. Revision 3.0

    Science.gov (United States)

    Gay, Robert

    2006-01-01

    The purpose of this document is to describe the GNC architecture and associated interfaces for all ARES simulations. Establishing a common architecture facilitates development across the ARES simulations and provides an efficient mechanism for creating an end-to-end simulation capability. In general, the GNC architecture is the framework in which all GNC development takes place, including sensor and effector models. All GNC software applications have a standard location within the architecture, making integration easier and thus more efficient.

  20. Spacelab life sciences 2 post mission report

    Science.gov (United States)

    Buckey, Jay C.

    1994-01-01

    Jay C. Buckey, M.D., Assistant Professor of Medicine at The University of Texas Southwestern Medical Center at Dallas served as an alternate payload specialist astronaut for the Spacelab Life Sciences 2 Space Shuttle Mission from January 1992 through December 1993. This report summarizes his opinions on the mission and offers suggestions in the areas of selection, training, simulations, baseline data collection and mission operations. The report recognizes the contributions of the commander, payload commander and mission management team to the success of the mission. Dr. Buckey's main accomplishments during the mission are listed.

  1. A decision model for planetary missions

    Science.gov (United States)

    Hazelrigg, G. A., Jr.; Brigadier, W. L.

    1976-01-01

    Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.

  2. The effect of long-term confinement and the efficacy of exercise countermeasures on muscle strength during a simulated mission to Mars: data from the Mars500 study.

    Science.gov (United States)

    Gaffney, Christopher J; Fomina, Elena; Babich, Dennis; Kitov, Vladimir; Uskov, Konstantin; Green, David A

    2017-11-13

    Isolation and long duration spaceflight are associated with musculoskeletal deconditioning. Mars500 was a unique, high-fidelity analogue of the psychological challenges of a 520-day manned mission to Mars. We aimed to explore the effect of musculoskeletal deconditioning on three outcome measures: (1) if lower limb muscle strength was reduced during the 520-day isolation; (2) if type I or II muscle fibres were differentially affected; and (3) whether any 70-day exercise interventions prevented any isolation-induced loss of strength. Six healthy male subjects (mean ± SEM) (34 ± 3 years; 1.76 ± 0.02 metres; 83.7 ± 4.8 kg) provided written, informed consent to participate. The subjects' maximal voluntary contraction (MVC) was assessed isometrically in the calf (predominantly type I fibres), and maximal voluntary isokinetic force (MVIF) was assessed in the quadriceps/hamstrings (predominantly type II fibres) at 0.2 and 0.4 ms -1 using the Multifunctional Dynamometer for Space (MDS) at 35-day intervals throughout Mars500. Exercise interventions were completed 3-7 days/week throughout the 520-day isolation in a counterbalanced design excluding 142-177 days (rest period) and 251-284 days (simulated Mars landing). Exercise interventions included motorized treadmill running, non-motorized treadmill running, cycle ergometry, elastomer-based resistance exercise, whole-body vibration (WBV), and resistance exercise using MDS. Calf MVC did not reduce across the 520-day isolation and MDS increased strength by 18% compared to before that of 70-day exercise intervention. In contrast, there was a significant bilateral loss of MVIF across the 520 days at both 0.2 ms -1 (R 2  = 0.53; P = 0.001) and 0.4 ms -1 (0.4 ms -1 ; R 2  = 0.42; P = 0.007). WBV (+ 3.7 and 8.8%) and MDS (+ 4.9 and 5.2%) afforded the best protection against isolation-induced loss of MVIF, although MDS was the only intervention to prevent bilateral loss of calf MVC and leg MVIF at 0

  3. Flight code validation simulator

    Science.gov (United States)

    Sims, Brent A.

    1996-05-01

    An End-To-End Simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, i860 DSP coprocessor, embedded flight computer and custom dual port memory interface hardware. This system allows real-time interrupt driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six degree of freedom real-time dynamic simulation, accurate real-time discrete sensor data and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January of 1995 at the White Sands Missile Range on a two stage attitude controlled sounding rocket.

  4. Sociometric and ethological approach to the assessment of individual and group behavior in extra long-term isolation during simulated interplanetary mission

    Science.gov (United States)

    Gushin, Vadim; Tafforin, Carole; Kuznetsova, Polina; Vinokhodova, Alla; Chekalina, Angelina

    Several factors, such as hazard to life, reduced social communications, isolation, high workload, monotony, etc., can cause deconditioning of individual status and group dynamics in long-term spaceflight. New approaches to the assessment of group behavior are being developed in order to create necessary counter-measures and to keep optimal psychological climate in the crew. Psychological methods combined with ethological approach to dynamic monitoring of the isolated crew had been tested and validated in Mars-500 experiment. The experiment (duration of 520 days) was designed to simulate the living and working conditions of a piloted mission to Mars. The Mars-500 crew was composed of three Russians, two Europeans and one Chinese. We used psychological tests: sociometric questionnaire to assess group status (popularity) of the crewmembers (monthly), color choice test to assess the level of frustration and anxiety (twice a month). We performed observations from video recordings of group discussions (monthly) and during breakfast time (twice a month). The video analysis was supplied with a software based-solution: The Observer XT®. The results showed that occurrence of collateral acts may indicate psychological stress and fatigue in crewmembers under isolation and that facial expressions may indicate less anxiety. The data of psychological tests allowed to define two subgroups in the crew. The first one consisted of the subjects with high group status and lower level of frustration (not anxious), the second one consisted of less popular subjects, having respectively higher anxiety level. The video analysis showed two times more manifestations of facial expressions and interpersonal communications for the first subgroup. We also identified the subgroups on the basis of their verbal expressions in Russian and in English. Video observation of individual and group behavior, combined with other psychological tests gives opportunity to emphasize more objectively the signs

  5. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  6. Simulation Facilities and Test Beds for Galileo

    Science.gov (United States)

    Schlarmann, Bernhard Kl.; Leonard, Arian

    2002-01-01

    Galileo is the European satellite navigation system, financed by the European Space Agency (ESA) and the European Commission (EC). The Galileo System, currently under definition phase, will offer seamless global coverage, providing state-of-the-art positioning and timing services. Galileo services will include a standard service targeted at mass market users, an augmented integrity service, providing integrity warnings when fault occur and Public Regulated Services (ensuring a continuity of service for the public users). Other services are under consideration (SAR and integrated communications). Galileo will be interoperable with GPS, and will be complemented by local elements that will enhance the services for specific local users. In the frame of the Galileo definition phase, several system design and simulation facilities and test beds have been defined and developed for the coming phases of the project, respectively they are currently under development. These are mainly the following tools: Galileo Mission Analysis Simulator to design the Space Segment, especially to support constellation design, deployment and replacement. Galileo Service Volume Simulator to analyse the global performance requirements based on a coverage analysis for different service levels and degrades modes. Galileo System Simulation Facility is a sophisticated end-to-end simulation tool to assess the navigation performances for a complete variety of users under different operating conditions and different modes. Galileo Signal Validation Facility to evaluate signal and message structures for Galileo. Galileo System Test Bed (Version 1) to assess and refine the Orbit Determination &Time Synchronisation and Integrity algorithms, through experiments relying on GPS space infrastructure. This paper presents an overview on the so called "G-Facilities" and describes the use of the different system design tools during the project life cycle in order to design the system with respect to

  7. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    Science.gov (United States)

    Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

  8. Active Debris Removal mission design in Low Earth Orbit

    Science.gov (United States)

    Martin, Th.; Pérot, E.; Desjean, M.-Ch.; Bitetti, L.

    2013-03-01

    Active Debris Removal (ADR) aims at removing large sized intact objects ― defunct satellites, rocket upper-stages ― from space crowded regions. Why? Because they constitute the main source of the long-term debris environment deterioration caused by possible future collisions with fragments and worse still with other intact but uncontrolled objects. In order to limit the growth of the orbital debris population in the future (referred to as the Kessler syndrome), it is now highly recommended to carry out such ADR missions, together with the mitigation measures already adopted by national agencies (such as postmission disposal). At the French Space Agency, CNES, and in the frame of advanced studies, the design of such an ADR mission in Low Earth Orbit (LEO) is under evaluation. A two-step preliminary approach has been envisaged. First, a reconnaissance mission based on a small demonstrator (˜500 kg) rendezvousing with several targets (observation and in-flight qualification testing). Secondly, an ADR mission based on a larger vehicle (inherited from the Orbital Transfer Vehicle (OTV) concept) being able to capture and deorbit several preselected targets by attaching a propulsive kit to these targets. This paper presents a flight dynamics level tradeoff analysis between different vehicle and mission concepts as well as target disposal options. The delta-velocity, times, and masses required to transfer, rendezvous with targets and deorbit are assessed for some propelled systems and propellant less options. Total mass budgets are then derived for two end-to-end study cases corresponding to the reconnaissance and ADR missions mentioned above.
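
    The delta-v bookkeeping mentioned above can be illustrated with the vis-viva equation for a simple perigee-lowering deorbit burn; the debris altitude and target perigee are assumptions for illustration, not the CNES study cases:

      import math

      MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
      R_E = 6378.137e3      # Earth equatorial radius, m

      def visviva(r, a):
          # Orbital speed at radius r on an orbit with semi-major axis a.
          return math.sqrt(MU * (2.0 / r - 1.0 / a))

      # Assumed case: circular 800 km debris orbit, deorbit by lowering perigee to 60 km.
      r_debris = R_E + 800e3
      r_perigee = R_E + 60e3

      v_circular = visviva(r_debris, r_debris)
      a_transfer = 0.5 * (r_debris + r_perigee)
      v_after = visviva(r_debris, a_transfer)   # speed at apogee of the deorbit ellipse

      print(f"deorbit burn delta-v: {v_circular - v_after:.0f} m/s")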

  9. Mars Analog Research and Technology Experiment (MARTE): A Simulated Mars Drilling Mission to Search for Subsurface Life at the Rio Tinto, Spain

    Science.gov (United States)

    Stoker, Carol; Lemke, Larry; Mandell, Humboldt; McKay, David; George, Jeffrey; Gomez-Alvera, Javier; Amils, Ricardo; Stevens, Todd; Miller, David

    2003-01-01

    The MARTE (Mars Astrobiology Research and Technology Experiment) project was selected by the new NASA ASTEP program, which supports field experiments having an equal emphasis on Astrobiology science and technology development relevant to future Astrobiology missions. MARTE will search for a hypothesized subsurface anaerobic chemoautotrophic biosphere in the region of the Tinto River in southwestern Spain while also demonstrating technology needed to search for a subsurface biosphere on Mars. The experiment is informed by the strategy for searching for life on Mars.

  10. IMP - INTEGRATED MISSION PROGRAM

    Science.gov (United States)

    Dauro, V. A.

    1994-01-01

    IMP is a simulation language that is used to model missions around the Earth, Moon, Mars, or other planets. It has been used to model missions for the Saturn Program, Apollo Program, Space Transportation System, Space Exploration Initiative, and Space Station Freedom. IMP allows a user to control the mission being simulated through a large event/maneuver menu. Up to three spacecraft may be used: a main, a target and an observer. The simulation may begin at liftoff, suborbital, or orbital. IMP incorporates a Fehlberg seventh-order, thirteen-evaluation Runge-Kutta integrator with error and step-size control to numerically integrate the equations of motion. The user may choose oblate or spherical gravity for the central body (Earth, Mars, Moon or other), while a spherical model is used for the gravity of an additional perturbing body. Sun gravity and pressure and Moon gravity effects are user-selectable. Earth/Mars atmospheric effects can be included. The optimum thrust guidance parameters are calculated automatically. Events/maneuvers may involve many velocity changes, and these velocity changes may be impulsive or of finite duration. Aerobraking to orbit is also an option. Other simulation options include line-of-sight communication guidelines, a choice of propulsion systems, a soft landing on the Earth or Mars, and rendezvous with a target vehicle. The input/output is in metric units, with the exception of thrust and weight, which are in English units. Input is read from the user's input file to minimize real-time keyboard input. Output includes vehicle state, orbital and guide parameters, event and total velocity changes, and propellant usage. The main output is to the user-defined print file, but during execution, part of the input/output is also displayed on the screen. An included FORTRAN program, TEKPLOT, will display plots on the VDT as well as generate a graphic file suitable for output on most laser printers. The code is double precision. IMP is written in
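
    The core numerical step IMP performs, high-order Runge-Kutta integration of the equations of motion with error and step-size control, can be sketched in a few lines with SciPy's adaptive DOP853 integrator applied to simple two-body motion. This is only an illustration of the technique (IMP itself is FORTRAN); the orbit and tolerances are assumed values.

    # Minimal sketch of adaptive high-order Runge-Kutta propagation of the
    # two-body equations of motion, in the spirit of IMP's Fehlberg integrator
    # with error and step-size control (this is not IMP itself).
    import numpy as np
    from scipy.integrate import solve_ivp

    MU_EARTH = 3.986004418e14  # [m^3/s^2]

    def two_body(t, state):
        """dy/dt for state = [x, y, z, vx, vy, vz] under point-mass gravity."""
        r = state[:3]
        a = -MU_EARTH * r / np.linalg.norm(r) ** 3
        return np.concatenate((state[3:], a))

    # Assumed example: 400 km circular orbit in the equatorial plane.
    r0 = 6778.0e3
    v0 = np.sqrt(MU_EARTH / r0)
    state0 = [r0, 0.0, 0.0, 0.0, v0, 0.0]
    period = 2 * np.pi * np.sqrt(r0 ** 3 / MU_EARTH)

    sol = solve_ivp(two_body, (0.0, period), state0,
                    method="DOP853",        # 8th-order Runge-Kutta with step control
                    rtol=1e-10, atol=1e-3)

    # After one period the spacecraft should return to its starting point.
    err = np.linalg.norm(sol.y[:3, -1] - state0[:3])
    print(f"Closure error after one orbit: {err:.3f} m in {sol.t.size} steps")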

  11. Proba-V Mission Exploitation Platform

    Science.gov (United States)

    Goor, E.

    2017-12-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (an EC Copernicus contributing mission) EO-data archive, the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers (e.g. the EC Copernicus Global Land Service) and end-users. The analysis of time series of data (PB range) is addressed, as well as the large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. New features are still being developed, but the platform has been fully operational since November 2016 and offers: a time series viewer (browser web client and API), showing the evolution of Proba-V bands and derived vegetation parameters for any country, region, pixel or polygon defined by the user; full-resolution viewing services for the complete data archive; on-demand processing chains on a powerful Hadoop/Spark backend; and Virtual Machines that can be requested by users, with access to the complete data archive mentioned above and pre-configured tools to work with this data, e.g. various toolboxes and support for R and Python. This allows users to immediately work with the data without having to install tools or download data, as well as to design, debug and test applications on the platform. Jupyter Notebooks are available, with some example Python and R projects worked out to show the potential of the data. Today the platform is already used by several international third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. From the Proba-V MEP, access to other data sources such as Sentinel-2 and Landsat data is also addressed. Selected components of the MEP are also deployed on public cloud infrastructures in various R&D projects. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to
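
    To give a feel for how such a time-series API is typically consumed from Python, the sketch below requests the values for one pixel over a date range and plots them. The endpoint URL, layer name and JSON field names are hypothetical placeholders, not the documented MEP interface.

    # Schematic sketch of querying a time-series web service and plotting the
    # result. The endpoint URL, layer name, and JSON field names below are
    # hypothetical placeholders, NOT the documented Proba-V MEP API.
    import requests
    import matplotlib.pyplot as plt

    BASE_URL = "https://example.org/timeseries/v1.0/ts"   # placeholder endpoint
    LAYER = "PROBAV_NDVI"                                  # placeholder layer name

    def fetch_point_timeseries(lat, lon, start, end):
        """Request the time series for one pixel over a date range."""
        resp = requests.get(f"{BASE_URL}/{LAYER}/point",
                            params={"lat": lat, "lon": lon,
                                    "startDate": start, "endDate": end},
                            timeout=60)
        resp.raise_for_status()
        # Assumed response layout: {"results": [{"date": ..., "result": {"average": ...}}]}
        return resp.json()["results"]

    if __name__ == "__main__":
        series = fetch_point_timeseries(51.2, 4.4, "2016-01-01", "2016-12-31")
        dates = [s["date"] for s in series]
        values = [s["result"]["average"] for s in series]
        plt.plot(dates, values)
        plt.title("NDVI time series (illustrative)")
        plt.xlabel("date"); plt.ylabel("NDVI")
        plt.show()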

  12. Distributed Mission Operations Within-Simulator Training Effectiveness Baseline Study. Volume 5. Using the Pathfinder Methodology to Assess Pilot Knowledge Structure Changes

    National Research Council Canada - National Science Library

    Schreiber, Brian T; DiSalvo, Pam; Stock, William A; Bennett, Jr., Winston

    2006-01-01

    ...) Within Simulator Training Effectiveness Baseline Study as described in Volume I, Summary Report, of AFRL-HE-AZ-TR-2006-0015, the current work examined pilots who participated in a Pathfinder data...

  13. Evaluating the Impacts of Mission Training via Distributed Simulation on Live Exercise Performance: Results from the US/UK "Red Skies" Study

    National Research Council Canada - National Science Library

    Smith, Ebb; McIntyre, Heather; Gehr, Sara E; Schurig, Margaret; Symons, Steve; Schreiber, Brian; Bennett Jr., Winston

    2007-01-01

    .... The most recent collaborative study, named "Red Skies" involved extending our work to include field assessments of the training benefits derived from involvement in a simulation-based distributed...

  14. Towards a Multi-Mission, Airborne Science Data System Environment

    Science.gov (United States)

    Crichton, D. J.; Hardman, S.; Law, E.; Freeborn, D.; Kay-Im, E.; Lau, G.; Oswald, J.

    2011-12-01

    NASA earth science instruments are increasingly relying on airborne missions. However, traditionally, there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructures for the science data system. Typically there is little software reuse and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is being able to provide an agile infrastructure that is architected to allow for a variety of configurations of the infrastructure from locally installed compute and storage services to provisioning those services via the "cloud" from cloud computer vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on the Apache's Object Oriented Data Technology (OODT) suite of components which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems. In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support the data processing and data management needs

  15. End-To-End Solution for Integrated Workload and Data Management using glideinWMS and Globus Online

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the glideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of the Center for Enabling Distributed Petascale Scienc...

  16. End-to-End Deep Neural Networks and Transfer Learning for Automatic Analysis of Nation-State Malware

    Directory of Open Access Journals (Sweden)

    Ishai Rosenberg

    2018-05-01

    Full Text Available Malware allegedly developed by nation-states, also known as advanced persistent threats (APTs), is becoming more common. The task of attributing an APT to a specific nation-state, or classifying it to the correct APT family, is challenging for several reasons. First, each nation-state has more than a single cyber unit that develops such malware, rendering traditional authorship attribution algorithms useless. Furthermore, the dataset of available APTs is still extremely small. Finally, those APTs use state-of-the-art evasion techniques, making feature extraction challenging. In this paper, we use a deep neural network (DNN) as a classifier for nation-state APT attribution. We record the dynamic behavior of the APT when run in a sandbox and use it as raw input for the neural network, allowing the DNN to learn high-level feature abstractions of the APTs themselves. We also use the same raw features for APT family classification. Finally, we use the feature abstractions learned by the APT family classifier to solve the attribution problem. Using a test set of 1,000 Chinese- and Russian-developed APTs, we achieved an accuracy rate of 98.6%.
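
    A generic sketch of this kind of classifier is given below: an embedding over an API-call vocabulary followed by a recurrent layer, trained on integer-encoded call traces. The vocabulary size, trace length, layer sizes and the random stand-in data are illustrative assumptions, not the authors' architecture or dataset; TensorFlow/Keras is assumed to be installed.

    # Generic sketch of a sequence classifier over API-call traces, in the
    # spirit of the paper's raw-behaviour DNN. Shapes, layer sizes, and the
    # random data are illustrative assumptions only.
    import numpy as np
    import tensorflow as tf

    VOCAB = 5000      # assumed number of distinct API call identifiers
    SEQ_LEN = 500     # assumed trace length (API calls per sample)
    N_CLASSES = 2     # e.g. two nation-state labels for the attribution task

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB, 64),
        tf.keras.layers.LSTM(128),                     # learns features from raw call order
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Stand-in data: integer-encoded API-call sequences and class labels.
    x = np.random.randint(0, VOCAB, size=(256, SEQ_LEN))
    y = np.random.randint(0, N_CLASSES, size=(256,))
    model.fit(x, y, epochs=2, batch_size=32, validation_split=0.2)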

  17. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry

  18. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    Science.gov (United States)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information through an Internet connection. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of 'citizens' observatories', the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and an accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports a wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server, using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich the data coming from the personal sensor unit with the GPS location, timestamps and user-defined context. This, together with the accelerometer, enables the user to better estimate their exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol), which is used to send requests to the VESNA-AQ unit and to exchange information. When data is obtained from the VESNA-AQ unit, the mobile application visualizes it. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and, in parallel, translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
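
    The gateway-to-server step described above, forwarding an enriched reading as JSON over an HTTP POST, follows a very common pattern; a minimal sketch is shown below. The endpoint URL and the JSON field names are hypothetical, not the CITI-SENSE interface.

    # Illustrative sketch of the gateway-to-server step: forward one enriched
    # sensor reading as JSON over an HTTP POST. The endpoint URL and field
    # names are hypothetical placeholders, not the CITI-SENSE interface.
    import time
    import requests

    SERVER_URL = "https://example.org/citisense/observations"   # placeholder

    def forward_reading(unit_id, gases, met, lat, lon):
        """Enrich a raw VESNA-AQ-style reading with location/time and POST it."""
        payload = {
            "unit_id": unit_id,
            "timestamp": int(time.time()),          # added by the smartphone gateway
            "location": {"lat": lat, "lon": lon},   # from the phone's GPS
            "gases_ppb": gases,                     # e.g. {"NO2": .., "O3": .., "CO": ..}
            "meteo": met,                           # temperature, humidity, pressure
        }
        resp = requests.post(SERVER_URL, json=payload, timeout=30)
        resp.raise_for_status()
        return resp.status_code

    if __name__ == "__main__":
        status = forward_reading("vesna-aq-042",
                                 gases={"NO2": 18.2, "O3": 31.5, "CO": 210.0},
                                 met={"temp_c": 21.4, "rh_pct": 48.0, "press_hpa": 1013.2},
                                 lat=46.05, lon=14.51)
        print("server replied:", status)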

  19. End-To-End Solution for Integrated Workload and Data Management using GlideinWMS and Globus Online

    International Nuclear Information System (INIS)

    Mhashilkar, Parag; Miller, Zachary; Weiss, Cathrin; Kettimuthu, Rajkumar; Garzoglio, Gabriele; Holzman, Burt; Duan, Xi; Lacinski, Lukasz

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the GlideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of the Center for Enabling Distributed Peta-scale Science (CEDPS) by integrating GlideinWMS with Globus Online (GO). Globus Online is a fast, reliable file transfer service that makes it easy for any user to move data. The solution eliminates the need for the users to provide custom data transfer solutions in the application by making this functionality part of the GlideinWMS infrastructure. To achieve this, GlideinWMS uses the file transfer plug-in architecture of Condor. The paper describes the system architecture and how this solution can be extended to support data transfer services other than Globus Online when used with Condor or GlideinWMS.
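
    For readers unfamiliar with the Condor file transfer plug-in architecture mentioned here, the sketch below follows the classic convention in which a plug-in executable advertises its capabilities when invoked with "-classad" and is otherwise called with a source URL and a destination path; the exact contract should be checked against the HTCondor documentation, and the transfer command used below is a hypothetical stand-in rather than the Globus Online integration itself.

    #!/usr/bin/env python3
    # Sketch of a Condor file-transfer plug-in following the classic
    # convention: "-classad" prints the plug-in's capabilities, otherwise the
    # plug-in is called as "plugin <source-url> <destination-path>". The
    # transfer itself is delegated to a hypothetical command-line tool.
    import subprocess
    import sys

    SUPPORTED_METHODS = "globusonline"   # assumed URL scheme this plug-in handles

    def print_capabilities():
        print('PluginVersion = "0.1"')
        print('PluginType = "FileTransfer"')
        print(f'SupportedMethods = "{SUPPORTED_METHODS}"')

    def transfer(source_url, dest_path):
        # Placeholder: a real plug-in would call the transfer service's API here.
        cmd = ["transfer-tool", "copy", source_url, dest_path]   # hypothetical CLI
        return subprocess.call(cmd)

    if __name__ == "__main__":
        if len(sys.argv) == 2 and sys.argv[1] == "-classad":
            print_capabilities()
            sys.exit(0)
        if len(sys.argv) != 3:
            sys.stderr.write("usage: plugin -classad | plugin <source-url> <dest-path>\n")
            sys.exit(1)
        sys.exit(transfer(sys.argv[1], sys.argv[2]))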

  20. Supporting end-to-end resource virtualization for Web 2.0 applications using Service Oriented Architecture

    NARCIS (Netherlands)

    Papagianni, C.; Karagiannis, Georgios; Tselikas, N. D.; Sfakianakis, E.; Chochliouros, I. P.; Kabilafkas, D.; Cinkler, T.; Westberg, L.; Sjödin, P.; Hidell, M.; Heemstra de Groot, S.M.; Kontos, T.; Katsigiannis, C.; Pappas, C.; Antonakopoulou, A.; Venieris, I.S.

    2008-01-01

    In recent years, technologies have been introduced offering a large amount of computing and networking resources. New applications such as Google AdSense and BitTorrent can profit from the use of these resources. An efficient way of discovering and reserving these resources is by using the Service

  1. Probability distribution function of the polymer end-to-end molecule vector after retraction and its application to step deformation

    Czech Academy of Sciences Publication Activity Database

    Kharlamov, Alexander; Rolón-Garrido, V. H.; Filip, Petr

    2010-01-01

    Roč. 19, č. 4 (2010), s. 190-194 ISSN 1022-1344 R&D Projects: GA ČR GA103/09/2066 Institutional research plan: CEZ:AV0Z20600510 Keywords : polymer chains * molecular modeling * shear * stress Subject RIV: BK - Fluid Dynamics Impact factor: 1.440, year: 2010

  2. End-to-End Privacy Protection for Facebook Mobile Chat based on AES with Multi-Layered MD5

    Directory of Open Access Journals (Sweden)

    Wibisono Sukmo Wardhono

    2018-01-01

    Full Text Available As social media environments become more interactive and the number of users has grown tremendously, privacy is a matter of increasing concern. When personal data become a commodity, a social media company can share users' data with another party, such as a government. Facebook, Inc. is one of the social media companies frequently asked for users' data. Although this private data request mechanism goes through a formal and valid legal process, it still undermines the fundamental right to information privacy. In this case, social media users need protection against privacy violations by the social media platform provider itself. Private chat is among the most popular features of social media. Inside a chat room, users can share private information. Cryptography is one of the data protection methods that can be used to hide private communication data from unauthorized parties. In our study, we propose a system that can encrypt chat content based on AES and multi-layered MD5 to ensure that social media users have privacy protection against a social media company that uses user information as a commodity. In addition, this system makes it convenient for users to share their private information through the social media platform.
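
    One plausible reading of such a scheme is sketched below: an AES key derived by applying MD5 in several layers to a shared passphrase, then AES-CBC encryption of the chat text with PyCryptodome. The layer count and key-stretching details are assumptions, and MD5 appears here only to mirror the paper; it is not a recommended key-derivation function.

    # One plausible reading of the paper's scheme: derive an AES key by
    # applying MD5 in multiple layers to a shared passphrase, then encrypt the
    # chat text with AES-CBC. MD5 is shown only to mirror the paper; it is not
    # a recommended KDF. Requires the PyCryptodome package.
    import hashlib
    import os
    from Crypto.Cipher import AES
    from Crypto.Util.Padding import pad, unpad

    def multilayer_md5_key(passphrase: str, layers: int = 4) -> bytes:
        """Apply MD5 repeatedly; concatenate two digests for a 256-bit key (assumption)."""
        digest = passphrase.encode("utf-8")
        for _ in range(layers):
            digest = hashlib.md5(digest).digest()
        return digest + hashlib.md5(digest).digest()

    def encrypt_message(passphrase: str, plaintext: str) -> bytes:
        key = multilayer_md5_key(passphrase)
        iv = os.urandom(16)
        cipher = AES.new(key, AES.MODE_CBC, iv)
        return iv + cipher.encrypt(pad(plaintext.encode("utf-8"), AES.block_size))

    def decrypt_message(passphrase: str, blob: bytes) -> str:
        key = multilayer_md5_key(passphrase)
        cipher = AES.new(key, AES.MODE_CBC, blob[:16])
        return unpad(cipher.decrypt(blob[16:]), AES.block_size).decode("utf-8")

    if __name__ == "__main__":
        secret = encrypt_message("shared-passphrase", "meet at 19:00")
        print(decrypt_message("shared-passphrase", secret))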

  3. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    Science.gov (United States)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from their current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low-security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover, high-speed communications that are meant to enable autonomous vehicles require ultra-reliable and low-latency paths. This research explores security within the proposed new architectures and the cross-interconnection of highly protected assets with low-cost/low-security components forming the overarching 5th generation wireless infrastructure.

  4. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available MontyLingua, an integral part of ConceptNet, which is currently the largest commonsense knowledge base, is an English text processor developed using the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing, from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper aims to review the use of, and roles played by, MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed a diversified use of MontyLingua in many different areas, both generic and domain-specific. Although use of the text summarizing component has not been observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.

  5. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    Science.gov (United States)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While there are currently many data logging options for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share their data. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols; (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google's App Engine that allows scientists to use Google's cloud computing cyber-infrastructure. We provide a simple, yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future and results from current deployments that continue to drive the design of our system.
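
    The sensor-to-database pattern described above, a gateway that samples a sensor, applies a trigger rule, and writes to a SQL store, can be illustrated generically as follows. SQLite stands in here for the MySQL/Sensorbase/cloud back-ends mentioned in the abstract, and the table layout, rule and sensor driver are assumptions for the sketch.

    # Generic sketch of the sensor-to-database pattern: a gateway samples a
    # sensor, applies a simple trigger rule, and writes to a SQL store.
    # SQLite stands in for the production back-ends; layout is an assumption.
    import sqlite3
    import time
    import random

    DB = sqlite3.connect("span_demo.db")
    DB.execute("""CREATE TABLE IF NOT EXISTS readings (
                    site TEXT, sensor TEXT, value REAL, utc REAL)""")

    def read_sensor(sensor_id: str) -> float:
        """Placeholder for a driver call to a real sensor."""
        return 20.0 + random.uniform(-0.5, 0.5)

    def rule_triggered(value: float, threshold: float = 20.0) -> bool:
        """Example rule: only store/forward values above a threshold."""
        return value > threshold

    def sample_and_store(site: str, sensor_id: str):
        value = read_sensor(sensor_id)
        if rule_triggered(value):
            DB.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                       (site, sensor_id, value, time.time()))
            DB.commit()

    for _ in range(10):
        sample_and_store("rio-site-1", "soil-temp-3")
    count, avg = DB.execute("SELECT COUNT(*), AVG(value) FROM readings").fetchone()
    print(f"stored {count} readings" + (f", mean value {avg:.2f}" if count else ""))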

  6. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao; Yang, Yuli; Aissa, Sonia

    2012-01-01

    the optimal boundary points in closed form to choose the AMC transmission modes by taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results

  7. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza; Aissa, Sonia

    2011-01-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality

  8. Defense Computers: DOD Y2K Functional End-to-End Testing Progress and Test Event Management

    National Research Council Canada - National Science Library

    1999-01-01

    ... (DOD) which relies on a complex and broad array of interconnected computer systems-including weapons, command and control, satellite, inventory management, transportation management, health, financial...

  9. Generic Black-Box End-to-End Attack Against State of the Art API Call Based Malware Classifiers

    OpenAIRE

    Rosenberg, Ishai; Shabtai, Asaf; Rokach, Lior; Elovici, Yuval

    2017-01-01

    In this paper, we present a black-box attack against API call based machine learning malware classifiers, focusing on generating adversarial sequences combining API calls and static features (e.g., printable strings) that will be misclassified by the classifier without affecting the malware functionality. We show that this attack is effective against many classifiers due to the transferability principle between RNN variants, feed forward DNNs, and traditional machine learning classifiers such...
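
    The query-loop idea behind such black-box attacks can be sketched generically: insert benign "no-op" API calls into a trace, leaving the original call order intact, until a black-box classifier flips its decision. The toy classifier, call names and insertion strategy below are stand-ins for illustration, not the authors' algorithm.

    # Generic sketch of a black-box evasion loop over API-call traces: insert
    # benign no-op calls until a stand-in classifier stops flagging the trace,
    # preserving the original call order (and hence functionality).
    import random

    BENIGN_NOOPS = ["GetTickCount", "Sleep", "GetSystemTimeAsFileTime"]

    def toy_blackbox_score(trace):
        """Stand-in for the target model: fraction of 'suspicious' calls."""
        suspicious = {"WriteProcessMemory", "CreateRemoteThread", "VirtualAllocEx"}
        return sum(c in suspicious for c in trace) / len(trace)

    def is_flagged(trace, threshold=0.25):
        return toy_blackbox_score(trace) >= threshold

    def evade(trace, max_queries=200, rng=random.Random(0)):
        """Insert no-ops (never removing/reordering originals) until not flagged."""
        adv = list(trace)
        for query in range(max_queries):
            if not is_flagged(adv):
                return adv, query
            pos = rng.randrange(len(adv) + 1)
            adv.insert(pos, rng.choice(BENIGN_NOOPS))
        return adv, max_queries

    original = ["VirtualAllocEx", "WriteProcessMemory", "CreateRemoteThread",
                "CloseHandle"]
    adversarial, queries = evade(original)
    print(f"flagged before: {is_flagged(original)}, after: {is_flagged(adversarial)}")
    print(f"queries used: {queries}, trace length {len(original)} -> {len(adversarial)}")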

  10. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.; Read, Daniel J.; Kouloumasis, Dimitris; Kocen, Rok; Zhuge, Flanco; Bailly, Christian; Hadjichristidis, Nikolaos; Likhtman, Alexei E.

    2017-01-01

    of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We

  11. Future Mission Trends and their Implications for the Deep Space Network

    Science.gov (United States)

    Abraham, Douglas S.

    2006-01-01

    Planning for the upgrade and/or replacement of Deep Space Network (DSN) assets that typically operate for forty or more years necessitates understanding potential customer needs as far into the future as possible. This paper describes the methodology DSN planners use to develop this understanding, some key future mission trends that have emerged from application of this methodology, and the implications of these trends for the DSN's future evolution. For NASA's current plans out to 2030, these trends suggest the need to accommodate: three times as many communication links, downlink rates two orders of magnitude greater than today's, uplink rates some four orders of magnitude greater, and end-to-end link difficulties two to three orders of magnitude greater. To meet these challenges, both DSN capacity and capability will need to increase.

  12. Molecular dynamics simulation of joining process of Ag-Au nanowires and mechanical properties of the hybrid nanojoint

    Directory of Open Access Journals (Sweden)

    Su Ding

    2015-05-01

    Full Text Available The nanojoining process of Ag-Au hybrid nanowires at 800 K was comprehensively studied by means of molecular dynamics (MD) simulation. Three kinds of configurations, including end-to-end, T-like and X-like, were built in the simulation with the aim of understanding the nanojoining mechanism. The detailed dynamic evolution of atoms, crystal structure transformation and defect development during the nanojoining processes were analyzed. The results indicate that there are two stages in the nanojoining process of Ag-Au nanowires: atom diffusion and new bond formation. Temperature is a key parameter affecting both stages, owing to the energy supply, and the optimum temperature for an Ag-Au nanojoint with a diameter of 4.08 nm is discussed. The mechanical properties of the nanojoint were examined with a simulated tensile test on the end-to-end joint. It was revealed that the nanojoint was strong enough to resist fracture at the joining area.

  13. STS payloads mission control study. Volume 2-A, Task 1: Joint products and functions for preflight planning of flight operations, training and simulations

    Science.gov (United States)

    1976-01-01

    Specific products and functions, and associated facility availability, applicable to preflight planning of flight operations were studied. Training and simulation activities involving joint participation of STS and payload operations organizations are defined. The prelaunch activities required to prepare for the payload flight operations are emphasized.

  14. DEPSCOR: Research on ARL's Intelligent Control Architecture: Hierarchical Hybrid-Model Based Design, Verification, Simulation, and Synthesis of Mission Control for Autonomous Underwater Vehicles

    National Research Council Canada - National Science Library

    Kumar, Ratnesh; Holloway, Lawrence E

    2007-01-01

    ... modeling, verification, simulation and automated synthesis of coordinators has led to research in this area. We have worked and are working on these issues with the Applied Research Laboratory (ARL) at Pennsylvania State University (PSU), which has designed autonomous underwater vehicles for over 50 years, primarily under the support of the U.S. Navy through the Office of Naval Research (ONR).

  15. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  16. Results from the NASA Spacecraft Fault Management Workshop: Cost Drivers for Deep Space Missions

    Science.gov (United States)

    Newhouse, Marilyn E.; McDougal, John; Barley, Bryan; Stephens, Karen; Fesq, Lorraine M.

    2010-01-01

    Fault Management, the detection of and response to in-flight anomalies, is a critical aspect of deep-space missions. Fault management capabilities are commonly distributed across flight and ground subsystems, impacting hardware, software, and mission operations designs. The National Aeronautics and Space Administration (NASA) Discovery & New Frontiers (D&NF) Program Office at Marshall Space Flight Center (MSFC) recently studied cost overruns and schedule delays for five missions. The goal was to identify the underlying causes for the overruns and delays, and to develop practical mitigations to assist the D&NF projects in identifying potential risks and controlling the associated impacts to proposed mission costs and schedules. The study found that four out of the five missions studied had significant overruns due to underestimating the complexity and support requirements for fault management. As a result of this and other recent experiences, the NASA Science Mission Directorate (SMD) Planetary Science Division (PSD) commissioned a workshop to bring together invited participants across government, industry, and academia to assess the state of the art in fault management practice and research, identify current and potential issues, and make recommendations for addressing these issues. The workshop was held in New Orleans in April of 2008. The workshop concluded that fault management is not being limited by technology, but rather by a lack of emphasis and discipline in both the engineering and programmatic dimensions. Some of the areas cited in the findings include different, conflicting, and changing institutional goals and risk postures; unclear ownership of end-to-end fault management engineering; inadequate understanding of the impact of mission-level requirements on fault management complexity; and practices, processes, and tools that have not kept pace with the increasing complexity of mission requirements and spacecraft systems. This paper summarizes the

  17. Mission Exploitation Platform PROBA-V

    Science.gov (United States)

    Goor, Erwin

    2016-04-01

    VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near-real-time data. From November 2015 an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g.: a time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest; full-resolution viewing services for the complete data archive; on-demand processing chains, e.g. for the calculation of N-daily composites; and a Virtual Machine with access to the data archive and tools to work with this data, e.g. various toolboxes and support for R and Python. After an initial release in January 2016, a research platform will gradually be deployed, allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers a lot of technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data and hence open the large time series to a larger

  18. Science Parametrics for Missions to Search for Earth-like Exoplanets by Direct Imaging

    Science.gov (United States)

    Brown, Robert A.

    2015-01-01

    We use Nt, the number of exoplanets observed in time t, as a science metric to study direct-search missions like Terrestrial Planet Finder. In our model, Nt has 27 parameters, divided into three categories: 2 astronomical, 7 instrumental, and 18 science-operational. For various "27-vectors" of those parameters chosen to explore parameter space, we compute design reference missions to estimate Nt. Our treatment includes the recovery of completeness c after a search observation (for revisits), solar and antisolar avoidance, observational overhead, and follow-on spectroscopy. Our baseline 27-vector has aperture D = 16 m, inner working angle IWA = 0.039'', mission time t = 0-5 yr, occurrence probability for Earth-like exoplanets η = 0.2, and typical values for the remaining 23 parameters. For the baseline case, a typical five-year design reference mission has an input catalog of ~4700 stars with nonzero completeness, ~1300 unique stars observed in ~2600 observations, of which ~1300 are revisits, and it produces N1 ~ 50 exoplanets after one year and N5 ~ 130 after five years. We explore offsets from the baseline for 10 parameters. We find that Nt depends strongly on IWA and only weakly on D. It also depends only weakly on zodiacal light below a threshold Z, on end-to-end efficiency for h > 0.2, and on scattered starlight below a threshold ζ. Revisits, solar and antisolar avoidance, and follow-on spectroscopy are all important factors in estimating Nt.

  19. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  20. NASA's Preparations for ESA's L3 Gravitational Wave Mission

    Science.gov (United States)

    Stebbins, Robin

    2016-01-01

    Telescope Subsystem - Jeff Livas (GSFC): Demonstrate pathlength stability, straylight and manufacturability. Phase Measurement System - Bill Klipstein (JPL): Key measurement functions demonstrated. Incorporate full flight functionality. Laser Subsystem - Jordan Camp (GSFC): ECL master oscillator, phase noise of fiber power amplifier, demonstrate end-to-end performance in integrated system, lifetime. Micronewton Thrusters - John Ziemer (JPL): Propellant storage and distribution, system robustness, manufacturing yield, lifetime. Arm-locking Demonstration - Kirk McKenzie (JPL): Studying a demonstration of laser frequency stabilization with GRACE Follow-On. Torsion Pendulum - John Conklin (UF): Develop U.S. capability with GRS and torsion pendulum test bed. Multi-Axis Heterodyne Interferometry - Ira Thorpe (GSFC): Investigate test mass/optical bench interface. UV LEDs - John Conklin+ (UF): Flight qualify UV LEDs to replace mercury lamps in discharging system. Optical Bench - Guido Mueller (UF): Investigate alternate designs and fabrication processes to ease manufacturability. LISA researchers at JPL are leading the Laser Ranging Interferometer instrument on the GRACE Follow-On mission.

  1. Intelligent Mission Controller Node

    National Research Council Canada - National Science Library

    Perme, David

    2002-01-01

    The goal of the Intelligent Mission Controller Node (IMCN) project was to improve the process of translating mission taskings between real-world Command, Control, Communications, Computers, and Intelligence (C4I...

  2. Critical Robotic Lunar Missions

    Science.gov (United States)

    Plescia, J. B.

    2018-04-01

    Perhaps the most critical missions to understanding lunar history are in situ dating and network missions. These would constrain the volcanic and thermal history and interior structure. These data would better constrain lunar evolution models.

  3. Dukovany ASSET mission preparation

    Energy Technology Data Exchange (ETDEWEB)

    Kouklik, I. [NPP Dukovany (Czech Republic)]

    1997-12-31

    We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they will be helpful to other plants that host ASSET missions in the future.

  4. Dukovany ASSET mission preparation

    International Nuclear Information System (INIS)

    Kouklik, I.

    1996-01-01

    We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they will be helpful to other plants that host ASSET missions in the future.

  5. An integrated radar model solution for mission level performance and cost trades

    Science.gov (United States)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated mission-level radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models using a commercial off-the-shelf software package, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models and, with the aid of subject matter experts (SMEs), understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.
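
    The composition pattern described, component models exchanging data through standard interfaces so a mission-level trade can sweep design variables end-to-end, can be illustrated with a toy chain of models in plain Python. The relationships and numbers below are placeholders, not NGMS or AESA models.

    # Toy sketch of model composition: component models pass a shared design
    # dictionary through standard interfaces so a trade study can sweep
    # design variables end-to-end. Equations and numbers are placeholders.
    def rf_model(design):
        # Radar range scales with the fourth root of total radiated power (toy law).
        p_total = design["n_elements"] * design["w_per_element"]
        design["detection_range_km"] = 80.0 * (p_total / 1000.0) ** 0.25
        return design

    def thermal_model(design):
        design["dissipated_w"] = 0.35 * design["n_elements"] * design["w_per_element"]
        return design

    def cost_model(design):
        design["cost_musd"] = 2.0 + 0.004 * design["n_elements"] \
                                  + 0.0008 * design["dissipated_w"]
        return design

    def run_chain(design, chain=(rf_model, thermal_model, cost_model)):
        """Push one candidate design through every component model in order."""
        for model in chain:
            design = model(design)
        return design

    # Simple trade study: sweep array size, report range vs. cost.
    for n in (512, 1024, 2048, 4096):
        d = run_chain({"n_elements": n, "w_per_element": 8.0})
        print(f"{n:5d} elements: range {d['detection_range_km']:6.1f} km, "
              f"cost {d['cost_musd']:5.2f} M$")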

  6. Computer graphics aid mission operations. [NASA missions

    Science.gov (United States)

    Jeletic, James F.

    1990-01-01

    The application of computer graphics techniques in NASA space missions is reviewed. Telemetric monitoring of the Space Shuttle and its components is discussed, noting the use of computer graphics for real-time visualization problems in the retrieval and repair of the Solar Maximum Mission. The use of the world map display for determining a spacecraft's location above the earth and the problem of verifying the relative position and orientation of spacecraft to celestial bodies are examined. The Flight Dynamics/STS Three-dimensional Monitoring System and the Trajectory Computations and Orbital Products System world map display are described, emphasizing Space Shuttle applications. Also, consideration is given to the development of monitoring systems such as the Shuttle Payloads Mission Monitoring System and the Attitude Heads-Up Display and the use of the NASA-Goddard Two-dimensional Graphics Monitoring System during Shuttle missions and to support the Hubble Space Telescope.

  7. The Sentinel-3 Surface Topography Mission (S-3 STM): Level 2 SAR Ocean Retracker

    Science.gov (United States)

    Dinardo, S.; Lucas, B.; Benveniste, J.

    2015-12-01

    ) and delivered as a baseline for industrial implementation. For operational needs, thanks to the fine tuning of the fitting library parameters and the use of a look-up table for Bessel function computation, the CPU execution time was reduced by more than a factor of 100, making execution on a par with real time. In the course of the ESA-funded project CryoSat+ for Ocean (CP4O), new technical evolutions of the algorithm have been proposed (such as the use of a PTR width look-up table and the application of stack masking). One of the main outcomes of the CP4O project was that, with these latest evolutions, the SAMOSA SAR retracking gives results equivalent to the CNES CPP retracking prototype, which was built with a totally different approach; this reinforces the validation results. Work is currently underway to align the industrial implementation with the latest evolutions. Furthermore, in order to test the algorithm with a dataset as realistic as possible, a simulated Test Data Set (generated by the S-3 STM End-to-End Simulator) has been created by CLS, following the specifications described in a test data set requirements document drafted by ESA. In this work, we will show the baseline algorithm details, the evolutions and their impact, and the results obtained by processing CryoSat-2 data and the simulated test data set.
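
    The look-up-table speed-up mentioned here is a generic technique: precompute a Bessel function once on a grid and replace repeated exact evaluations with interpolation. The sketch below illustrates it with SciPy's modified Bessel function I0; the function choice, grid and error target are assumptions, not the SAMOSA/Sentinel-3 implementation.

    # Generic illustration of the look-up-table speed-up: precompute a Bessel
    # function on a grid once, then replace repeated exact evaluations by
    # linear interpolation. The choice of I0 and the grid are assumptions.
    import time
    import numpy as np
    from scipy.special import i0

    # Build the look-up table once.
    grid = np.linspace(0.0, 20.0, 20001)
    lut = i0(grid)

    def i0_lut(x):
        """Approximate I0(x) by linear interpolation in the precomputed table."""
        return np.interp(x, grid, lut)

    x = np.random.uniform(0.0, 20.0, size=2_000_000)

    t0 = time.perf_counter(); exact = i0(x);    t1 = time.perf_counter()
    approx = i0_lut(x);                         t2 = time.perf_counter()

    rel_err = np.max(np.abs(approx - exact) / exact)
    print(f"exact: {t1 - t0:.3f} s, LUT: {t2 - t1:.3f} s, max rel. error {rel_err:.2e}")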

  8. Mobile Ad Hoc Networks in Bandwidth-Demanding Mission-Critical Applications: Practical Implementation Insights

    KAUST Repository

    Bader, Ahmed

    2016-09-28

    There has recently been a growing trend of using live video feeds in mission-critical applications. Real-time video streaming from front-end personnel or mobile agents is believed to substantially improve situational awareness in mission-critical operations such as disaster relief, law enforcement, and emergency response. Mobile Ad Hoc Networks (MANETs) are a natural contender in such contexts. However, classical MANET routing schemes fall short in terms of scalability, bandwidth and latency, all three metrics being quite essential for mission-critical applications. As such, autonomous cooperative routing (ACR) has gained traction as the most viable MANET proposition. Nonetheless, ACR is also associated with a few implementation challenges which, if left unaddressed, would render ACR practically useless. In this paper, efficient and low-complexity remedies to those issues are presented, analyzed, and validated. The validation is based on field experiments carried out using software-defined radio (SDR) platforms. Compared to classical MANET routing schemes, ACR was shown to offer up to 2X better throughput and more than a 4X reduction in end-to-end latency, while meeting a given target of transport rate normalized to energy consumption.

  9. The STEREO Mission

    CERN Document Server

    2008-01-01

    The STEREO mission uses twin heliospheric orbiters to track solar disturbances from their initiation to 1 AU. This book documents the mission, its objectives, the spacecraft that execute it and the instruments that provide the measurements, both remote sensing and in situ. This mission promises to unlock many of the mysteries of how the Sun produces what has come to be known as space weather.

  10. VEGA Space Mission

    Science.gov (United States)

    Moroz, V.; Murdin, P.

    2000-11-01

    VEGA (mission) is a combined spacecraft mission to VENUS and COMET HALLEY. It was launched in the USSR at the end of 1984. The mission consisted of two identical spacecraft VEGA 1 and VEGA 2. VEGA is an acronym built from the words `Venus' and `Halley' (`Galley' in Russian spelling). The basic design of the spacecraft was the same as has been used many times to deliver Soviet landers and orbiter...

  11. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.
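
    To make the coupling concrete, a schematic run configuration of the kind implied above is sketched below: choose an MHD (CME) input and set transport parameters such as the seed spectrum, mean free path and perpendicular diffusion efficiency. The parameter names, file name and values are hypothetical illustrations, not the actual EPREM input format.

    # Schematic sketch of a coupled-run configuration: an MHD (CME) input plus
    # energetic-particle transport parameters. Names and values are
    # hypothetical illustrations, not the actual EPREM input format.
    from dataclasses import dataclass, asdict

    @dataclass
    class EpremLikeRun:
        mhd_input: str = "mas_cme_event.h5"        # hypothetical MAS output file
        seed_spectral_index: float = -3.0          # power-law index of seed protons
        seed_density_cm3: float = 1.0e-4           # seed population density
        mean_free_path_au: float = 0.3             # parallel scattering mean free path
        perp_diffusion_ratio: float = 0.01         # kappa_perp / kappa_parallel
        n_energy_bins: int = 30
        e_min_mev: float = 0.1
        e_max_mev: float = 1000.0

    def seed_spectrum(run: EpremLikeRun, energy_mev: float) -> float:
        """Differential seed intensity ~ E^gamma (arbitrary units), for illustration."""
        return run.seed_density_cm3 * (energy_mev / run.e_min_mev) ** run.seed_spectral_index

    run = EpremLikeRun(mean_free_path_au=0.1)      # e.g. a stronger-scattering case
    print(asdict(run))
    print("seed intensity at 10 MeV:", seed_spectrum(run, 10.0))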

  12. Proba-V Mission Exploitation Platform

    Science.gov (United States)

    Goor, Erwin; Dries, Jeroen

    2017-04-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015 an operational Proba-V MEP environment, as an ESA operations service, has been gradually deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016 the platform has been operational and several applications have already been released to the users, e.g.: a time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest; full-resolution viewing services for the complete data archive; on-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites; Virtual Machines that can be provided with access to the data archive and tools to work with this data, e.g. various toolboxes (GDAL, QGIS, GRASS GIS, the SNAP toolbox, …) and support for R and Python, which allows users to immediately work with the data without having to install tools or download data, as well as to design, debug and test applications on the platform; and a prototype of Jupyter Notebooks with some examples worked out to show the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel, the platform is being further improved and extended. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will

  13. Application of a Detailed Emission Model for Heavy Duty Diesel Engine Simulations Application d'un modèle détaillé d'émissions pour la simulation de gros moteurs diesel

    Directory of Open Access Journals (Sweden)

    Magnusson I.

    2006-12-01

    Full Text Available A detailed chemical model describing the formation of soot and NO is applied to simulate emission formation in a heavy-duty diesel engine. Cylinder flow and spray development are simulated using an engine CFD code, Speedstar. Combustion is described using a simple eddy break-up model. Modeling of the emission-chemistry/turbulent-flow interaction is based on a flamelet approach. Contrary to a typical flamelet concept, transport equations are solved for the mass fractions of soot and NO, the reason being that these major emission constituents are assumed to change slowly in comparison with the typical time scales of the chemical or transport processes important for combustion. Chemical reactions leading to the production and destruction of soot and NO are, however, assumed to be fast. Soot and NO source terms are therefore evaluated from a flamelet library using a presumed probability density function and integrating over mixture fraction space. Results from the simulations are compared to engine measurements in the form of exhaust emission data and cylinder pressure.

  14. Mission of Mercy.

    Science.gov (United States)

    Humenik, Mark

    2014-01-01

    Some dentists prefer solo charity work, but there is much to be said for collaboration within the profession in reaching out to those who are dentally underserved. Mission of Mercy (MOM) programs are regularly organized across the country for this purpose. This article describes the structure, reach, and personal satisfaction to be gained from such missions.

  15. GPS test range mission planning

    Science.gov (United States)

    Roberts, Iris P.; Hancock, Thomas P.

    The principal features of the Test Range User Mission Planner (TRUMP), a PC-resident tool designed to aid in deploying and utilizing GPS-based test range assets, are reviewed. TRUMP features time history plots of time-space-position information (TSPI); performance based on a dynamic GPS/inertial system simulation; time history plots of TSPI data link connectivity; digital terrain elevation data maps with user-defined cultural features; and two-dimensional coverage plots of ground-based test range assets. Some functions to be added during the next development phase are discussed.

  16. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    Science.gov (United States)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  17. The Science and Technology of Future Space Missions

    Science.gov (United States)

    Bonati, A.; Fusi, R.; Longoni, F.

    1999-12-01

    processing. Powerful computers with customized architectures are designed and developed. High-speed intercommunication networks are studied and tested. In parallel to the hardware research activities, software development is undertaken for several purposes: digital and video compression algorithms, payload and spacecraft control and diagnostics, scientific processing algorithms, etc. In addition, embedded Java virtual machines are studied for tele-science applications (a direct link between the scientist's console and the scientific payload). At the system engineering level, the demand for spacecraft autonomy has increased for planetology missions: reliable intelligent systems that can operate for long periods of time without human intervention from the ground are required and investigated. A technologically challenging but less glamorous area of development is the laboratory equipment for end-to-end testing (on the ground) of payload instruments. The main fields are cryogenics, laser and X-ray optics, microwave radiometry, and UV and infrared testing systems.

  18. GPS Navigation for the Magnetospheric Multi-Scale Mission

    Science.gov (United States)

    Bamford, William; Mitchell, Jason; Southward, Michael; Baldwin, Philip; Winternitz, Luke; Heckler, Gregory; Kurichh, Rishi; Sirotzky, Steve

    2009-01-01

    utilizing a TDMA schedule to distribute a science-quality message to all constellation members every ten seconds. Additionally, the system generates one-way range measurements between formation members, which are used as input to the Kalman filter. In preparation for the MMS Preliminary Design Review (PDR), the Navigator was required to pass a series of Technology Readiness Level (TRL) tests to earn the necessary TRL-6 classification. The TRL-6 level is achieved by demonstrating a prototype unit in a relevant end-to-end environment. The IRAS unit was able to meet all requirements during the testing phase, and has thus been TRL-6 qualified.

  19. Astronaut Neil Armstrong participates in simulation training

    Science.gov (United States)

    1969-01-01

    Astronaut Neil A. Armstrong, Apollo 11 commander, participates in simulation training in preparation for the scheduled lunar landing mission. He is in the Apollo Lunar Module Mission Simulator in the Kennedy Space Center's Flight Crew Training Building.

  20. NASA CYGNSS Tropical Cyclone Mission

    Science.gov (United States)

    Ruf, Chris; Atlas, Robert; Majumdar, Sharan; Ettammal, Suhas; Waliser, Duane

    2017-04-01

    The NASA Cyclone Global Navigation Satellite System (CYGNSS) mission consists of a constellation of eight microsatellites that were launched into low-Earth orbit on 15 December 2016. Each observatory carries a four-channel bistatic scatterometer receiver to measure near surface wind speed over the ocean. The transmitter half of the scatterometer is the constellation of GPS satellites. CYGNSS is designed to address the inadequacy in observations of the inner core of tropical cyclones (TCs) that result from two causes: 1) much of the TC inner core is obscured from conventional remote sensing instruments by intense precipitation in the eye wall and inner rain bands; and 2) the rapidly evolving (genesis and intensification) stages of the TC life cycle are poorly sampled in time by conventional polar-orbiting, wide-swath surface wind imagers. The retrieval of wind speed by CYGNSS in the presence of heavy precipitation is possible due to the long operating wavelength used by GPS (19 cm), at which scattering and attenuation by rain are negligible. Improved temporal sampling by CYGNSS is possible due to the use of eight spacecraft with 4 scatterometer channels on each one. Median and mean revisit times everywhere in the tropics are 3 and 7 hours, respectively. Wind speed referenced to 10m height above the ocean surface is retrieved from CYGNSS measurements of bistatic radar cross section in a manner roughly analogous to that of conventional ocean wind scatterometers. The technique has been demonstrated previously from space by the UK-DMC and UK-TDS missions. Wind speed is retrieved with 25 km spatial resolution and an uncertainty of 2 m/s at low wind speeds and 10% at wind speeds above 20 m/s. Extensive simulation studies conducted prior to launch indicate that there will be a significant positive impact on TC forecast skill for both track and intensity with CYGNSS measurements assimilated into HWRF numerical forecasts. Simulations of CYGNSS spatial and temporal sampling